
Compatibility fix: paddlepaddle-gpu==3.0.0rc1 exports the inference model in .json format, while earlier versions use .pdmodel/.pdipara… #14862


Open · akrilio wants to merge 1 commit into main

Conversation

@akrilio commented Mar 15, 2025

Problem: an inference model exported with paddlepaddle-gpu==3.0.0rc1 cannot be loaded.
Cause: paddlepaddle-gpu==3.0.0rc1 exports the inference model in .json format, but the current code only looks for the .pdmodel format.
Fix: added a check on the model file suffix.
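
In other words, the loader needs to probe both suffixes before giving up. A minimal self-contained sketch of that idea (the helper name and defaults are mine, not code from this PR; it assumes the weights keep the .pdiparams suffix in both cases, as the PR does):

```python
import os


def find_inference_model(model_dir,
                         file_names=("model", "inference"),
                         model_formats=("pdmodel", "json")):
    """Return (model_file, params_file) for whichever graph format exists.

    paddlepaddle-gpu < 3.0 exports <name>.pdmodel, while 3.0.0rc1 exports
    <name>.json; the weights stay in <name>.pdiparams either way.
    """
    for file_name in file_names:
        params_file = os.path.join(model_dir, f"{file_name}.pdiparams")
        for model_format in model_formats:
            model_file = os.path.join(model_dir, f"{file_name}.{model_format}")
            if os.path.exists(model_file) and os.path.exists(params_file):
                return model_file, params_file
    raise ValueError(
        f"no model/inference .pdmodel or .json file found in {model_dir}"
    )
```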


paddle-bot bot commented Mar 15, 2025

Thanks for your contribution!

@CLAassistant commented Mar 15, 2025

CLA assistant check
All committers have signed the CLA.

@SWHL added this to the v3.0.0 milestone on Apr 2, 2025
@@ -225,11 +225,13 @@ def create_predictor(args, mode, logger):

```python
    else:
        file_names = ["model", "inference"]
        model_formats = ["pdmodel", "json"]
```
Collaborator (review comment on the `model_formats` line): How about looking for the .json model first, i.e. putting "json" ahead of "pdmodel" in the list?
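
That is, something like the following (a sketch of the suggested ordering, not code taken from the PR):

```python
model_formats = ["json", "pdmodel"]  # probe the newer .json graph first, fall back to legacy .pdmodel
```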

```python
            for model_format in model_formats:
                model_file_path = "{}/{}.{}".format(model_dir, file_name, model_format)
                if os.path.exists(model_file_path) and os.path.exists(params_file_path):
                    break
        if not os.path.exists(model_file_path):
            raise ValueError(
                "not find model.pdmodel or inference.pdmodel in {}".format(model_dir)
            )
```
Collaborator (review comment on the error message): The logic above now also looks for .json models, so this error message needs to be updated accordingly.
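
For example (a sketch only, not necessarily the wording adopted in the PR), the message could name both suffixes:

```python
raise ValueError(
    "not find model/inference file with .pdmodel or .json suffix in {}".format(model_dir)
)
```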
