Quantizing mobilenet-0.25 for the rv1126 chip: the resulting quant_model.onnx fails to load with rknn-toolkit 1.7.1 #17
Comments
Deploying a model quantized via onnxruntime carries considerable uncertainty; for rv chips the recommended path is hybrid mode, using the quantization parameters generated by Dipoorlet to overwrite the JSON produced by the rk tool.
Only the quantization parameters should be modified. Take care not to replace the entire file: match entries by name and update only min/max/scale/zp. See https://github.com/ModelTC/Dipoorlet/blob/main/example/rv.md for reference.
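The overwrite step described above can be sketched as follows. This is a minimal illustration, not the actual rv.md script: the layout of the rknn JSON and the Dipoorlet output is assumed here to be a flat name-to-entry mapping, and the real files should be inspected before adapting it.

```python
import json

def overwrite_quant_params(rk_cfg: dict, dp_params: dict) -> dict:
    """Overwrite only the min/max/scale/zp fields in the rknn-generated
    config (rk_cfg) with values from Dipoorlet output (dp_params),
    matching entries by tensor name. All other keys are untouched,
    and tensors unknown to Dipoorlet keep the rknn values."""
    for name, entry in rk_cfg.items():
        if name not in dp_params:
            continue  # no Dipoorlet value for this tensor; keep rknn's
        for key in ("min", "max", "scale", "zp"):
            if key in dp_params[name]:
                entry[key] = dp_params[name][key]
    return rk_cfg

if __name__ == "__main__":
    # Toy data standing in for the two real JSON files.
    rk = {"conv1": {"min": 0.0, "max": 1.0, "scale": 0.004,
                    "zp": 0, "dtype": "u8"}}
    dp = {"conv1": {"min": -0.5, "max": 0.9, "scale": 0.0055, "zp": 91}}
    print(json.dumps(overwrite_quant_params(rk, dp), indent=2))
```

In practice the two dicts would come from `json.load` on the rknn config and the Dipoorlet output, and the merged result written back with `json.dump`.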
@Tracin Got it, many thanks. I will give it a try later.
@Tracin I tried it. The node names generated by rknn-toolkit 1.7.1 hybrid mode step 1 are prefixed with the node type, so the script at https://github.com/ModelTC/Dipoorlet/blob/main/example/rv.md does not quite work as-is.
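One way to adapt the name matching is to strip the leading type token before comparing. The exact prefix format produced by rknn-toolkit 1.7.1 is an assumption here (`<OpType>_<name>` with a short list of illustrative op types); verify it against the actual step-1 output before relying on this.

```python
# Illustrative op-type list; extend it to whatever types appear
# in your step-1 output.
OP_TYPES = ("Conv", "Gemm", "Relu", "Clip", "Add", "Mul", "Concat")

def strip_type_prefix(rk_name: str) -> str:
    """If a name looks like '<OpType>_<rest>' (the assumed format of
    rknn hybrid step-1 names), return '<rest>' so it matches the
    original Dipoorlet tensor name; otherwise return it unchanged."""
    for op_type in OP_TYPES:
        prefix = op_type + "_"
        if rk_name.startswith(prefix):
            return rk_name[len(prefix):]
    return rk_name
```

With a helper like this, the rv.md script's name lookup can compare `strip_type_prefix(rk_name)` against the Dipoorlet keys instead of the raw name.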
@Tracin After adapting the code to the new node names, I ran hybrid_step1 and hybrid_step2 and then ran accuracy analysis; the accuracy does not look good. Below is rknn's accuracy simulation.
Attempt 1:
Attempt 2:
@zjd1988
@gushiqiao OK, I will try it later. Thanks.
The RV analysis results are still quite strange; in asymmetric u8 quantization mode the gap between the RV and DP simulations is rather large.
Following the rv-platform quantization example in example/, I quantized the mobilenet model and obtained the quantized ONNX model without problems, but conversion with rknn-toolkit failed.


Judging from the error message, the official toolkit does not seem to support a quantized Gemm operator. I checked the load-quantized-model example in the official rknn-toolkit repository, and indeed the shufflenet model provided by Rockchip also has no quant/dequant ops around the final Gemm:
https://github.com/rockchip-linux/rknn-toolkit/tree/master/examples/common_function_demos/load_quantized_model/onnx
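If the final Gemm must stay unquantized, one workaround is to bypass the QuantizeLinear/DequantizeLinear pair feeding it so the float tensor connects directly, mirroring the official shufflenet example. The sketch below shows only the rewiring logic on a simplified graph representation (a list of dicts standing in for ONNX `NodeProto`s); a real implementation would operate on `onnx.GraphProto` and also handle the Q/DQ pair after the Gemm.

```python
def bypass_qdq_before(nodes, target_op="Gemm"):
    """For each input of a `target_op` node that is produced by a
    DequantizeLinear fed by a QuantizeLinear, rewire the input to the
    original float tensor and drop the Q/DQ pair. `nodes` is a
    simplified graph: dicts with 'op', 'inputs', 'outputs'."""
    by_output = {o: n for n in nodes for o in n["outputs"]}
    dead = []
    for node in nodes:
        if node["op"] != target_op:
            continue
        for i, inp in enumerate(node["inputs"]):
            dq = by_output.get(inp)
            if dq is None or dq["op"] != "DequantizeLinear":
                continue
            q = by_output.get(dq["inputs"][0])
            if q is not None and q["op"] == "QuantizeLinear":
                node["inputs"][i] = q["inputs"][0]  # original float tensor
                dead += [q, dq]
    return [n for n in nodes if n not in dead]
```

The same pattern, applied to `graph.node` of an `onnx.ModelProto`, would let the model pass rknn-toolkit's loader by leaving the final Gemm in float.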
quant_model.zip
mobilenet.zip