Last edited by Jerry_Zh on 2026-1-19 at 22:00
I'm seeing poor inference accuracy with a YOLOv5-cls model. The converted .rknn model appears to lose a large amount of precision at certain layers compared with the original .onnx model. Running the accuracy_analysis() API gives the results below: both the cumulative (entire) and per-layer (single) errors are severe.

I split this subgraph out on its own and found that the outputs before and after RKNN conversion differ so much the model is essentially unusable. rknn-toolkit2 version is 2.3.0. I've never seen precision loss this large. The model, a sample input, and the run script are packaged so the issue can be reproduced directly after unzipping (shared via Baidu Netdisk; for some reason the forum attachment upload kept failing).

My questions: what is the root cause here? My guess is that it's related to forcing the FP32 model through RKNN's FP16 inference, but for various reasons I can only use the FP32 ONNX model and cannot retrain in FP16. Can anyone suggest a way to debug or work around this?
File shared via Baidu Netdisk: 复现.zip (reproduction package)
Link: https://pan.baidu.com/s/1JsEZGigWSIGCn3iZ4Gz9Gg?pwd=9v7u  Extraction code: 9v7u
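To make the FP16 hypothesis concrete, the suspected failure mode can be demonstrated in isolation: FP16 has only a 10-bit mantissa, so once activation magnitudes reach the hundreds or thousands (as the euc column in the log suggests they do), every value picks up rounding error on the order of 1.0, and those errors compound through a deep conv stack. A minimal numpy sketch of this (illustrative only, not the RKNN runtime itself):

```python
import numpy as np

# FP16 has a 10-bit mantissa, so the spacing between adjacent
# representable values grows with magnitude:
#   [1024, 2048) -> step 1.0, [2048, 4096) -> step 2.0, ...
# and anything above 65504 overflows to inf.
x = np.float32(2049.3)
print(np.float16(x))                               # rounds to a step of 2.0
print(np.isinf(np.float16(np.float32(70000.0))))   # True: out of FP16 range

# Per-element rounding error approaches 1.0 once activations are in the
# thousands; over a deep stack these errors accumulate layer by layer.
acts = np.linspace(1000, 4000, 8, dtype=np.float32)
err = np.abs(acts.astype(np.float16).astype(np.float32) - acts)
print(err.max())
```

If retraining in FP16 is not an option, one common direction is to rescale the FP32 model so intermediate tensors stay inside FP16's comfortable range, e.g. folding a constant scale into adjacent layers (the `_scaled_128` node names in the log below hint something like this may already have been attempted).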
# simulator_error: calculate the output error of each layer of the simulator (compared to the 'golden' value).
# entire: output error of each layer between 'golden' and 'simulator', these errors will accumulate layer by layer.
# single: single-layer output error between 'golden' and 'simulator', can better reflect the single-layer accuracy of the simulator.
layer_name                             simulator_error
                                  entire               single
                                  cos     | euc        cos     | euc
--------------------------------------------------------------------------
[Input] in_T 1.00000 | 0.0 1.00000 | 0.0
[Conv] 1262 1.00000 | 0.6305 1.00000 | 0.6305
[MaxPool] 718 1.00000 | 0.3165 1.00000 | 0.2877
[Conv] 1265 1.00000 | 0.5424 1.00000 | 0.5348
[Relu] 721 1.00000 | 0.3922 1.00000 | 0.2553
[Conv] 1268 1.00000 | 0.7729 1.00000 | 0.6438
[Add] 724 1.00000 | 0.9396 1.00000 | 0.6215
[Relu] 725 1.00000 | 0.6412 1.00000 | 0.2647
[Conv] 1271 1.00000 | 0.7033 1.00000 | 0.4283
[Relu] 728 1.00000 | 0.5520 1.00000 | 0.2024
[Conv] 1274 1.00000 | 0.8934 1.00000 | 0.4857
[Add] 731 1.00000 | 1.1750 1.00000 | 0.5339
[Relu] 732 1.00000 | 1.0248 1.00000 | 0.3713
[Conv] 1277 1.00000 | 0.5914 1.00000 | 0.3055
[Relu] 735 1.00000 | 0.4183 1.00000 | 0.1340
[Conv] 1280 1.00000 | 0.8678 1.00000 | 0.3772
[Conv] 1283 1.00000 | 0.6453 1.00000 | 0.3589
[Add] 740 1.00000 | 1.1834 1.00000 | 0.4906
[Relu] 741 1.00000 | 0.8315 1.00000 | 0.2428
[Conv] 1286 1.00000 | 1.4039 1.00000 | 0.4601
[Relu] 744 1.00000 | 1.1013 1.00000 | 0.2825
[Conv] 1289 1.00000 | 2.9434 1.00000 | 0.7785
[Add] 747 1.00000 | 3.1330 1.00000 | 0.8655
[Relu] 748 1.00000 | 2.4349 1.00000 | 0.5534
[Conv] 1292 1.00000 | 2.0567 1.00000 | 0.5141
[Relu] 751 1.00000 | 1.6483 1.00000 | 0.3348
[Conv] 1295 1.00000 | 5.3113 1.00000 | 1.1377
[Conv] 1298 1.00000 | 1.4107 1.00000 | 0.5453
[Add] 756 1.00000 | 5.4617 1.00000 | 1.5530
[Relu] 757 1.00000 | 4.1399 1.00000 | 0.7690
[Conv] 1301 1.00000 | 10.871 1.00000 | 2.1261
[Relu] 760 1.00000 | 9.0029 1.00000 | 1.6912
[Conv] 1304 1.00000 | 27.359 1.00000 | 5.0279
[Add] 763 1.00000 | 27.965 1.00000 | 5.9262
[Relu] 764 1.00000 | 22.507 1.00000 | 3.8012
[Conv] 1307 1.00000 | 28.170 1.00000 | 5.3373
[Relu] 767 1.00000 | 17.673 1.00000 | 3.1647
[Conv] 1310 1.00000 | 70.582 1.00000 | 12.584
[Conv] 1313 1.00000 | 25.693 1.00000 | 5.0051
[Add] 772 1.00000 | 66.971 1.00000 | 17.407
[Relu] 773 1.00000 | 51.218 1.00000 | 8.6798
[Conv] 1316 1.00000 | 173.99 1.00000 | 29.310
[Relu] 776 1.00000 | 113.17 1.00000 | 18.765
[Conv] 1319 1.00000 | 383.62 1.00000 | 64.559
[Add] 779 1.00000 | 369.57 1.00000 | 76.090
[Relu] 780 1.00000 | 168.27 1.00000 | 27.212
[Conv] 1322 1.00000 | 305.75 1.00000 | 57.647
[Relu] 783 1.00000 | 133.00 1.00000 | 22.520
[Conv] 1325 1.00000 | 377.36 1.00000 | 70.812
[Add] 786 1.00000 | 396.41 1.00000 | 78.586
[Relu] 787 1.00000 | 320.27 1.00000 | 54.651
[Conv] 1328 1.00000 | 475.20 1.00000 | 89.755
[Relu] 790 1.00000 | 157.06 1.00000 | 26.578
[Conv] 1331 1.00000 | 603.81 1.00000 | 105.92
[Add] 793 1.00000 | 854.62 1.00000 | 186.99
[Relu] 794 1.00000 | 829.91 1.00000 | 136.63
[ConvTranspose] 813 1.00000 | 1591.6 1.00000 | 316.72
[Conv] 1334 1.00000 | 1496.8 1.00000 | 267.63
[Relu] 816 1.00000 | 939.26 1.00000 | 159.01
[Conv] 1337 1.00000 | 1993.3 1.00000 | 357.72
[Conv] 1340 1.00000 | 1921.6 1.00000 | 341.52
[Add] 821 1.00000 | 1584.0 1.00000 | 554.11
[Relu] 822 1.00000 | 239.46 1.00000 | 15.090
[Conv] 1343 1.00000 | 609.60 1.00000 | 77.446
[Relu] 825 1.00000 | 436.50 1.00000 | 50.789
[Conv] 1346 1.00000 | 668.69 1.00000 | 115.70
[Add] 828 1.00000 | 678.95 1.00000 | 114.70
[Relu] 829 0.99999 | 73.478 1.00000 | 4.0148
[ConvTranspose] 848 1.00000 | 99.663 1.00000 | 8.4001
[Conv] 1349 1.00000 | 136.19 1.00000 | 17.793
[Relu] 851 1.00000 | 113.90 1.00000 | 12.361
[Conv] 1352 1.00000 | 128.79 1.00000 | 20.720
[Conv] 1355 1.00000 | 108.39 1.00000 | 17.528
[Add] 856 1.00000 | 127.28 1.00000 | 28.304
[Relu] 857 1.00000 | 70.737 1.00000 | 7.3652
[Conv] 1358 1.00000 | 137.09 1.00000 | 22.909
[Relu] 860 1.00000 | 107.22 1.00000 | 12.974
[Conv] 1361 1.00000 | 139.40 1.00000 | 23.581
[Add] 863 1.00000 | 124.91 1.00000 | 22.415
[Relu] 864 1.00000 | 24.902 1.00000 | 1.6574
[Mul] 920_sw 1.00000 | 0.1945 1.00000 | 0.0129
[Transpose] 932_sw 1.00000 | 0.1945 1.00000 | 0.0129
[Reshape] 932_scaled_128_rs#1 1.00000 | 0.1945 1.00000 | 0.0129
[Reshape] 932_scaled_128_rs 1.00000 | 0.1945 1.00000 | 0.0129
[Mul] 936-rs 1.00000 | 1.0014 1.00000 | 0.1286
[Transpose] 936_rs 1.00000 | 1.0014 1.00000 | 0.0693
[Conv] 936_rs_mm 1.00000 | 2.2056 1.00000 | 0.1559
[Transpose] 937_rs 1.00000 | 2.2056 1.00000 | 0.1558
[Sqrt] 938-rs 1.00000 | 0.0975 1.00000 | 0.0138
[Expand] 938_rs_expand 1.00000 | 1.1036 1.00000 | 0.1527
[Div] 940-rs 0.99998 | 0.1809 1.00000 | 0.0100
[Transpose] 941-rs 0.99998 | 0.1809 1.00000 | 0.0057
[Reshape] 941 0.99998 | 0.1809 1.00000 | 0.0057
I The error analysis results save to: ./ana_fp16_scale_128/error_analysis.txt
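For anyone reading the table: the cos and euc columns appear to be cosine similarity and Euclidean distance between each layer's 'golden' and simulator outputs. A small numpy sketch of those metrics (assumed formulas; rknn-toolkit2's exact implementation may differ) also shows why cos can stay pinned at 1.00000 while euc explodes:

```python
import numpy as np

def layer_error(golden: np.ndarray, sim: np.ndarray):
    """Cosine similarity and Euclidean distance between two flattened
    layer outputs, mirroring the cos/euc columns in the log above.
    (Assumed formulas; the toolkit's definitions may differ slightly.)"""
    g = golden.ravel().astype(np.float64)
    s = sim.ravel().astype(np.float64)
    cos = float(g @ s / (np.linalg.norm(g) * np.linalg.norm(s)))
    euc = float(np.linalg.norm(g - s))
    return cos, euc

# Simulate an FP32 'golden' activation passed through an FP16 cast:
golden = np.random.default_rng(0).normal(0, 500, size=4096).astype(np.float32)
sim = golden.astype(np.float16).astype(np.float32)
cos, euc = layer_error(golden, sim)
print(f"cos={cos:.5f} euc={euc:.4f}")
# cos stays ~1.0 (rounding barely changes the direction of the vector)
# while euc is clearly nonzero -- the same pattern as in the log.
```

This is why cosine similarity alone is a poor health check here: the growing euc values are the real signal that the FP16 pipeline is drifting away from the FP32 golden outputs.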