E load_tflite: Maybe the tensorflow version is not consistent with your model, default tensorflow version: 1.x !
W load_tflite: ===================== WARN(2) =====================
E rknn-toolkit2 version: 1.3.0-11912b58
E load_tflite: Catch exception when loading tflite model: /home/t/rknn/rknn-toolkit2/examples/tflite/mediapipe_hand/hand_landmark_lite.tflite!
E load_tflite: Traceback (most recent call last):
E load_tflite:   File "rknn/api/rknn_base.py", line 1447, in rknn.api.rknn_base.RKNNBase.load_tflite
E load_tflite:   File "/home/t/.local/lib/python3.6/site-packages/tensorflow/lite/python/interpreter.py", line 95, in allocate_tensors
E load_tflite:     return self._interpreter.AllocateTensors()
E load_tflite:   File "/home/t/.local/lib/python3.6/site-packages/tensorflow/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 106, in AllocateTensors
E load_tflite:     return _tensorflow_wrap_interpreter_wrapper.InterpreterWrapper_AllocateTensors(self)
E load_tflite: RuntimeError: tensorflow/lite/kernels/dequantize.cc:62 op_context.input->type == kTfLiteUInt8 || op_context.input->type == kTfLiteInt8 was not true.Node number 0 (DEQUANTIZE) failed to prepare.
E load_tflite: During handling of the above exception, another exception occurred:
E load_tflite: Traceback (most recent call last):
E load_tflite:   File "rknn/api/rknn_base.py", line 1458, in rknn.api.rknn_base.RKNNBase.load_tflite
E load_tflite:   File "rknn/api/rknn_log.py", line 113, in rknn.api.rknn_log.RKNNLog.e
E load_tflite: ValueError: Maybe the tensorflow version is not consistent with your model, default tensorflow version: 1.x !
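For context, here is a minimal sketch of the loading step that produces the output above. It assumes the standard rknn-toolkit2 Python API as used in examples/tflite (RKNN(), load_tflite(), release()); the usual config()/build() steps are omitted because the exception is already raised inside load_tflite(), and the local model path is a placeholder.

```python
from rknn.api import RKNN

# Minimal repro sketch, assuming the standard rknn-toolkit2 Python API.
rknn = RKNN(verbose=True)

# This call triggers the traceback above: the bundled TF 1.x TFLite
# interpreter fails to prepare DEQUANTIZE node 0, presumably because the
# dequantize input type in hand_landmark_lite.tflite (likely float16)
# is neither uint8 nor int8.
ret = rknn.load_tflite(model='./hand_landmark_lite.tflite')
if ret != 0:
    print('load_tflite failed, ret =', ret)

rknn.release()
```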