
[DNNDK] Parameter #param is invalid for function dpuRunSoftmax


I encountered a DNNDK error while using the Xilinx AI SDK.

I trained a customized MobileNet-SSD model, generated the .so, and used test_customer_provided_model_ssd.cpp as the basis for testing.

In test_customer_provided_model_ssd.cpp, the only change I made was in create_ex("${MODEL_NAME}", true), substituting my new model name for the demo one.

It reports "[DNNDK] Parameter #param is invalid for function dpuRunSoftmax" when the binary is executed.

Which parameter is invalid in dpuRunSoftmax, and how can I correct it?


Here are the relevant steps of my whole process:

1) In the training prototxt: batch_size = 1, classes = 21.

2) The layers "mbox_conf_reshape", "mbox_conf_softmax", and "mbox_conf_flatten" have no include { phase: ... } block; the layer "detection_out" has include { phase: TEST }.

  I added "phase: TEST" to the reshape, softmax, and flatten layers (otherwise it reports that Reshape is not supported in quantization).
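For concreteness, the edit looked like this (layer names as in my prototxt; the bottom/top names and the reshape dims follow the standard MobileNet-SSD definition, with 21 matching my class count):

```prototxt
layer {
  name: "mbox_conf_reshape"
  type: "Reshape"
  bottom: "mbox_conf"
  top: "mbox_conf_reshape"
  include { phase: TEST }   # <- the line I added; same for softmax/flatten
  reshape_param { shape { dim: 0 dim: -1 dim: 21 } }
}
```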

3) In the deploy.prototxt produced by quantization, the reshape, softmax, flatten, and detection layers have disappeared.

4) The deploy.prototxt and the caffemodel were then used to generate the ELF. During compilation, "[DNNC][Warning] Only 'channel' axis supported on DPU for Concat, current layer is [mbox_priorbox]." was reported; I ignored this warning.

5) On the board, I set num_classes: 21 in /etc/XILINX_AI_SDK.conf.d/ssdmobilenet.prototxt.
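The on-board change was only this one field (fragment of the shipped config file; I changed just the value, the surrounding structure is untouched):

```prototxt
# /etc/XILINX_AI_SDK.conf.d/ssdmobilenet.prototxt (fragment)
num_classes: 21
```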


