I got the following error after I ran DECENT_Q.
ValueError: NodeDef mentions attr 'explicit_paddings' not in Op<name=Conv2D; signature=input:T, filter:T -> output:T; attr=T:type,allowed=[DT_HALF, DT_BFLOAT16, DT_FLOAT, DT_DOUBLE]; attr=strides:list(int); attr=use_cudnn_on_gpu:bool,default=true; attr=padding:string,allowed=["SAME", "VALID"]; attr=data_format:string,default="NHWC",allowed=["NHWC", "NCHW"]; attr=dilations:list(int),default=[1, 1, 1, 1]>; NodeDef: conv2d_1/convolution = Conv2D[T=DT_FLOAT, data_format="NHWC", dilations=[1, 1, 1, 1], explicit_paddings=[], padding="SAME", strides=[1, 1, 1, 1], use_cudnn_on_gpu=true](input_1, conv2d_1/convolution/ReadVariableOp). (Check whether your GraphDef-interpreting binary is up to date with your GraphDef-generating binary.).
decent.sh: 27: decent.sh: --gpu: not found
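For context on the first error: its last sentence suggests a version mismatch between the TensorFlow that froze the graph and the TensorFlow bundled with decent_q. A rough way to check this from the shell (a heuristic, not a proper GraphDef parse; the path is my model's):

```shell
# Heuristic check, not a real GraphDef parse: attr names appear as plain
# strings inside a frozen .pb, so a binary grep shows whether the graph
# was written by a TF new enough (roughly 1.14+) to emit 'explicit_paddings'.
PB=./noise_filter_model/Unet.pb
if grep -aq explicit_paddings "$PB" 2>/dev/null; then
  echo "graph contains explicit_paddings (frozen with a newer TF)"
else
  echo "no explicit_paddings found (or file missing)"
fi
```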
The following is my decent script:
conda activate decent_q3
# run quantization
echo "#####################################"
echo "QUANTIZE"
echo "#####################################"
decent_q quantize \
--input_frozen_graph ./noise_filter_model/Unet.pb \
--input_nodes input_1 \
--input_shapes ?,1024,1024,6 \
--output_nodes conv2d_23/Sigmoid \
--method 1 \
--input_fn default \
#--calib_iter 100 \
#--batch_size 50 \
#--image_dir ./calib_dir \
#--image_list ./calib_dir/calib_list.txt \
#--scales 0.00392,0.00392,0.00392 \
--gpu 0
echo "#####################################"
echo "QUANTIZATION COMPLETED"
echo "#####################################"
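The second error ("--gpu: not found") may be related to the commented-out options sitting between the backslash line-continuations. A minimal sketch of the shell behavior, using echo as a stand-in for decent_q:

```shell
# Minimal repro (echo stands in for decent_q): a commented-out line between
# backslash continuations swallows its own trailing backslash, ending the
# command, so the options after it are executed as a separate command.
cat > /tmp/continuation_repro.sh <<'EOF'
echo quantize \
  --method 1 \
  #--calib_iter 100 \
  --gpu 0
EOF
sh /tmp/continuation_repro.sh 2>&1 || true
# → prints "quantize --method 1", then an error mentioning "--gpu"
```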
I've run the decent_q inspect command; the following is the result.
Op types used: 431 Const, 136 Identity, 23 BiasAdd, 23 Conv2D, 22 FusedBatchNorm, 22 Relu, 4 ConcatV2, 4 MaxPool, 4 Mul, 4 ResizeNearestNeighbor, 4 Shape, 4 StridedSlice, 1 Placeholder, 1 Sigmoid
Found 1 possible inputs: (name=input_1, type=float(1), shape=[?,1024,1024,6])
Found 1 possible outputs: (name=conv2d_23/Sigmoid, op=Sigmoid)
What could be the problem? I am not sure about the input_nodes and output_nodes in the decent script; I only entered the values from the decent_q inspect result.
Any help is appreciated!