Explorer
Registered: ‎05-03-2018

DNNC support for maximum number of channels

Hello,

        When I use dnnc to compile my model, I encounter some problems. To isolate the issue, I added the network layer by layer, and dnnc finally reported an error at a certain layer. That layer is shown below: it has 1152 output channels and a kernel size of 5. I would like to ask whether DNNDK has a limit on the maximum number of output channels.


layer {
  name: "ConvLayer_56"
  type: "Convolution"
  bottom: "Layer_55"
  top: "Layer_56"
  convolution_param {
    num_output: 1152
    bias_term: false
    pad: 2
    kernel_size: 5
    group: 1152
    stride: 1
    dilation: 1
  }
}

Xilinx Employee
Registered: ‎07-16-2008

You may want to take a look at the features and parameters supported by DPU IP.

https://www.xilinx.com/support/documentation/ip_documentation/dpu/v3_1/pg338-dpu.pdf

See page 21, Table 7.
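Before running dnnc, you can sanity-check a layer's parameters against that table. A minimal sketch, assuming placeholder limits (the function and every limit value below are my own stand-ins, not figures from PG338; substitute the real ranges from Table 7 for your DPU configuration):

```python
# Pre-flight check of a convolution layer against a DPU support envelope.
# All limit values are PLACEHOLDERS (assumptions), not the real Table 7
# numbers from PG338 -- fill in the ranges for your DPU configuration.

def check_conv_layer(name, num_output, kernel_size, stride, dilation,
                     kernel_range=(1, 16),    # placeholder bound
                     stride_range=(1, 4),     # placeholder bound
                     dilation_range=(1, 4),   # placeholder bound
                     max_channels=4096):      # placeholder bound
    """Return a list of human-readable violations (empty if none)."""
    problems = []
    if not kernel_range[0] <= kernel_size <= kernel_range[1]:
        problems.append(f"{name}: kernel_size {kernel_size} outside {kernel_range}")
    if not stride_range[0] <= stride <= stride_range[1]:
        problems.append(f"{name}: stride {stride} outside {stride_range}")
    if not dilation_range[0] <= dilation <= dilation_range[1]:
        problems.append(f"{name}: dilation {dilation} outside {dilation_range}")
    if num_output > max_channels:
        problems.append(f"{name}: num_output {num_output} exceeds {max_channels}")
    return problems

# The failing layer from this thread passes with the placeholder limits,
# which suggests the channel count alone may not be the problem:
print(check_conv_layer("ConvLayer_56", num_output=1152,
                       kernel_size=5, stride=1, dilation=1))  # -> []
```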

-------------------------------------------------------------------------
Don't forget to reply, kudo, and accept as solution.
-------------------------------------------------------------------------
Explorer
Registered: ‎05-03-2018

I have seen that document before. The supported parameters are listed in the PDF, but when I have 1152 output channels in two consecutive layers, dnnc reports an error, so I don't know whether there is a limit on the combination of kernel_size and num_output.

Thank you for your help. The error occurs when I add the 48th layer. The relevant layers are:


layer {
  name: "ConvLayer_47"
  type: "Convolution"
  bottom: "Layer_46"
  top: "Layer_47"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 1
    decay_mult: 0
  }
  phase: TRAIN
  convolution_param {
    num_output: 1152
    bias_term: true
    pad: 0
    kernel_size: 1
    group: 1
    stride: 1
    dilation: 1
  }
}
layer {
  name: "RELULayer_47"
  type: "ReLU"
  bottom: "Layer_47"
  top: "Layer_47"
  phase: TRAIN
}
layer {
  name: "ConvLayer_48"
  type: "Convolution"
  bottom: "Layer_47"
  top: "Layer_48"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 1
    decay_mult: 0
  }
  phase: TRAIN
  convolution_param {
    num_output: 1152
    bias_term: true
    pad: 2
    kernel_size: 5
    group: 1152
    stride: 1
    dilation: 1
  }
}
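One detail that may matter here: ConvLayer_48 sets group equal to num_output, which makes it a depthwise convolution rather than a standard one, and the DPU documentation may describe depthwise support with a narrower envelope than ordinary convolution. A quick way to classify the layers in this thread (a sketch; the function name is my own):

```python
def conv_kind(num_output, group):
    """Classify a Caffe convolution layer by its group parameter."""
    if group == 1:
        return "standard"    # every filter sees all input channels
    if group == num_output:
        return "depthwise"   # one filter per input channel
    return "grouped"         # something in between

print(conv_kind(1152, 1))     # ConvLayer_47 -> "standard"
print(conv_kind(1152, 1152))  # ConvLayer_48 -> "depthwise"
```

So the two consecutive 1152-channel layers are not the same kind of convolution, which could explain why only the second one trips dnnc.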
