02-28-2021 07:32 AM - edited 02-28-2021 07:37 AM
Hi,
I am starting a project in which I want to implement the DeepSpeech 2 neural network on ZCU104 using Vitis AI Runtime (VART) flow.
I am planning to use the PyTorch implementation of this network here, and after reading the model structure I've noticed that an important part of this network is an RNN (simple RNN or GRU). However, the Vitis AI User Guide (page 67) says: "The model to be quantized should include forward method only."
So I wanted to know whether that means I cannot quantize DeepSpeech 2 using the VAI quantizer. If so, what possible solutions do you see for this issue? And if not, what exactly does this limitation refer to?
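For reference, here is how I currently read that constraint (just my guess, with purely illustrative module names): the quantizer appears to trace the network through its forward() method, so any extra inference entry point would be invisible to it.

```python
import torch
import torch.nn as nn

class QuantFriendly(nn.Module):
    """All inference computation lives in forward() -- hypothetical example."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 32, kernel_size=3, padding=1)
        self.fc = nn.Linear(32 * 28 * 28, 10)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        return self.fc(torch.flatten(x, 1))

class QuantUnfriendly(nn.Module):
    """Defines a second inference entry point besides forward() -- my guess
    at what the user guide is warning against."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(784, 10)

    def forward(self, x):
        return self.fc(torch.flatten(x, 1))

    def predict(self, x):  # extra method the quantizer would not trace
        return self.forward(x).softmax(dim=1)
```

Is that the right interpretation?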
Also: I'm very new to Vitis AI and would really appreciate any suggestions or recommendations on any step of the VART workflow.
Many Thanks,
Ali
02-28-2021 07:01 PM
Please refer to the RNN quantizer at this link:
https://github.com/Xilinx/Vitis-AI/tree/master/tools/RNN
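For comparison, the standard CNN-side PyTorch quantization flow from the VAI examples looks roughly like the sketch below. The RNN quantizer linked above has its own workflow, exact API names vary slightly between releases, and the model/data-loader names here are placeholders:

```python
import torch
from pytorch_nndct.apis import torch_quantizer

model = MyModel().eval()                 # placeholder float model
dummy_input = torch.randn(1, 1, 28, 28)  # shape depends on your model

# 1. Calibration pass: collect activation statistics
quantizer = torch_quantizer("calib", model, (dummy_input,))
quant_model = quantizer.quant_model
for images in calib_loader:              # placeholder calibration loader
    quant_model(images)
quantizer.export_quant_config()

# 2. Test/deploy pass: evaluate and export the quantized model
quantizer = torch_quantizer("test", model, (dummy_input,))
quant_model = quantizer.quant_model
quant_model(dummy_input)
quantizer.export_xmodel(deploy_check=False)  # name may differ by VAI release
```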
Hope it helps.
03-01-2021 02:37 AM
Thank you for your reply,
I've already read that documentation, but I didn't find any suggestions for RNN implementation on MPSoCs. I just wanted to confirm that there is currently no way of efficiently implementing RNNs on the DPUv2, so I can decide whether I need to change my target hardware.
Thanks,
Ali
03-01-2021 06:11 PM
The RNN tool is evolving rapidly. But so far, from the docs we know that this toolchain only supports standard LSTM, and the generated instructions can only be deployed on the U25 and U50 cards.
So I believe it is not supported on MPSoC yet. I will check the timeline and get back to you.