11-02-2020 08:35 AM
Here is a very simplified version of an algorithm that calculates the values of the columns of an image from a polynomial. We can clearly see in the signals that the ping-pong buffering is performed correctly; however, there is a huge delay between each burst. The delays are so great that there is hardly any advantage to using dataflow with ping-pong buffering. Is there a way to reduce the delays, or is there a better method?
11-05-2020 11:58 PM
@passpass Can you elaborate on what you mean by a huge delay?
If you remove dataflow, you would see higher latency and II values. What is your expectation?
For the code you have, the II for the loops in the read and compute functions is met, and dataflow is applied to the functions.
If you could elaborate on what exactly you are looking for, I can comment on whether there's a way.
11-09-2020 11:03 AM
Sorry for the lack of explanation. Yes, I know dataflow is applied; however, there is a very large delay between each burst. I have a feeling that nothing is happening during that time and the algorithm should run much faster. If you look at the picture, you can see that no processing is done in the red areas.