11-01-2017 07:03 AM
Sorry in advance for all the detail; I tried to keep it as concise as possible. Thanks in advance for your time.
Sequence in "Test mode 1":
In "test mode 1", the PC waits for each reply (or for a timeout if no reply arrives) before sending the next UDP frame.
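The stop-and-wait loop described above can be sketched in Python roughly like this (a minimal illustration, not the poster's actual script; the function name, payload, and timeout value are my own assumptions):

```python
import socket

def stop_and_wait(dest, payload, count, timeout=0.5):
    """Send `count` UDP frames one at a time, waiting for each reply
    (or for the timeout to expire) before sending the next frame."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sent = received = 0
    for _ in range(count):
        sock.sendto(payload, dest)
        sent += 1
        try:
            sock.recvfrom(2048)  # block until a reply or the timeout
            received += 1
        except socket.timeout:
            pass  # no reply within the timeout; move on to the next frame
    sock.close()
    return sent, received
```

Because each frame is acknowledged (or timed out) before the next one is sent, this mode never overruns the receiver, which is one reason it behaves differently from the streaming mode described next.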
Sequence in "Test mode 2":
In "test mode 2", massive packet loss (between 1% and 60%) occurs. The problem is most obvious for small frames (e.g. 14 bytes), for which the loss is between 30% and 60%. For large frames (1400 bytes), the packet loss ratio is around 1%.
Note: the Python scripts measure the number of frames sent/received. I also monitor network traffic using the Windows Task Manager (where, in this case, I see big differences between the PC->KC705 and KC705->PC rates).
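The loss ratio quoted above follows directly from the script's two counters; a minimal sketch of that computation (the function name is mine, not from the original scripts):

```python
def loss_rate(sent, received):
    """Fraction of frames lost, as observed from the script's own
    sent/received counters (0.0 when nothing has been sent yet)."""
    if sent == 0:
        return 0.0
    return (sent - received) / sent
```

For example, 100 frames sent with only 40 replies received corresponds to the 60% loss reported for small frames.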
This is where I need some help :)
Thanks a lot for your help
11-17-2017 08:04 AM
Still working on this?
With "some MAC code", are you saying that your MAC code is home-grown, or are you using a Xilinx IP? The MAC IP should have come with I/O constraints for your platform.
Are there any stats being acquired by the FPGA MAC?
02-02-2021 06:25 AM
I am also working on a similar UDP communication project recently. If it is convenient, could you share your program? Email: email@example.com. Thank you very much.
02-02-2021 06:45 AM
Our Community Help has a tip that might help you: "If the message is older than 6-12 months, please post a new message rather than adding to the existing thread. Your inquiry will have a better chance of being picked up by an expert if it is a new topic."
I would suggest you create a new topic on the appropriate board.