09-11-2020 07:33 PM
I am trying to map the ip2intc_irpt pin on an AXI EthernetLite IP to an external pin so I can use it as a pulse to capture timing information on an o-scope. The evaluation board I am using is a Digilent Arty A7-35T, and I am connecting the interrupt line to pin D13 (connector JA, pin 7).
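For reference, routing a block-design output like this to a physical pin needs a matching entry in the project's XDC constraints file. A minimal sketch of what that constraint might look like on the Arty A7 is below; the port name `eth_irq_out` is an assumption, so substitute whatever name the external port was given in the block design wrapper:

```tcl
## Hypothetical constraint for probing the EthernetLite interrupt on
## Pmod JA pin 7 (package pin D13 on the Arty A7). The port name
## "eth_irq_out" is an assumption -- match it to your HDL wrapper.
set_property -dict { PACKAGE_PIN D13 IOSTANDARD LVCMOS33 } [get_ports { eth_irq_out }]
```

If the port name in the XDC does not exactly match the top-level port, Vivado will leave the pin unconstrained, which is worth ruling out first.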
The I/O mapping is shown in both Synthesis and Implementation, and I can see the interrupt in the xparameters.h file. I have tailored the lwIP template project to reach this state. In the MicroBlaze code I enable the Global Interrupt Enable bit as well as the Receive and Transmit Interrupt Enable bits in their respective registers. I can set up a TCP/IP session via Tera Term and see the project execute as I expect. The only exception to proper behavior is seeing pin D13 move from 0V on the o-scope.
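For concreteness, the three enable bits described above can be sketched as direct register writes. The offsets and masks below are quoted from the Xilinx `axi_ethernetlite` low-level driver header (`xemaclite_l.h`) as I recall them, so verify them against the copy in your BSP; the function takes a plain pointer so the sketch is host-testable, whereas on the MicroBlaze the base address would come from xparameters.h (e.g. the `XPAR_..._BASEADDR` define for your instance) and the accesses would typically go through `Xil_Out32()`/`Xil_In32()`:

```c
#include <stdint.h>

/* Register offsets and bit masks per the axi_ethernetlite driver
 * header xemaclite_l.h (quoted from memory -- verify in your BSP). */
#define XEL_GIER_OFFSET 0x07F8u /* Global Interrupt Enable register  */
#define XEL_TSR_OFFSET  0x07FCu /* TX ping-buffer status/control reg */
#define XEL_RSR_OFFSET  0x17FCu /* RX ping-buffer status/control reg */

#define XEL_GIER_GIE_MASK    0x80000000u /* global interrupt enable  */
#define XEL_TSR_XMIT_IE_MASK 0x00000008u /* TX interrupt enable      */
#define XEL_RSR_RECV_IE_MASK 0x00000008u /* RX interrupt enable      */

/* Set the GIE bit plus the TX and RX interrupt-enable bits.
 * `base` stands in for the core's AXI base address. */
static void emaclite_enable_irqs(volatile uint8_t *base)
{
    volatile uint32_t *gier = (volatile uint32_t *)(base + XEL_GIER_OFFSET);
    volatile uint32_t *tsr  = (volatile uint32_t *)(base + XEL_TSR_OFFSET);
    volatile uint32_t *rsr  = (volatile uint32_t *)(base + XEL_RSR_OFFSET);

    *gier |= XEL_GIER_GIE_MASK;     /* enable interrupts globally   */
    *tsr  |= XEL_TSR_XMIT_IE_MASK;  /* enable TX-complete interrupt */
    *rsr  |= XEL_RSR_RECV_IE_MASK;  /* enable RX interrupt          */
}
```

Note that with all three bits set, ip2intc_irpt should assert on every received frame, so even background broadcast traffic ought to produce visible pulses on the scope.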
Attached is a screenshot of the block diagram showing the methodology described. Any ideas?
09-14-2020 08:50 AM
Hi @douglas.yamamoto ,
I am not sure if I understand the issue - "The only exception to proper behavior is seeing pin D13 move from 0V on the o-scope. "?
Do you mean when pin D13 moves above 0V, the application does not behave/run as expected?
What is the unexpected behavior? Have you been able to trace it down through SDK?
09-14-2020 09:16 AM
To clarify, D13 does not move from 0V, which makes it appear as though an interrupt is never getting triggered. I have yet to set any breakpoints in SDK; are there any suggestions for a good place in the code?