Visitor maanrl
Registered: 06-14-2011

AXI-DDR3 controller on Virtex-6 simulation problem

Hi everyone,

 

I am trying to simulate the Xilinx DDR3 AXI controller together with the core I am working on. My core implements an AXI master interface. I generated a basic "processorless" system as described in AR#37856 and added my core and Micron's model of the memory (the SODIMM used on the ML605 board), so basically what I have is:

 

DDR Model <---> DDR Controller <---> AXI Interconnect <---> My core

 

I am configuring the controller as explained in the ML605 MIG Design Creation document (XTP047). I also inserted the ARM assertion package between my core and the AXI Interconnect to verify that my core speaks the AXI protocol correctly. The tools I am using are:
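For reference, the kind of handshake check the assertion package flags is along these lines (a hypothetical SystemVerilog sketch, not the actual ARM package code; `aclk`, `aresetn`, `awvalid`, `awready`, and `awaddr` are assumed AXI4-style signal names in my testbench):

```systemverilog
// Sketch of one AXI write-address handshake rule: once AWVALID is
// asserted, it must stay high and AWADDR must stay stable until
// AWREADY is seen. My traces pass checks of this kind.
property awvalid_stable_until_ready;
  @(posedge aclk) disable iff (!aresetn)
    (awvalid && !awready) |=> awvalid && $stable(awaddr);
endproperty

assert property (awvalid_stable_until_ready)
  else $error("AWVALID dropped or AWADDR changed before AWREADY");
```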

 

ISE and XPS 12.4

Modelsim 6.5c

Micron DDR3 model versions 1.60 (used by CoreGen example) and 1.67 (most recent from Micron's website)

 

The simulation consists of sending several write bursts to the DDR. The problem is that the simulation always aborts with the following message:

 

# axisubs_tb.u4_ddr.u0.cmd_task: at time 83591 ns ERROR: ODT must be off prior to Load Mode
# axisubs_tb.u4_ddr.u0.cmd_task: at time 83591 ns ERROR: Load Mode Failure.  All banks must be Precharged.
# Break in Task cmd_task at ../../../../rtl/DDR3_V1.60/ddr3_model.v line 973

 

The only thing I have noticed is that the DDR cs_n signal goes to an unknown ("X") state, as you can see in the attached picture. I tried the experiment with different burst lengths, but the only difference is how long the simulation runs before aborting: a single 256-beat burst fails sooner (even before the burst completes) than several 10-beat bursts do. I don't think the problem is on the AXI side of the controller, because I can't find any inconsistency in those signals; the Write Address, Write Data, and Write Response channels all seem to work properly. Xilinx documentation says the AXI-based cores should handle INCR bursts of up to 256 beats.
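Since the symptom is cs_n going unknown, a small testbench monitor can pinpoint the exact time the first "X" appears, so I can then inspect who is driving the net at that moment (a sketch; `ddr3_cs_n` and `clk` are assumed names for the chip-select net and sampling clock as wired in my testbench):

```verilog
// Stop the simulation at the first moment the DDR chip select
// carries an unknown value, so the contending driver can be
// inspected (e.g. with Modelsim's "drivers" command).
always @(posedge clk) begin
  if (^ddr3_cs_n === 1'bx) begin
    $display("%t: cs_n went unknown (X)", $time);
    $stop;
  end
end
```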

 

Does anyone know where I should look for the problem?

 

Attachment: long_simulation.bmp