Visitor (registered 09-13-2018)

Vivado simulation fatal error when simulating large datasets

Hi Forum,

I have a simulation where I load a large dataset into a Verilog memory array (i.e. reg [n:0] memory_array [0:m]) where n is >200,000. I do not need to log this data; I'm just using it to simulate a RAM. When I simulate with a smaller n, the simulation finishes normally. However, when n gets large, Vivado's memory usage climbs to around 59 GB and then I see the following error:

FATAL_ERROR: Vivado Simulator kernel has discovered an exceptional condition from which it cannot recover. Process will terminate.

I'm simulating on a machine that has 96 GB of RAM, so the machine itself is not running out of physical memory; I'm fairly certain this issue is with Vivado/xsim itself.
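
For reference, the setup is roughly like the sketch below; the parameter values, the file name, and the use of $readmemh are placeholders rather than my actual testbench:

// Stripped-down sketch of the memory model (sizes and file name are placeholders).
module tb_big_mem;
  localparam integer N = 200_000;  // each word is N+1 bits wide, as in reg [n:0]
  localparam integer M = 1023;     // last address, as in [0:m]

  reg [N:0] memory_array [0:M];    // the RAM being modeled

  initial begin
    // The whole dataset is loaded in one shot at time zero.
    $readmemh("dataset.hex", memory_array);
  end
endmodule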

Is there some maximum amount of data I can store in a Verilog array? Should I split the array up into different variables? Worst case, I can batch load small regions of the memory space, but I would rather keep my original simulation methodology.
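
To be concrete, by "batch load" I mean something like the sketch below, where only one region of the address space is resident at a time; the region size and file names are made up for illustration:

// Sketch of the batch-load fallback: keep only one region of the RAM in the
// simulator at a time and reload it between test phases.
module tb_batched;
  localparam integer N      = 200_000;  // each word is N+1 bits wide
  localparam integer REGION = 256;      // entries held at any one time

  reg [N:0] region_mem [0:REGION-1];    // only the active region is resident

  initial begin
    // Load and exercise one region at a time instead of the whole dataset.
    $readmemh("region0.hex", region_mem);
    // ...run the stimulus that touches region 0...
    $readmemh("region1.hex", region_mem);
    // ...run the stimulus that touches region 1...
  end
endmodule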

Any ideas?

-Jack

2 Replies
Visitor (registered 09-13-2018)

This is the version of Vivado I'm using: Vivado v2018.2_AR71275_op (64-bit)

Xilinx Employee (registered 07-16-2008)

If you still see the fatal error after upgrading to 2018.3, please provide a test case and we can file a CR to have it investigated.
