
Adventurer gin_xil
Registered: ‎01-19-2018

xsimk makes my workstation run out of memory

Vivado: 2018.2

OS: Win7 Pro, SP1, 64-bit

Processor: i7-4770 CPU @ 3.40GHz

RAM: 16.0 GB

 

I kept a simulation running overnight and this is what I observed in the morning.

 

[Screenshot attached: Viv2018.2_xsim_stop.png]

 

Related thread: https://forums.xilinx.com/t5/Simulation-and-Verification/Any-running-hours-limitation-for-xsim-in-Vivado-2018-2/td-p/903140

 

Is it a bug? What is the solution for running xsimk long-term?

 

19 Replies
Scholar drjohnsmith
Registered: ‎07-09-2009

Re: xsimk makes my workstation run out of memory

Is it a bug?

Yes.

Is it Xilinx? I'd say no.

I have simulations of big designs that run on Xeon machines and take 3 to 5 days to run,

      BUT they don't run Windows 7, they run Linux,

           they do not run other programs like anti-virus,

                and they have 64 GB of memory.

Remember: a long time before you run out of memory, the machine is thrashing the hard drive and is very slow.

I'm afraid the most likely problem you have is your IT department, and the fact that you're running multiple disconnected programs on the one machine.

 

 

 

Adventurer gin_xil
Registered: ‎01-19-2018

Re: xsimk makes my workstation run out of memory

@drjohnsmith,

Thanks.

 

Would anyone from Xilinx care to take up this issue?

Scholar richardhead
Registered: ‎08-01-2012

Re: xsimk makes my workstation run out of memory

Are you sure it's not a problem in the testbench itself?

Are you using access types in the VHDL? Are you deallocating them when you've finished with them, or just calling new on an old object? Do you have any functions that do:

 

return var.all;

 

When using access types it can be easy to create memory leaks, as VHDL has no garbage collection. A return like the one above copies the value out but leaves the allocated object alive, even though you no longer have any way to access it.
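As a hedged illustration of the pattern being described, here is a minimal VHDL sketch (the type and function names are invented for the example, not taken from the poster's design):

```vhdl
-- Hypothetical sketch of the leak pattern described above; place in a
-- package body or a process declarative region. Names are invented.
type msg_ptr is access string;

-- Leaky: buf.all is copied into the return value, but the object that
-- buf points to is never deallocated -- one leak per call.
impure function make_message(n : natural) return string is
    variable buf : msg_ptr;
begin
    buf := new string'("value = " & integer'image(n));
    return buf.all;
end function;

-- Leak-free: copy the value out, deallocate, then return the copy.
impure function make_message_ok(n : natural) return string is
    variable buf : msg_ptr := new string'("value = " & integer'image(n));
    variable res : string(buf.all'range) := buf.all;
begin
    deallocate(buf);
    return res;
end function;
```

Since VHDL has no garbage collection, every `new` that is not matched by a `deallocate` stays allocated for the rest of the run; over an overnight simulation this adds up.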

Moderator bandi
Registered: ‎09-15-2016

Re: xsimk makes my workstation run out of memory

Hi @gin_xil,

 

Can you please share the archived project so we can check it at our end? Please let me know if you want me to send you an FTP link to share the project.

 

 

Thanks & Regards,
Sravanthi B
----------------------------------------------------------------------------------------------
Kindly note: please mark the answer as "Accept as solution" if the information provided is helpful.

Give kudos to a post which you think is helpful.
----------------------------------------------------------------------------------------------
Adventurer gin_xil
Registered: ‎01-19-2018

Re: xsimk makes my workstation run out of memory

@richardhead,

 

Nothing like that.

Adventurer gin_xil
Registered: ‎01-19-2018

Re: xsimk makes my workstation run out of memory

@bandi,

 

Not possible to share the project.

 

Any more info?

 

Scholar richardhead
Registered: ‎08-01-2012

Re: xsimk makes my workstation run out of memory

@gin_xil

 

Without sharing the project, or at least an example that also exhibits the behaviour, we can only guess as to what is happening.

My only guess is that you are logging a lot of signals during the simulation.

Adventurer gin_xil
Registered: ‎01-19-2018

Re: xsimk makes my workstation run out of memory

@richardhead,

 

Without sharing the project, or at least an example that also exhibits the behaviour, we can only guess as to what is happening.

Don't you work on projects under NDA?

You can ask for other info which is possible/realistic to provide! I will answer happily.

Scholar richardhead
Registered: ‎08-01-2012

Re: xsimk makes my workstation run out of memory

@gin_xil

 

Yes I do. But I also understand that without direct information, or an example that shows the problem, it can be difficult to diagnose the issue. As others have pointed out, they have no issues with large or long simulations.

 

So on to some probing about the test:

How is the data generated in the testbench?

How is the data driven in the testbench?

I notice you're dealing with IPv6 packets - how do you control the variable length on data generation?

Are you logging all signals from every entity, or just the DUT? Is this really necessary?

 

 

Scholar drjohnsmith
Registered: ‎07-09-2009

Re: xsimk makes my workstation run out of memory

Is this thread related to the other thread in the forums about running out of disc space?

 

Adventurer gin_xil
Registered: ‎01-19-2018

Re: xsimk makes my workstation run out of memory

@drjohnsmith,

 

Yes, please see my 1st post! I have already mentioned that.

Adventurer gin_xil
Registered: ‎01-19-2018

Re: xsimk makes my workstation run out of memory

@richardhead,

 

How is the data generated in the testbench? How is the data driven in the testbench?

I have a clock and fixed-size arrays which are driving RMII o/p ports. There is an iteration count which determines the number of frames generated.

Note that the DUT has 36 ports, so in the TB 36 instances of the RMII frame generator are running. Each generator is pushing about 100 frames.

 

I notice you're dealing with IPv6 packets - how do you control the variable length on data generation?

It is easy to generate in a for loop, using a generic parameter and making sure that 46 <= frame_length <= 1500.
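A minimal sketch of that kind of bounded-length frame generation (entity, port and constant names are invented for illustration, not the poster's actual code; assumes `ieee.math_real.uniform` for the length draw):

```vhdl
-- Hypothetical sketch: an RMII frame generator whose payload length is
-- bounded by generics, as described above. All names are invented.
library ieee;
use ieee.std_logic_1164.all;
use ieee.math_real.all;

entity rmii_frame_gen is
    generic (
        MIN_LEN : natural := 46;    -- minimum Ethernet payload length
        MAX_LEN : natural := 1500   -- maximum Ethernet payload length
    );
    port (
        clk  : in  std_logic;
        txd  : out std_logic_vector(1 downto 0);  -- RMII data pair
        txen : out std_logic
    );
end entity;

architecture sim of rmii_frame_gen is
begin
    gen : process
        variable s1, s2 : positive := 42;  -- random seeds
        variable r      : real;
        variable len    : natural;
    begin
        uniform(s1, s2, r);
        -- pick a payload length with MIN_LEN <= len <= MAX_LEN
        len := MIN_LEN + natural(r * real(MAX_LEN - MIN_LEN));
        for b in 0 to len - 1 loop
            for pair in 0 to 3 loop          -- one byte = four 2-bit pairs
                wait until rising_edge(clk);
                txd  <= "00";                -- placeholder data bits
                txen <= '1';
            end loop;
        end loop;
        txen <= '0';
        wait;
    end process;
end architecture;
```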

 

Are you logging all signals from every entity, or just the DUT? Is this really necessary?

I am logging a lot of signals (though not all the signals of the DUT), because I need to be able to see where a failure occurs.

 

As mentioned before, this is a huge design which has 36 Ethernet data-logging ports; the frames are concentrated together, buffered in 2 GB of on-board RAM (MIG core used) and finally pushed out of the DUT over a Gigabit RGMII i/f. A lot happens to these frames inside this complete data path, but it is not necessary to divulge the details.

I am almost certain my DUT or TB has nothing to do with this failure.

Scholar richardhead
Registered: ‎08-01-2012

Re: xsimk makes my workstation run out of memory

But ultimately, it is about your testbench design.

I have testbenches for DUTs that use a 10Gb Ethernet interface. I can run 10000+ IPv4 packets through the DUT and out via an AXI IC BFM (which in reality connects to a 2 GB DDR) in a matter of minutes, with the sim never using more than 50 MB of RAM, and I log every signal.

I still surmise you're simulating and logging too much.

 

More questions:

 

I have a clock and fixed-size arrays which are driving RMII o/p ports. There is an iteration count which determines the number of frames generated. Note that the DUT has 36 ports, so in the TB 36 instances of the RMII frame generator are running. Each generator is pushing about 100 frames.

Are the packets generated in the testbench, or read from a file? Do you generate a whole packet in a single delta cycle and stream it in clock by clock, or do you generate a single bus transaction per clock? I ask because if you create the whole packet as a signal in a large array, that is going to require a HUGE amount of RAM compared to a variable (and then multiply this up if you have several channels running in parallel).
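To make the signal-versus-variable distinction concrete, a sketch (the type and names are invented; the declarations go in an architecture, the process in its body):

```vhdl
-- Hypothetical illustration of the point above. Inside an architecture:
type byte_array is array (natural range <>) of std_logic_vector(7 downto 0);

-- As a SIGNAL the simulator keeps the current value, the projected
-- (scheduled) value and, if logged, the full waveform history for all
-- 1500 elements -- multiplied across 36 parallel generators.
signal pkt_sig : byte_array(0 to 1499);

-- As a process VARIABLE there is a single copy, with no event
-- scheduling and no waveform history: far cheaper scratch storage.
stim : process
    variable pkt_var : byte_array(0 to 1499);
begin
    -- build pkt_var here, then drive it out one transaction per clock
    wait;
end process;
```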

 

I am logging a lot of signals (which are not all the signals of the DUT), because I should be able to see where a failure occurs.

Is this really necessary all the time? Especially if you have signals that are large arrays.

 

this is a huge design which has 36 Ethernet data-logging ports; the frames are concentrated together, buffered in 2 GB of on-board RAM (MIG core used) and finally pushed out of the DUT over a Gigabit RGMII i/f.

 

Do you really need to simulate a (slow) Gigabit controller and an (especially slow) MIG? Are you verifying the MIG/Eth controller or the DUT? Ideally you would swap these things out for BFMs. Using the timing-accurate models is hideously slow and a resource hog. If you create models of these things, you can change an hours-long sim into a minutes-long one. You can then do a final sim with the real models, or simply skip this (as I do) and put it on hardware.

Adventurer gin_xil
Registered: ‎01-19-2018

Re: xsimk makes my workstation run out of memory

@richardhead,

 

I still surmise you're simulating and logging too much.

That may well be the cause, because I am logging many, many signals.

But my colleague also simulated the same design with similar workstation resources, logging a much smaller number of signals. He too faced the same xsimk error message.

 

Are the packets generated in the testbench? or read from a file? Do you generate a whole packet in a single delta cycle and stream it in clock by clock?

I stream it clock by clock.

 

I ask because if you create the whole packet as a signal in a large array, that is going to require a HUGE amount of RAM compared to a variable.

That's a bad coding style; I am not doing that.

 

Is this really necessary all the time? Especially if you have signals that are large arrays.

You are misunderstanding or making an assumption. Those large arrays have nothing to do with my DUT. Most of the logged signals are from my DUT; only a bare minimum of my TB signals are logged.

 

Do you really need to simulate a (slow) Gigabit controller and an (especially slow) MIG? Are you verifying the MIG/Eth controller or the DUT?

Actually, the MIG interface signals are not logged (that part is working fine). I mentioned it so that you have an idea of the design. All I am interested in are the RMII and TEMAC signals at the logging interface (36 logging ports) and the outgoing Gigabit RGMII interface.

 

Thanks Richard for your efforts to help.

 

Adventurer gin_xil
Registered: ‎01-19-2018

Re: xsimk makes my workstation run out of memory

What really surprises me is the neutral attitude of Xilinx staff regarding this problem. Only one asked about sharing the project, and then there was no response. Not even an attempt from them to understand the problem.

 

And then from time to time they want us to give them feedback and rate them!

Scholar drjohnsmith
Registered: ‎07-09-2009

Re: xsimk makes my workstation run out of memory

It's no excuse, but the majority of the Xilinx people seem to work with Verilog and Linux.

Scholar dpaul24
Registered: ‎08-07-2014

Re: xsimk makes my workstation run out of memory

Another development....

 

We have set up a standalone machine with an i7 processor, 12 GB RAM and Windows 7.

It has no MS Office, antivirus or other software running that might occupy RAM.

There too, we get the error message that xsimk runs out of memory.

 

--------------------------------------------------------------------------------------------------------
FPGA enthusiast!
--------------------------------------------------------------------------------------------------------
Visitor xion_pecher
Registered: ‎12-15-2017

Re: xsimk makes my workstation run out of memory

Hello Sravanthi B,

we have the same problems with the simulation in both versions (.1 and .2) of Vivado 2018.

I attached a small "Counter" project with a testbench, ready to run.

 
