Registered: 07-19-2020

RISC-V Simulation

How can I simulate a RISC-V core and its extensions?

I have been given a task by my professor to simulate the BitManip extensions of a RISC-V core for a computer-vision project.
Here is the reference paper he provided me:

Can anybody guide me through the process?


Registered: 05-21-2015


As someone who has built their own CPU, my answer is: it really depends.  It depends upon the CPU.  It depends upon its capabilities.  Sorry, but "RISC-V" by itself isn't specific enough to answer those questions.  You may need to look into the specific CPU's implementation.

In my case, when I simulate the ZipCPU (my own creation), I have to simulate an entire ecosystem as well.  I'm therefore simulating the CPU, serial port, flash memory, and sometimes more--SD card, video, Ethernet, you name it.  I've posted many of these simulations on GitHub and discussed most of them on my blog.  For example, here is a design that originally contained a ZipCPU but now contains a PicoRV32 RISC-V CPU.

Most of my simulations these days have one of two modes.

  1. The program is loaded before simulation time starts.  In this case, my simulation script loads the "program" into the simulated flash (see my website for a discussion) and then the simulation starts.  The CPU reads the program from flash, and we're off.  Sometimes I load direct to block RAM.  The ELF file loader will read the start address from the program file to know where to start the CPU.  This method works great when I can trust the ELF loader to do its thing.
  2. If I can't trust the loader to be equivalent to starting with a program loaded, then I'll start with the CPU halted and interact with the simulation as I load the program into the SDRAM/block RAM/flash.  Once the loader completes, I'll start the CPU and see what it does.  Along the way, I can capture a VCD file describing *EVERYTHING* that happens--very useful for debugging, but it's also time-consuming, and the trace can easily consume 5-10 GB of disk space.

When booting from the SD card, I typically point the simulation at a file to use as an SD-card disk image.  The simulation then reads and writes that "disk" as the CPU requests.  Sure, it's slow, but it works.  When I'm not running the simulation, I can mount the image on my system using a Linux loopback device and adjust any files on it as necessary.

I've done most of my simulation work to date using Verilator.  I imagine Vivado has some simulation capabilities built in, but most of my board-simulation requirements call for C/C++ models, not RTL models, and I haven't (yet) figured out how (or whether) Vivado supports integrating C/C++ simulation models, so I haven't tried that approach.  To date, Verilator has met my needs quite well.

