
“How Shall I Start, Master?”



Ah yes, that is always a good question.  And like many good questions, there are many good answers, but let us look at one very specific question and answer.


“Why is Programming in HDL Different from Programming a Microprocessor?”


That is very specific, and a very good question.  It seems you are learning.


A microprocessor executes instructions sequentially, one after another.  The only exception is an interrupt, which redirects execution from whatever the processor was doing to something new.  When the interrupt has been serviced, the code returns to what was interrupted, and continues to execute the instructions where it left off.


This is the way of the processor. There are multi-cores and graphics processors that have many copies of processors, but each one mindlessly marches through its code, step by step.  It only knows the instant of “NOW,” and it only knows the universe of “THIS RIGHT HERE.”


A hardware description language (HDL), like Verilog or VHDL, is different.  While a program describes an algorithm or a task, an HDL describes a circuit: a hardware design that forms a machine to solve the algorithm or work on the task.


A program is something like:


Begin program;

$String = "Hello World";

Print ($String);

End program;


(I have not written the program in any real language, just written it in such a way that it looks sort of like “C” code).


A compiler converts this program into something that executes on the microprocessor, replacing something simple like a print statement with what is needed to actually get something printed on a printer.  This may be very complex, and it will hide all the details (like when to squirt the ink on the ink-jet) so the programmer doesn’t need to know, or care.

A hardware description of a circuit is something like:


Begin process;

Do forever;

  When clock rises from 0 to 1;

    Read the data from the A/D converter;

    Remove all the noise;

    Low-pass filter;

    Send the result to the speaker;

  End do forever;

End process;


Again, I have not written this in any real language.  But, one sees that a program and a hardware description may each be used to solve a problem, each in its own proper syntax, and each in its own way.

But, the program does one thing, and one thing only, at each instant in time.  The HDL, by contrast, must describe everything.  If it doesn’t, it won’t work!  Did we think of everything? No.


The hardware is able to perform many things simultaneously.  For example, in the example above, all the “instructions” in the “Do forever” get executed AT THE SAME TIME when the clock rises from 0 to 1.  There is no “do this, and then do that.”  Instead, the A/D is read, the noise is removed, and the signal is low-pass filtered all at the same time. Right away we see a problem! At the same time, a result was sent to the speaker. What result?
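A fragment of real Verilog makes this concrete (a sketch only; the signal names are invented for illustration).  Every nonblocking assignment below fires on the same clock edge, and each right-hand side sees only the values that existed before that edge:

```verilog
// All four assignments execute AT THE SAME TIME on the clock edge.
// Each right-hand side uses the value from BEFORE the edge, so
// "speaker" gets whatever "filtered" held on the PREVIOUS cycle.
always @(posedge clk) begin
    sample   <= adc_data;   // read the A/D converter
    denoised <= sample;     // noise removal would go here
    filtered <= denoised;   // low-pass filtering would go here
    speaker  <= filtered;   // "What result?" Last cycle's, not this one's.
end
```

There is no top-to-bottom ordering here; the text order of the assignments does not matter to the hardware that gets built.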


We need to resolve the sequence (because a sequence is needed). Such is the way of HDL…


Begin process;

Do forever;

  When clock rises from 0 to 1;

    Read the data from the A/D converter and place in A/D FIFO;

    Read out of A/D FIFO and remove all the noise and place in filter FIFO;

    Read filter FIFO and Low-pass filter and place result in speaker FIFO;

    Read speaker FIFO and send the result to the speaker;

  End do forever;

End process;


The code is not compiled, but synthesized, which means that it is converted into real hardware:  flip-flops, gates, clock buffers, and I/O pins.  The design is then placed and routed.  Finally, depending on the targeted device, the design is represented by a mask set (for an ASIC), or by a bitstream (for an FPGA device).


Now we have a streaming system (what goes in continuously also comes out), where on each clock all the processes execute.  But each process has an associated First In, First Out (FIFO) buffer.  Everything is going where it should, when it should.
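In Verilog, this staged structure might look like the sketch below.  All names are invented for illustration, and each “FIFO” has been collapsed to its simplest form, a single register between stages; a real design would use proper FIFOs with depth and ready/valid handshaking, and real filter logic in place of the pass-through stages:

```verilog
// Hypothetical streaming pipeline: data marches one stage per clock,
// so every stage works on a different sample at the same time.
module stream_pipe (
    input  wire        clk,
    input  wire [15:0] adc_data,    // from the A/D converter
    output reg  [15:0] speaker_out  // to the speaker (DAC)
);
    reg [15:0] stage1, stage2, stage3;

    always @(posedge clk) begin
        stage1      <= adc_data;  // stage 1: capture the A/D sample
        stage2      <= stage1;    // stage 2: noise removal would go here
        stage3      <= stage2;    // stage 3: low-pass filter would go here
        speaker_out <= stage3;    // stage 4: drive the speaker
    end
endmodule
```

A sample read on one clock reaches the speaker four clocks later, but a new sample enters every clock, so the throughput is one sample per clock.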


“But master, that is a lot harder than programming a processor!”


Ah yes.  It makes your microprocessor-trained mind hurt, doesn’t it?  That is the way of hardware: one must become aware of all of time, and all of space, and learn to describe both to get what you want done.


I hope you have enjoyed this little flight of fancy.  I was inspired by all the programmers who bash their heads against HDL, only to find out that while it may look like a program, it doesn’t seem to act like one.


To learn HDL, there are courses that Xilinx offers, as well as courses offered at universities and schools.

There are many on-line tutorials, and a number of good books on the subject.  One of my favorites is FPGA Prototyping by VHDL Examples, by Dr. Pong P. Chu.

Great blog, Austin. I feel I was very lucky to study logic design before the advent of hardware description languages. In my opinion the real problem with many people starting out with HDLs today is that they don't first have a solid background in logic design (schematics are usually a better starting point) before they are exposed to the HDL. So many posts in these forums result in comments like "think hardware - what sort of circuit do you expect XST to build from this description?" While I'm not ready to set aside text-based design entry and go back to pencil on vellum, I think it's important to learn the basic concepts of logic design before "programming" an FPGA. Regards, Gabor


Yes.  But, what about those who are just starting out?  No 7400 gates, no "superstrips," no wire-wrap, no logic analyzers with hundreds of wires.  How does one learn to "think" in HDL?


Probably the biggest barrier to the use of FPGA devices, and an unsolved problem to be sure.


Do schools teach logic?  Do they teach synchronous design?  Yes, and yes.  And they use FPGA devices for their labs.  It seems to me we have those who are retraining themselves, or just starting out, who have not learned the two things they need before embarking on coding in an HDL.


So, to all who are starting:  first go learn binary logic.  Then, go learn synchronous design.  Finally, go learn a hardware description language (Verilog or VHDL).  What can I say?  If it were easy, the pay wouldn't be as good as it is.



Great job, Austin.


This is the best synthesis about FPGA vs Microprocessor differences.


I know that it is very hard for software developers when they have to work with FPGAs, because an FPGA is not the same as a microprocessor. But in my opinion, this article could help them clarify some basic concepts.





Excellent job, Austin.

All the time I try to explain this concept to software developers.

This post will help them to understand more.




I find this blog interesting from the other side of the fence. I believe that I can cope with most aspects of hardware design and synchronous FPGA coding in VHDL. Where I now struggle is, with the advent of embedded processors and the SDK, I find I am expected to produce operational software to run on the embedded processor as quickly as I can produce the underlying hardware platform.


Unfortunately for me (arguably my fault, too), I have never developed any sense of software design, so coding even the smallest program in C takes me exponentially longer than firmware. Often I try to utilise the examples that Xilinx provide (for polling the XPS SPI, for example) and modify them as best I can without really understanding the underlying structure of C (syntax is one thing but the structure is another).


I am frequently stunned by the number of clearly non-hardware people who post on these forums but the lack of non-software people who post on similar software help forums.


This is my headache. :)