01-15-2019 03:33 PM - edited 01-15-2019 03:34 PM
I have created a D-FF, run implementation, and run Post-Implementation Timing Simulation to verify it.
In the testbench, I initially added a "wait for 100 ns" at the start of the stimulus process.
I got the result shown in the figure below.
The result was correct, and it verified that the D-FF was working well.
But when I deleted the "wait for 100 ns" at the start of the stimulus process in the testbench, I got the result shown in the figure below.
The d_in transitions that occurred before 100 ns were ignored,
and only the d_in transitions that occurred after 100 ns were properly delivered to the D-FF.
The Behavioral Simulation works fine without the "wait for 100 ns",
but the Post-Implementation Timing Simulation does not work properly without it.
Why does this happen? Does the device need time to warm up, and is that time 100 ns?
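For reference, here is a minimal sketch of the kind of testbench I mean (the clock period, stimulus timing, and names here are illustrative, not my exact testbench):
library ieee;
use ieee.std_logic_1164.all;
----------------------------------------------
entity dff_tb is
end dff_tb;
----------------------------------------------
architecture sim of dff_tb is
    signal d_in, clk, d_out : std_logic := '0';
begin
    -- Device under test
    uut : entity work.dff
        port map (d_in => d_in, clk => clk, d_out => d_out);

    -- Free-running clock (10 ns period assumed for illustration)
    clk_proc : process
    begin
        clk <= '0'; wait for 5 ns;
        clk <= '1'; wait for 5 ns;
    end process;

    -- Stimulus process
    stim_proc : process
    begin
        wait for 100 ns;  -- deleting this line is the change that breaks the timing simulation
        d_in <= '1'; wait for 20 ns;
        d_in <= '0'; wait for 20 ns;
        d_in <= '1'; wait for 20 ns;
        wait;             -- end of stimulus
    end process;
end sim;
----------------------------------------------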
01-15-2019 03:50 PM - edited 01-15-2019 03:51 PM
Are there any other signals involved in the design (e.g., a reset)? If there were a reset, you would have to wait for it to settle.
Hope that helps
If so, please mark as solution accepted. Kudos also welcomed. :-)
01-15-2019 03:59 PM - edited 01-15-2019 04:00 PM
Thanks for your reply.
I don't have a reset, and there are no signals other than d_in and d_out.
I still can't figure out the reason.
My code is below:
library ieee;
use ieee.std_logic_1164.all;
use work.all;
---------------------------------------------
entity dff is
    port(
        d_in  : in  std_logic;
        clk   : in  std_logic;
        d_out : out std_logic
    );
end dff;
----------------------------------------------
architecture behavioral of dff is
    signal s_in  : std_logic;
    signal s_out : std_logic;
begin
    -- Register the input on the rising clock edge
    process(clk)
    begin
        if (rising_edge(clk)) then
            s_out <= s_in;
        end if;
    end process;

    s_in  <= d_in;
    d_out <= s_out;
end behavioral;
----------------------------------------------
01-15-2019 04:10 PM
Can you also post the testbench? If so, I will try to duplicate what you see.
01-15-2019 04:19 PM
01-15-2019 05:10 PM
Ok, I do see what you see, but a couple of things I notice...
The signals are not initialized.
Since I just pasted in your code, I didn't complete the design by adding a clock constraint, pin assignments, or addressing the unconstrained_internal_end_point, no_input_delay, and no_output_delay checks, ... If you likewise have not done that, the post-implementation results may not be completely valid.
Hope that helps (got to run for now)
01-15-2019 05:10 PM
In post-implementation functional/timing simulation, the behavioral RTL is mapped to the logic components of the target device. In other words, the flip-flop behavioral description is replaced with a library cell (e.g., FDRE) from the UNISIM library for simulation.
When the UNISIM library is involved, the Global Set/Reset (GSR) signal is automatically asserted for the first 100 ns to simulate the reset that occurs after configuration. Therefore you need to apply stimulus data after 100 ns to account for this default GSR pulse used in functional and timing simulation.
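For illustration, the netlist cell is roughly equivalent to the following UNISIM instantiation (a sketch only: the entity name and tie-offs are illustrative, and the actual post-implementation netlist also contains I/O buffers and different net names). The GSR behavior is built into the simulation model rather than exposed as a port; during the first 100 ns it holds the register at its INIT value, which is why earlier d_in activity is ignored:
library ieee;
use ieee.std_logic_1164.all;
library UNISIM;
use UNISIM.vcomponents.all;
----------------------------------------------
entity dff_netlist_sketch is
    port(
        d_in  : in  std_logic;
        clk   : in  std_logic;
        d_out : out std_logic
    );
end dff_netlist_sketch;
----------------------------------------------
architecture structural of dff_netlist_sketch is
begin
    -- D flip-flop primitive with clock enable and synchronous reset
    FDRE_inst : FDRE
        generic map (
            INIT => '0'    -- value the register takes while GSR is asserted
        )
        port map (
            Q  => d_out,   -- data output
            C  => clk,     -- clock input
            CE => '1',     -- clock enable (tied high)
            D  => d_in,    -- data input
            R  => '0'      -- synchronous reset (tied low)
        );
end structural;
----------------------------------------------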