Registered: ‎02-03-2011

Error:Xst:2035:Port <fpga_0_clk_1_sys_clk_pin> has illegal connections. This port is connected to an input buffer and other components.



I am trying to use a custom IP block interfaced with MicroBlaze. I haven't made any connections except that fpga_0_clk_1_sys_clk_pin, which is the system clock pin, is fed to the clock input (clk_S) of the clock generator. I am new to this, especially with XPS 13.1. It seems that XST adds I/O buffers to the external pins, and in this case that is not needed. So,


1. How do I disable the insertion of I/O buffers in XPS 13.1? (Earlier forum threads mention disabling it in ISE, but with XPS 13.1, ISE isn't invoked when working within XPS.)


2. Do I need to remove the clock generator? Can I feed the system clock to the MicroBlaze, the custom IP, and the other peripherals that use the PLB slave interface?
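For what it's worth, one way people suppress XST's automatic input buffer on a port is the BUFFER_TYPE synthesis attribute. A minimal VHDL sketch, assuming a hypothetical top-level port named sys_clk_pin (your actual port name will differ):

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity top is
  port (
    sys_clk_pin : in std_logic
    -- other ports omitted
  );
end top;

architecture rtl of top is
  -- Tell XST not to infer an input buffer on this port.
  attribute buffer_type : string;
  attribute buffer_type of sys_clk_pin : signal is "none";
begin
  -- user logic here
end rtl;
```

This is a sketch only; whether it applies depends on where the conflicting buffer is actually being inserted in your XPS project.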


I need to know how this is supposed to be done when configuring a system with a custom IP block. Is there better documentation available than edk_ctt? If so, please let me know.


Thanks in advance.



Registered: ‎02-03-2011

BTW, the board I am trying to interface with is a Digilent Nexys2, and I am using the Windows version of the 13.1 Design Suite.

Registered: ‎09-27-2011

Ok. I had the same error message. I was searching for the solution and found your post. I saw there wasn't a solution posted yet, so I made a note to come back and post one once I found it. I don't know if our situations match perfectly, but perhaps you can get some benefit from what I learned anyway.


If you have a clock signal coming in and it is routed over a global clock buffer, then everything that uses that clock must be after the clock buffer. It's kind of hard to go over the global clock buffer and not go over it at the same time. The Clocking Wizard automatically plops a global clock buffer into the generated core if you have it selected in the wizard, which I believe is the default. Once that happens, you can't use the incoming clock for anything else besides feeding the clock IP.


I used the "Clocking Wizard" to create the clock IP. On the first page, where you specify the frequency of the incoming clock, at the end of that line is a drop-down box in a column called Source. You want to pop that open and select "No buffer".


This will take the global buffer out of the core. It doesn't mean you can't use a buffer; it just means you get to declare it yourself and thus put it before both the clocking block (PLL and/or DCM) and any other logic that wants to use the input clock as well.
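The "declare it yourself" part can be sketched in VHDL like this: instantiate a UNISIM IBUFG on the pad clock, then fan its output out to the DCM/PLL and to whatever else needs the raw clock. The entity and signal names here are placeholders for illustration:

```vhdl
library ieee;
use ieee.std_logic_1164.all;
library unisim;
use unisim.vcomponents.all;

entity clk_top is
  port (
    clk_pad : in  std_logic  -- raw clock from the board pin
    -- other ports omitted
  );
end clk_top;

architecture rtl of clk_top is
  signal clk_buffered : std_logic;
begin
  -- Explicit input clock buffer: one buffer, declared once, placed
  -- before everything that consumes the clock.
  u_ibufg : IBUFG
    port map (
      I => clk_pad,
      O => clk_buffered
    );

  -- clk_buffered now feeds both the clock-generator core (with its
  -- internal buffer disabled via "No buffer") and any other logic.
end rtl;
```

With the wizard's internal buffer removed, there is only one buffer on the path, which avoids the "connected to an input buffer and other components" conflict.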


There may be technical errors in my description, but it got me going, and perhaps it will help you and anyone else googling that error message too. :)