helmutforren (Scholar)
Registered: 06-23-2014

Cannot support older hw_server version 2017.1... vs 2018.1 at the same time?

I'm on Windows 10.  I need to run BOTH Vivado 2017.1 and 2018.1 at the same time, with a project or two open in one or two 2017.1 windows, and a project or two open in one or two 2018.1 windows. 

 

It has worked fine in the past to have multiple 2017.1 windows/projects open simultaneously, with for example two projects building bitstreams, a third project running a simulation, and a fourth project in my foreground as I edit RTL.

 

I installed Vivado 2018.1 recently, and for a while I was running multiple copies just fine, as described.  This included a single Vivado 2018.1 window that was running the Hello World example program.

 

The interesting thing is that I was able to go back and forth as follows.  A 2017.1 bitstream generation would finish, and I'd connect to the hardware manager and program my one and only KC705 dev board's FPGA.  Then I'd start up a new 2017.1 bitstream generation and focus my attention on the 2018.1 Hello World project.  After the Hello World bitstream was generated, I was able to successfully program it to the FPGA and run it instead.  I did indeed get a pop-up error window saying "[Labtools 27-3366] Cannot support older hw_server version 2017.1".  But the 2018.1 hardware manager continued NEVERTHELESS.

 

Today, however, after creating a new fresh 2018.1 project, I get that same error and can NOT move forward.  The previously and still open 2018.1 Hello World project *ALSO* can NOT move forward after getting the error.

 

So maybe I had some funny in-between state where 2018.1 complained but then continued forward with the 2017.1 hw_server.

 

SO TO THE POINT: It seems like the 2018.1 hw_server might not be installed.  If I get it installed, am I going to be able to program my 2017.1 bitstreams from 2017.1?  At a minimum, I need to have both a 2017.1 and 2018.1 loaded continuously (because it costs far too much time to exit/reload repeatedly).  Then, at separate times, I need to use one hardware manager or the other, 2017.1 or 2018.1, even if I must do a little dance when I switch.  Perhaps my question is, "what dance do I need to dance?"  That is, can I *manually* close and load a hw_server?  (Note this is all on my local machine.)
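One way this dance might be done manually is from the Vivado Tcl console. This is only a sketch: the command names below are from the standard Hardware Manager Tcl API of the 2017.x/2018.x releases, and the default hw_server port of 3121 is an assumption about this particular setup, not something verified here.

```tcl
# From the Tcl console of the Vivado version whose hw_server you want running.
# The Hardware Manager talks to a standalone hw_server process (default port 3121);
# connect_hw_server launches a matching hw_server if none is already running.

open_hw                                 ;# open the Hardware Manager (2017.x/2018.x command name)
connect_hw_server -url localhost:3121   ;# start/attach to hw_server on the local machine
open_hw_target                          ;# auto-connect to the JTAG target

# To release the server so a different Vivado version can start its own:
close_hw_target
disconnect_hw_server [current_hw_server]
```

If the thread's observations hold, running the connect sequence from the newest Vivado first should leave the newest hw_server in charge, which older Vivado versions can then share.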


Accepted Solutions
helmutforren (Scholar)
Registered: 06-23-2014

Re: Cannot support older hw_server version 2017.1... vs 2018.1 at the same time?

SOLUTION: The first time you try to auto-connect to the target, that version of Vivado will launch the corresponding version of hw_server.  So simply be sure to do this from 2018.1 (or your latest version) first.  Then the older 2017.1 (or your prior version) should work with the newer hw_server.  If you have multiple copies of multiple versions of Vivado already running and you hit the version error, just make sure to close the hardware manager in all running copies.  Then wait a minute or so.  The running hw_server will stop and go away, allowing you to restart it from the desired 2018.1 (or your latest) version.

 

No need to change to ethernet JTAG or Xilinx Platform Cable USB II.

 

Below is what I wrote before figuring this out...

--------------------------------------------------------------------

 

more...

 

I just went to test the Xilinx Platform Cable USB II.  Before that, I thought to do a baseline.  Now I have both 2017.1 and 2018.1 working again!

 

How?  Maybe it's because I CLOSED the hardware manager in all four running copies of Vivado (at the moment 2 x 2017.1 and 2 x 2018.1).  Then, when I opened the hardware manager from 2018.1 and tried to connect to the target, I got a Windows pop-up asking whether I wanted to allow so-and-so (sorry, I didn't write it down, but it was hardware something) to do things.  I said "yes".  (I erroneously programmed the FPGA from Vivado, then loaded SDK and properly programmed and ran Hello World in debug mode.  This proper way ALSO worked.)

 

Just now, the second 2018.1 connected to target successfully.  This is a NON-IP-Integrator, Non-Block-Design project.

 

Subsequently, I re-opened the hardware manager on a 2017.1 running.  Now it ALSO works.

 

So now I am back to ALL working again.

 

Is it perhaps the case that hw_server gets loaded by the first Vivado to connect, such that I need to make sure that the first Vivado to load it is 2018.1?  That is, in my test just now, did I cause hw_server to exit when I closed all hardware managers?  Then, when I tried to open the hardware manager and target from 2018.1, did it this time load the NEWER 2018.1 hw_server, which supports both 2017.1 and 2018.1?

 

Is there a way to tell WHICH hw_server is currently running?  Perhaps so.  Hardware tab shows "localhost", with Hardware Server Properties.  Ah ha!  It says VERSION is 2018.1.  I see the SAME from BOTH 2017.1 and 2018.1.  So it *is* as I described in the prior paragraph.
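The same check can presumably be done from the Tcl console rather than the GUI. A sketch only: the `VERSION` property name is an assumption based on the "Version" field shown in the Hardware Server Properties pane, and I have not confirmed the exact Tcl property name.

```tcl
# Query the version reported by the currently running hw_server.
# Assumes a server is already up on the default port.
connect_hw_server -url localhost:3121
get_property VERSION [current_hw_server]
```

Run from each open Vivado, this should show whether every copy is talking to the same hw_server instance, as observed above.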

 

I'll close all hardware managers again and reopen from 2017.1.  Nope, still server version 2018.1.  Maybe I need to wait longer.  YEP.  From 2017.1, reopen the hardware manager and auto-connect to target... now server version 2017.1.  Try 2018.1: auto-connect error, AS EXPECTED.  Close all, wait roughly a minute.  From 2018.1, reopen the hardware manager and auto-connect to target... no error this time; server version is 2018.1, AS EXPECTED.  Go back to the 2017.1 project, reopen, auto-connect: version 2018.1.  Works.  All good.

 

 

3 Replies
jmcclusk (Scholar)
Registered: 02-24-2014

Re: Cannot support older hw_server version 2017.1... vs 2018.1 at the same time?

This may sound stupid, but if you buy one of those $500 Ethernet JTAG cables, the hw_server problem on your PC goes away.  Either version of Vivado should be able to connect to the remote JTAG over Ethernet.

Don't forget to close a thread when possible by accepting a post as a solution.
helmutforren (Scholar)
Registered: 06-23-2014

Re: Cannot support older hw_server version 2017.1... vs 2018.1 at the same time?

I've been using the KC705's built-in micro-usb connector, behind which I see a small Digilent module (board). 

 

I already have a Xilinx Platform Cable USB II.  Doc UG810 identifies J60 as the place on the KC705 to use.

 

@jmcclusk, do you know if the Xilinx Platform Cable USB II would solve my problem equally well?

 

If not, I need to pursue software solutions before requisitioning another JTAG device...
