ARM TechCon: EyeTech eye-tracking demo shows the power, flexibility of localized video processing

by Xilinx Employee, 10-31-2013 01:58 PM (edited 02-06-2014 01:12 PM)

EyeTech Digital Systems has been making eye-tracking systems for research and assistive markets for more than fifteen years. The company’s initial products combined IR LED illumination and an image sensor with PC-based processing to create a gaze-based user interface (UI), which translates the detected direction of a user’s gaze into computer commands. This “Augmentative and Alternative Communication” hardware allows severely disabled users to recapture what they thought they had lost forever: the ability to communicate with others. The downside of the original design was its reliance on the substantial processing power of a PC’s x86 microprocessor to reduce sensor imagery to computer commands through gaze-tracking algorithms.
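The article doesn’t spell out EyeTech’s algorithm, but IR eye trackers of this type commonly use the pupil-center/corneal-reflection (PCCR) technique: the IR LEDs produce a bright glint on the cornea, and the vector from that glint to the detected pupil center maps to a point on the screen after a short per-user calibration. Here is a minimal sketch of that final mapping step; the affine model, constants, and names are illustrative assumptions, not EyeTech’s code.

#include <cstdio>

// Illustrative pupil-center/corneal-reflection (PCCR) gaze estimation --
// a common approach in IR eye trackers, not necessarily EyeTech's own.
struct Point { double x, y; };

// Hypothetical per-user calibration constants that map the pupil-glint
// vector (in sensor pixels) to screen coordinates (in screen pixels).
struct Calibration { double ax, bx, ay, by; };

// Map the vector from the corneal glint to the pupil center onto a
// screen position using a simple affine model.
Point estimate_gaze(Point pupil, Point glint, const Calibration& c) {
    double dx = pupil.x - glint.x;   // horizontal pupil-glint offset
    double dy = pupil.y - glint.y;   // vertical pupil-glint offset
    return { c.ax * dx + c.bx,       // screen x
             c.ay * dy + c.by };     // screen y
}

int main() {
    Calibration cal{ 120.0, 960.0, 110.0, 540.0 };  // made-up values
    Point gaze = estimate_gaze({330.0, 242.0}, {325.0, 240.0}, cal);
    std::printf("gaze at (%.0f, %.0f)\n", gaze.x, gaze.y);
}

Real systems fit the calibration constants by having the user look at a few known screen targets; the hard part is the upstream image processing that finds the pupil and glint reliably, which is exactly where dedicated hardware helps.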

 

In the video interview below, EyeTech’s President and CEO Robert Chappell explains that the unique combination of a dual-core ARM Cortex-A9 processor and programmable logic in the Xilinx Zynq All Programmable SoC allowed his company to take the gaze-based interface to the next level: no PC is needed at all. That jump in capability is what takes the technology out of the PC’s realm and into tablet-based and embedded applications.

 

At first, Chappell explains, EyeTech ported its existing gaze-based UI code directly to one of the ARM Cortex-A9 processors on the Zynq SoC. The ported code ran well and quickly yielded a working product. EyeTech then went a step further and implemented newer, more powerful processing algorithms (low-level pixel processing, filtering, and convolution) in the Zynq SoC’s programmable logic. These new algorithms improved the product’s gaze-detection resolution and accuracy to a fraction of a degree, opening additional markets for the EyeTech gaze-based UI, including automotive systems and kiosks. The EyeTech experience is a perfect example of a product evolving from the existing computer-vision sphere into the realm of Smarter Vision.
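The interview doesn’t include EyeTech’s filter code, but the kind of low-level pixel operation that benefits from a move into programmable logic looks roughly like the 3x3 convolution sketched below. On the Zynq SoC this would typically be written in HLS-style C/C++ so a high-level synthesis tool can unroll and pipeline the loops into hardware; everything here (image size, kernel values, function names) is an illustrative assumption, not EyeTech’s implementation.

#include <cstdint>

// Illustrative 3x3 convolution over an 8-bit grayscale frame -- the
// style of low-level pixel processing that can be moved from the ARM
// cores into the Zynq SoC's programmable logic. Borders are skipped.
constexpr int W = 320, H = 240;   // assumed frame size for the sketch

void convolve3x3(const uint8_t in[H][W], uint8_t out[H][W],
                 const int8_t k[3][3], int shift) {
    for (int y = 1; y < H - 1; ++y) {
        for (int x = 1; x < W - 1; ++x) {
// #pragma HLS PIPELINE II=1  // in an HLS flow, pipeline to one pixel/cycle
            int acc = 0;
            for (int i = -1; i <= 1; ++i)       // fully unrolled in hardware
                for (int j = -1; j <= 1; ++j)
                    acc += k[i + 1][j + 1] * in[y + i][x + j];
            acc >>= shift;                      // renormalize fixed-point sum
            // clamp to the 8-bit output range
            out[y][x] = acc < 0 ? 0 : (acc > 255 ? 255 : (uint8_t)acc);
        }
    }
}

int main() {
    static uint8_t in[H][W], out[H][W];         // zero-initialized frames
    in[H / 2][W / 2] = 255;                     // single bright pixel
    const int8_t blur[3][3] = {{1, 2, 1}, {2, 4, 2}, {1, 2, 1}};
    convolve3x3(in, out, blur, 4);              // divide by 16 via shift
    return out[H / 2][W / 2] == 63 ? 0 : 1;     // center: 255*4/16 = 63
}

In the programmable-logic version, the nine multiply-accumulates happen in parallel every clock cycle, which is exactly the kind of per-pixel arithmetic that bogs down a sequential processor.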

 

The resulting product fits inside a slim bar located below the display being controlled. Inside the bar are the IR illumination LEDs, the image sensor, and the Zynq All Programmable SoC. Here is a photo of the EyeTech sensor bar:

[Image: EyeTech sensor bar]

The image sensor with lens is in the center of the bar and the IR illumination LEDs are at the left and right ends. The Zynq SoC is located just to the left of the sensor lens and is attached to the back side of the board. Here’s a photo of the back side of the board:

[Image: EyeTech board with the Zynq All Programmable SoC]

Finally, here’s a video interview with Robert Chappell made by ARM during this week’s ARM TechCon that explains the product’s development history:

 

 

http://www.youtube.com/watch?v=rdVuOHpcCuY

 

For Part 2, click here.

About the Author

Steve Leibson is the Director of Strategic Marketing and Business Planning at Xilinx. He started as a system design engineer at HP in the early days of desktop computing, then switched to EDA at Cadnetix, and subsequently became a technical editor for EDN Magazine. He has served as Editor in Chief of EDN Magazine, Embedded Developers Journal, and Microprocessor Report. He has extensive experience in computing, microprocessors, microcontrollers, embedded systems design, design IP, EDA, and programmable logic.