
Gaze tracking makes jump from assistive technology niche to mainstream with the help of a Zynq SoC

by Xilinx Employee, 05-18-2015 02:59 PM (edited 05-26-2015 01:43 PM)

Applications that help people with special needs are a special pleasure to blog about, and this post is all about using technology to help people overcome tremendous challenges. The technology is gaze tracking, as embodied in EyeTech Digital Systems’ AEye eye tracker. The technology performs a seemingly simple task: figuring out where someone is looking. The measurement techniques have been known since 1901; a practical implementation, however, has taken more than 100 years of development, and EyeTech has been at the forefront of this work for almost two decades. Here’s the current gaze-tracking process flow used by EyeTech:

[Figure: EyeTech gaze-tracking process flow]

Originally, EyeTech used commercial analog video cameras and PCs to create a “Windows mouse” that could be controlled with nothing more than eye positioning. EyeTech’s eye-tracking technology determines gaze direction from the pupil position and from reflections of 850 nm IR light off the human cornea. The original market for this technology was assistive technology for disabled users who needed it to interact more fully with the world at large. These disabilities arise from numerous causes including ALS, cerebral palsy, muscular dystrophy, spinal cord injuries, traumatic brain injuries, and stroke. Eye-tracking technology makes a large, qualitative difference in the lives of people affected by these challenges. There’s an entire page full of video testimonials to the transformative power of this technology on the EyeTech Web site.
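For readers curious about the basic idea, here’s a minimal C++ sketch of a pupil-center/corneal-reflection (PCCR) style gaze estimate: the vector from the IR glint on the cornea to the pupil center is mapped to a screen coordinate through a calibration function. Everything here (the point type, the linear mapping, and the coefficients) is a hypothetical stand-in for illustration, not EyeTech’s algorithm.

    #include <cstdio>

    // Hypothetical 2-D point in image (pixel) coordinates.
    struct Point { double x, y; };

    // Simplified PCCR mapping: the pupil-to-glint vector is converted to a screen
    // position with a per-user calibration. A real tracker fits these coefficients
    // during a calibration routine; the linear form here is only illustrative.
    Point estimate_gaze(Point pupil, Point glint, const double ax[3], const double ay[3]) {
        const double dx = pupil.x - glint.x;  // pupil-glint vector, x component
        const double dy = pupil.y - glint.y;  // pupil-glint vector, y component
        return { ax[0] + ax[1] * dx + ax[2] * dy,
                 ay[0] + ay[1] * dx + ay[2] * dy };
    }

    int main() {
        const double ax[3] = {640.0, 40.0, 0.0};  // made-up calibration coefficients
        const double ay[3] = {360.0, 0.0, 40.0};
        const Point gaze = estimate_gaze({312.0, 248.0}, {305.0, 243.0}, ax, ay);
        std::printf("estimated gaze point: (%.1f, %.1f) px\n", gaze.x, gaze.y);
        return 0;
    }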

 

However important the assistive-technology market is, it’s relatively small, and Robert Chappell, EyeTech’s founder, realized that the technology could have far more utility for a much larger user base if he could reduce its implementation cost, size, and power consumption. Here were Chappell’s goals:

 

  • Stand-alone operation (no PC needed)
  • “Compact” size
  • Low power (< 5W)
  • Low cost (< $200)
  • Superior eye-tracking capability
  • Multi-OS support
  • Field upgradeable
  • Reasonable development time and costs

 

These are not huge hurdles for typical embedded systems, but when your algorithms require a PC to handle the processing load, these goals present significant design challenges for an embedded version. No doubt Chappell and his team would have used a microcontroller if they could have found a suitable device with sufficient processing horsepower, but with the existing code running on PC-class x86 processors, shrinking the task into one device was not easy.

 

Chappell learned about the Xilinx Zynq SoC at exactly the right time and it seemed like exactly the right type of device for his project. The Zynq SoC’s on-chip, dual-core ARM Cortex-A9 MPCore processors could run the existing PC-based code with a recompilation and deliver an operable system. Then, Chappell’s team could gradually move the most performance-hungry tasks to the Zynq SoC’s on-chip PL (programmable logic) to accelerate sections of the code. Porting the code took two years and the team size varied from two to four engineers working part time on the project.
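As a rough illustration of the kind of per-pixel work that benefits from moving into the PL, here’s a simple dark-pupil thresholding loop in C++. The function name and threshold are hypothetical, and the code is a sketch rather than EyeTech’s implementation; it compiles as ordinary C++, and the HLS pipeline pragma (which a standard compiler simply ignores) hints at how a high-level synthesis tool could turn the same loop into pipelined programmable-logic hardware.

    #include <cstdint>
    #include <cstddef>

    // Flag dark pixels as pupil candidates in an 850 nm IR image (dark-pupil setup).
    // This is the sort of regular, per-pixel loop that maps well onto the PL.
    void threshold_pupil(const uint8_t *in, uint8_t *out, size_t n_pixels, uint8_t thresh) {
        for (size_t i = 0; i < n_pixels; ++i) {
            #pragma HLS PIPELINE II=1
            out[i] = (in[i] < thresh) ? 255 : 0;  // 255 marks a pupil-candidate pixel
        }
    }

    int main() {
        uint8_t in[8] = {12, 200, 30, 45, 250, 10, 90, 5};
        uint8_t out[8];
        threshold_pupil(in, out, 8, 60);  // pixels darker than 60 become candidates
        return 0;
    }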

 

Ultimately, the project resulted in a product that can track a gaze at frame rates ranging from 40 to more than 200 frames/sec. Many gaze-tracking applications can use the slower frame rates, but certain applications, such as testing for brain injuries, require the faster frame rates for an accurate result.
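A quick, purely illustrative calculation shows why the high end of that range is demanding: at 200 frames/sec the entire capture-and-track pipeline has only 5 ms per frame.

    #include <cstdio>

    int main() {
        // Per-frame processing budget at representative tracker frame rates.
        const double rates_fps[] = {40.0, 60.0, 120.0, 200.0};
        for (double fps : rates_fps) {
            std::printf("%6.0f fps -> %5.2f ms per frame\n", fps, 1000.0 / fps);
        }
        return 0;
    }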

 

Here’s a photo of the resulting AEye PC board:

[Photo: EyeTech AEye module with Zynq Z-7020]

This is a fairly small board! The Zynq Z-7020 SoC measures 17 mm on a side, and the board is only slightly taller than the Zynq SoC package. Note the US dime shown on the right of the above image for a size comparison. Here’s a hardware block diagram of the AEye board:

[Figure: EyeTech AEye module block diagram]

And here’s how EyeTech has apportioned the tasks between the Zynq SoC’s PS (processor system) and PL:

[Figure: EyeTech gaze-tracking task allocation between the PS and PL]

Chappell notes that the availability of a high-performance PS and PL in the Zynq SoC made for an ideal rapid-development environment because the boundary between hardware and software is not rigid. The ability to move tasks from the PS to the PL is what permitted the design team to achieve frame rates of better than 200 fps.
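One way to picture that flexible hardware/software boundary, sketched here with hypothetical names rather than EyeTech’s actual interface: each pipeline stage is called through a common signature, so a stage can start life as PS software and later be swapped for a PL accelerator without disturbing the rest of the code.

    #include <cstdint>
    #include <cstddef>

    // Common signature for a pipeline stage, whether it runs on the PS or the PL.
    using StageFn = void (*)(const uint8_t *in, uint8_t *out, size_t n);

    // Software reference version running on the ARM cores (PS).
    void find_glints_sw(const uint8_t *in, uint8_t *out, size_t n) {
        for (size_t i = 0; i < n; ++i)
            out[i] = (in[i] > 240) ? 255 : 0;  // bright corneal reflections
    }

    // Stub for the PL-accelerated version: a real design would hand the frame to a
    // hardware block (for example via DMA) and wait for completion. Hypothetical.
    void find_glints_pl(const uint8_t *in, uint8_t *out, size_t n) {
        find_glints_sw(in, out, n);  // placeholder so the sketch stays runnable
    }

    int main() {
        uint8_t frame[64] = {0};
        uint8_t glint_mask[64];
        StageFn find_glints = find_glints_pl;  // or find_glints_sw to stay in software
        find_glints(frame, glint_mask, sizeof frame);
        return 0;
    }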

 

How mainstream could this gaze-tracking technology get? How about one such eye tracker per car to help fight driver fatigue; chemically induced inattention; and distraction from cell phones, tablets, and the like? If proven effective, insurance companies may soon be lobbying for this feature to be made mandatory in new cars. Science fiction? Just watch this video from Channel 3 TV news in Mesa, AZ.

 

 

(Note: This blog is a summary of a presentation made by Robert Chappell and Dan Isaacs of Xilinx at last week’s Embedded Vision Summit 2015, held in Santa Clara, CA.)

 

About the Author
Be sure to join the Xilinx LinkedIn group to get an update for every new Xcell Daily post!
Steve Leibson is the Director of Strategic Marketing and Business Planning at Xilinx. He started as a system design engineer at HP in the early days of desktop computing, then switched to EDA at Cadnetix, and subsequently became a technical editor for EDN Magazine. He has served as Editor in Chief of EDN Magazine, Embedded Developers Journal, and Microprocessor Report. He has extensive experience in computing, microprocessors, microcontrollers, embedded systems design, design IP, EDA, and programmable logic.