Controlling a Computer with Your Eyes

Posted: July 19, 2012 at 6:13 pm

Researchers at Imperial College London have developed an affordable technology that could allow millions of people with conditions like Parkinson's disease, muscular dystrophy, or spinal cord injury to interact with computers--using just their eyes. The finding brings new hope to many patients that computing--and the many quality-of-life improvements it brings--could soon be relatively simple and affordable for those who are paralyzed or otherwise disabled.

It's anyone's nightmare--to suffer an injury or be diagnosed with a disease that leads to locked-in syndrome. One mercy of locked-in syndrome, though, is that mobility occasionally remains in one part of the body--the eyes. Famously, Jean-Dominique Bauby, the French author of The Diving Bell and the Butterfly, dictated his memoir solely through eye movements--one letter at a time, with the help of an assistant.

That won't work for everyone, obviously--and neither would the expensive eye-tracking technology of years past. But the Imperial College eye tracker was built from off-the-shelf materials, bringing the cost of the system down to just £40.

"We have built a 3D eye tracking system hundreds of times cheaper than commercial systems and used it to build a real-time brain machine interface that allows patients to interact more smoothly and more quickly than existing invasive technologies that are tens of thousands of times more expensive, Dr. Aldo Faisal, one of the researchers, said of the project. This is frugal innovation; developing smarter software and piggy-backing existing hardware to create devices that can help people worldwide independent of their healthcare circumstances."

The researchers demonstrated how people could play Pong using just eye movements. (The video has an oddly downbeat ending for such a hopeful technology, don't you think?)

So how does it work? The device is made up of two video game console cameras attached to a pair of glasses, just outside the line of vision. The data they capture can be transmitted over Wi-Fi or USB to a Windows or Linux computer. The device pairs with software that helps infer where the eyes are looking. As the video indicates, the cameras can discern where each pupil is pointing; from this, the software infers where on the screen the user is looking. In fact, it allows you to infer even more than that--using a set of detailed calibrations, the researchers can determine not just the direction but the distance of the user's gaze in 3D space. The researchers speculate about novel uses for such technology: for instance, an eye-controlled wheelchair that determines where you want to go simply by watching where you look.
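To make the calibration step concrete, here is a minimal sketch of how pupil coordinates reported by a camera might be mapped to screen coordinates: the user fixates a grid of known targets, and a simple regression is fit between the recorded pupil positions and the target positions. The quadratic model, the function names, and the sample numbers below are illustrative assumptions, not the Imperial College team's published method.

    # A minimal pupil-to-screen calibration sketch (Python/NumPy).
    # All names and numbers here are hypothetical, for illustration only.
    import numpy as np

    def design_matrix(pupil_xy):
        # Quadratic polynomial features of the pupil-center coordinates.
        x, y = pupil_xy[:, 0], pupil_xy[:, 1]
        return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

    def calibrate(pupil_xy, screen_xy):
        # Least-squares fit from recorded pupil positions to the known
        # on-screen positions of the calibration targets.
        A = design_matrix(pupil_xy)
        coeffs, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
        return coeffs  # shape (6, 2): one column per screen axis

    def gaze_point(pupil_xy, coeffs):
        # Map a new pupil measurement to an estimated gaze point on screen.
        return design_matrix(np.atleast_2d(np.asarray(pupil_xy, float))) @ coeffs

    # Usage: the user fixates a 3x3 grid of known targets while we record
    # the pupil center the camera reports for each one (numbers invented).
    targets = np.array([[sx, sy] for sy in (100, 500, 900)
                                 for sx in (100, 800, 1500)], dtype=float)
    pupils  = np.array([[px, py] for py in (-8, 0, 8)
                                 for px in (-10, 0, 10)], dtype=float)
    coeffs = calibrate(pupils, targets)
    print(gaze_point([5.0, 4.0], coeffs))  # estimated (x, y) on the screen

Running the same fit for both eyes and intersecting the two estimated gaze rays (vergence) is presumably what lets a two-camera system recover distance in 3D as well as direction--the step that would benefit most from the detailed calibrations the researchers describe.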

The Imperial College team is not the only one to have tried its hand at this sort of technology, of course; among others, the University of Minnesota has been at it for some time. Back in January, Tobii Technology presented gaze-interaction tech that was actually aimed at consumers. Tobii mentioned some medical applications, but had in mind not patients so much as medical technicians, who could use the tech to rapidly scan through photographs, scans, or X-rays, the LA Times reported.

Here's a case where consumer technology and medical technology are evolving in tandem and influencing each other. Wherever it catches on first, it's good news for everyone--and particularly for the millions of patients for whom this could open up new ways of interacting with the world.
