You can control computers with simple eye movements

A research group in London has developed a device that could enable millions of people suffering from degenerative nerve diseases, as well as amputees, to interact with their computers and surroundings using simple eye movements.


The eye-tracking device and "smart" software, which together cost less than £40, can help patients with multiple sclerosis, Parkinson's disease, muscular dystrophy, or spinal cord injuries interact with a computer freely.

According to the study, published in the Journal of Neural Engineering, the device could even allow people to steer an electric wheelchair simply by looking where they want to go, or to control a robotic prosthetic arm.

Now control computers with simple eye movements

Built from off-the-shelf components, the new device works out exactly where a person is looking by tracking their eye movements, letting them control a cursor on a screen just like an ordinary computer mouse.

Researchers from Imperial College London demonstrated its capabilities by having a group of people play the classic computer game Pong without any kind of handset. Users were also able to browse the web and write emails "hands-off."

Aldo Faisal, a lecturer in neurotechnology at Imperial's Department of Bioengineering, is confident in eye movements as a control input: six subjects who had never used their eyes to control a computer before still registered respectable scores, within 20 per cent of able-bodied users, after just 10 minutes with the device.

The commercially viable device uses just one watt of power and can transmit data wirelessly over Wi-Fi or via USB into any computer.

The GT3D system has also addressed the "Midas touch" problem: when the eyes serve as both viewer and pointer, the system must distinguish a deliberate command from ordinary looking around. It allows users to click on an item on the screen using their eyes instead of a mouse button.
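One common way to tackle the Midas touch problem is dwell-time clicking: a click fires only when the gaze stays inside a small region for longer than a threshold. The article does not describe GT3D's actual selection mechanism, so the sketch below is purely illustrative; the class name, thresholds, and API are assumptions.

```python
import math

class DwellClicker:
    """Illustrative dwell-time click detector (not the paper's mechanism)."""

    def __init__(self, dwell_s=0.8, radius_px=40):
        self.dwell_s = dwell_s      # gaze must stay put this long to click
        self.radius_px = radius_px  # how far the gaze may wander while dwelling
        self.anchor = None          # (x, y) where the current dwell started
        self.start_t = None         # timestamp when the dwell started

    def update(self, x, y, t):
        """Feed one gaze sample; return the (x, y) of a click, or None."""
        if self.anchor is None or math.dist((x, y), self.anchor) > self.radius_px:
            # Gaze moved too far: restart the dwell timer at the new position.
            self.anchor = (x, y)
            self.start_t = t
            return None
        if t - self.start_t >= self.dwell_s:
            click_at = self.anchor
            self.anchor = None      # reset so one dwell yields one click
            self.start_t = None
            return click_at
        return None
```

Feeding the detector gaze samples at, say, 30 Hz, a click is reported once the eye has rested near one spot for 0.8 seconds, and simply looking around never triggers one.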

"Crucially, we have achieved two things: we have built a 3D eye-tracking system hundreds of times cheaper than commercial systems and used it to build a real-time brain-machine interface that allows patients to interact more smoothly and more quickly than existing invasive technologies that are tens of thousands of times more expensive," he said.

“This is frugal innovation. Developing smarter software and piggy-backing existing hardware to create devices that can help people worldwide independent of their health care circumstances.”

The cameras constantly take pictures of the eye and work out where the pupil is pointing; from this, the researchers use a set of calibrations to determine exactly where on the screen the person is looking.
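Such a calibration can be sketched as a least-squares fit: the user looks at a few known dots on the screen, and the system learns a mapping from measured pupil coordinates to screen coordinates. The simple affine model and function names below are assumptions for illustration; the GT3D system fits a richer 3D gaze model.

```python
import numpy as np

def fit_calibration(pupil_xy, screen_xy):
    """Fit a least-squares affine map from pupil coordinates to screen
    coordinates, given matched calibration points (user fixates known dots)."""
    P = np.asarray(pupil_xy, dtype=float)
    S = np.asarray(screen_xy, dtype=float)
    # Design matrix with a bias column: screen = [px, py, 1] @ W
    A = np.column_stack([P, np.ones(len(P))])
    W, *_ = np.linalg.lstsq(A, S, rcond=None)
    return W

def gaze_to_screen(W, px, py):
    """Map one pupil measurement to a point on the screen."""
    return np.array([px, py, 1.0]) @ W
```

With four or more calibration dots the fit is overdetermined, so small measurement noise in the pupil positions is averaged out rather than reproduced exactly.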

