MIT researchers build software to control phones with your eyes
Facebook Messenger bots, artificial-intelligence bots that now settle divorce cases in Europe, Siri, Cortana, virtual reality. Five years from now, there is a chance the conversation around all of these could be summed up in one single question: "Is it still a thing?"
There's a chance these amazing constructs, which once seemed impossible, might go out of fashion.
How about Google's self-driving cars? That's a promising pitch for a revolutionary tech future. But we all know how autopilot systems, including Tesla's, have been involved in the deaths of civilians and in expensive litigation that hurt the companies' credibility.
But a research paper recently presented at an IEEE conference held in Seattle, Washington, may well be the next big thing.
It can be likened to the popular Marvel superhero Iron Man, who is often seen interacting with his A.I., J.A.R.V.I.S., and controlling some of his suit's actions via eye-tracking from inside his helmet.
Cool, isn’t it? Well, we’re a long way from Iron Man’s eye-controlled integrated HUD helmet system.
But, MIT’s new software could take us there.
The research paper, titled "Eye Tracking for Everyone," describes software that aims to help consumers operate their hardware using eye movements, so you can give your fingers a rest and save your screen a lot of swipes.
THE MEN BEHIND THE SCENES
Through the collective efforts of researchers from MIT, the University of Georgia (US) and the Max Planck Institute for Informatics (Germany), software that tracks your eye movements and uses them as input for your hardware is in the works.
Offering his two cents, Aditya Khosla, an Indian and an MIT graduate, said the software currently has an accuracy of about 1 centimetre on mobile phones and about 1.7 centimetres on tablets. That is not a lot of precision; at least not enough to be built into a mobile phone for a seamless experience.
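To get a feel for why roughly 1 centimetre of error is too coarse, here is a minimal sketch (with an invented icon size and a hypothetical tighter error figure, neither taken from the paper) that checks whether a gaze estimate can reliably land inside a tap target of a given width:

```python
def can_hit_target(target_size_cm: float, error_radius_cm: float) -> bool:
    """A gaze target is reliably selectable only if the worst-case
    estimation error still falls within the target's half-width."""
    return error_radius_cm <= target_size_cm / 2

# Assume a typical phone app icon is about 1.2 cm across (illustrative value).
print(can_hit_target(1.2, 1.0))  # 1 cm error: prints False, too coarse
print(can_hit_target(1.2, 0.5))  # hypothetical 0.5 cm error: prints True
```

On this toy model, a 1 cm error radius overshoots a 0.6 cm half-width, which is why tighter accuracy matters before eye control can replace a fingertip.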
Khosla, who is part of the research team, is positive that the accuracy can be improved with more eye-tracking data. To that end, the team is crowdsourcing gaze information from the general public to help the software better understand where a user is looking.
So, if and when the software learns to reliably identify and track your eye movements, it has some obvious uses: hands-free mobile games and screen navigation, for a start.
HOW RESEARCHERS WENT ABOUT IT
At some level, the MIT researchers' latest creation works like any machine-learning system: the software had to be taught how people look at their phones during use, which meant feeding it gigabytes, even terabytes, of gaze information.
Where a person's gaze lands before opening an application; how the eyes shift during a swipe across the screen. A lot of such scenarios had to be factored in: every permutation, every combination.
For this, the researchers created an app called "GazeCapture" that did the heavy lifting of recording all the eye information.
All of this information was then fed to the actual system, "iTracker," which analyses its own set of inputs, such as head movement and the position of your face.
A reported 1,500 test subjects have used the GazeCapture app so far, letting it study and track their eye movements. Should the team get another 9,000 subjects to use the app, iTracker will be trained well enough for use on mobiles and other hardware, according to Khosla.
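The pipeline above can be boiled down to: record examples pairing what the camera sees with where the person was actually looking, then predict gaze for new inputs from those examples. iTracker itself is a learned model trained on GazeCapture data; the toy sketch below, with invented feature vectors and a simple nearest-neighbour lookup standing in for the real model, only illustrates that learn-from-recorded-examples idea:

```python
import math

# Toy training set: (head-pose / face-position feature vector) -> on-screen
# gaze point in cm. All values are invented for illustration.
training_data = [
    ((0.0, 0.0), (2.0, 3.0)),
    ((0.5, 0.1), (4.0, 3.2)),
    ((0.9, 0.4), (6.0, 5.0)),
    ((0.2, 0.8), (2.5, 7.0)),
]

def predict_gaze(features, data):
    """Return the gaze point of the closest recorded example
    (a crude stand-in for iTracker's learned regression)."""
    _, gaze = min(data, key=lambda pair: math.dist(pair[0], features))
    return gaze

print(predict_gaze((0.45, 0.15), training_data))  # prints (4.0, 3.2)
```

The more (feature, gaze) pairs the recorded set contains, the better such a lookup approximates where a new user is looking, which is exactly why the team wants thousands more GazeCapture volunteers.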
A GREAT CONTRIBUTION TO THE FIELD OF MEDICINE
Software such as this can be tweaked and installed in instruments that study patients' eye movements.
A neuro-ophthalmologist could observe anomalies in patients’ eyes and diagnose possible aneurysms and other brain disorders.
It could also find applications in studying glaucoma, schizophrenia and the like.