Eye-tracking technology—which determines where in a visual scene people are directing their gaze—is widely used in psychology and marketing research but requires pricey hardware that has kept it from finding consumer applications.
New software from MIT and the University of Georgia, however, promises to turn any smartphone into an eye-tracking device.
“The field is kind of stuck in this chicken-and-egg loop,” says Aditya Khosla, the electrical engineering and computer science grad student who led the software’s development. “Since few people have the external devices, there’s no big incentive to develop applications for them. Since there are no applications, there’s no incentive for people to buy the devices.”
The researchers built their eye tracker using machine learning, a technique in which computers learn to perform tasks by identifying patterns in large sets of training examples.
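To make that training setup concrete, here is a minimal sketch, assuming a small convolutional network written in PyTorch: the model sees camera images of a face paired with known on-screen dot positions and adjusts its weights to predict the gaze point. The architecture, image size, and coordinate units here are illustrative assumptions, not the researchers' actual design.

```python
# A minimal sketch (not the researchers' actual model): a small convolutional
# network that learns to map a face image to an (x, y) gaze point on the
# screen, trained by example as in any supervised machine-learning setup.
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, 2),           # predicted (x, y) gaze location, in cm
        )

    def forward(self, face):             # face: batch of RGB face crops
        return self.regressor(self.features(face))

# One training step: show the network face images paired with the known
# on-screen dot positions and nudge its weights to reduce the prediction error.
model = GazeNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

faces = torch.randn(8, 3, 224, 224)      # placeholder batch of camera frames
dots = torch.randn(8, 2)                 # where the dot actually appeared
loss = loss_fn(model(faces), dots)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```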
To collect their training data, they developed a simple mobile application that flashes a small dot somewhere on the device’s screen, attracting the user’s attention, then briefly replaces it with either an “R” or an “L.” Tapping the corresponding side of the screen, right or left, confirms that the user has actually shifted his or her gaze to the dot’s location. All the while, the device’s camera captures images of the user’s face.
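The trial logic of such a collection app might look roughly like the sketch below; the `capture_frame` and `get_tap` callbacks and the `Trial` record are hypothetical stand-ins for the real app's camera and touch handling.

```python
# A sketch of the data-collection logic described above (not the researchers'
# actual app): each trial pairs a captured face frame with the known dot
# position, but is kept only if the user's tap matches the letter shown,
# confirming the gaze really was on the dot.
import random
from dataclasses import dataclass

@dataclass
class Trial:
    frame: bytes           # camera frame captured while the dot was on screen
    dot_xy: tuple          # where the dot appeared on screen (the label)

def run_trial(capture_frame, get_tap, screen_w, screen_h):
    """One calibration trial; returns a labeled Trial, or None if it fails."""
    dot_xy = (random.uniform(0, screen_w), random.uniform(0, screen_h))
    # ...display the dot at dot_xy, then briefly replace it with a letter...
    letter = random.choice(["R", "L"])
    frame = capture_frame()            # face image grabbed by the front camera
    tap_side = get_tap()               # "R" or "L", whichever side was tapped
    if tap_side == letter:             # correct tap: the user was looking there
        return Trial(frame=frame, dot_xy=dot_xy)
    return None                        # wrong tap: discard the trial
```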
Initial experiments, using training data drawn from 800 mobile-device users, got the system’s margin of error down to 1.5 centimeters; data on another 700 people reduced it to about a centimeter. Khosla estimates that training examples from 10,000 users will lower it to a half-centimeter, which should make the system commercially viable.
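The “margin of error” here is essentially the average distance, in centimeters, between where the model says the user is looking and where the calibration dot actually was. A toy illustration of that measurement, with made-up numbers:

```python
# Sketch of how a gaze error figure can be computed: the mean Euclidean
# distance, in centimeters, between predicted and true gaze points.
import numpy as np

predicted_cm = np.array([[1.0, 2.0], [4.5, 3.0], [0.5, 6.0]])  # model output
actual_cm = np.array([[1.5, 2.5], [4.0, 2.0], [1.0, 7.0]])     # true dot spots

errors = np.linalg.norm(predicted_cm - actual_cm, axis=1)      # per-sample error
print(f"mean gaze error: {errors.mean():.2f} cm")
```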