Although the intricacies of human psychology may never be fully explained, Internet companies seem to have some parts of it figured out. By tracking millions of users, Google, Facebook, and the gaming company Zynga have learned how to position every “I agree” button, text box, and virtual cow to entice people to click.
A company called Knewton, in New York City, is now trying to use similar techniques in service of an arguably more laudable goal—helping students learn faster.
The startup, founded in 2008, builds courses such as SAT preparation and remedial math, aimed mostly at people about to start or return to college; schools including Arizona State University deliver them online. Last November, Knewton signed a deal to put its technology into digital classes produced by the educational giant Pearson.
“When a student takes a course powered by Knewton, we are continuously evaluating their performance, what others have done with that material before, and what [they] know,” says David Kuntz, VP of research at Knewton. He is a veteran of the education business who pioneered the introduction of computer algorithms to the design of standardized tests, like the LSAT.
Knewton calls its approach “adaptive learning,” and tracking which questions a student gets right or wrong is just the starting point. Knewton, which has raised $54 million in funding, says its software also monitors how long students take to answer a question and whether they revisit it, and even draws clues from a student’s mouse movements. “We know if they are waving their mouse around trying to decide between option A and C,” says Kuntz.
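The interaction signals described here (time on a question, repeat visits, mouse hovering between answer choices) could be captured as simple per-question features. The sketch below is purely illustrative, with names and structure of my own invention, not Knewton's actual schema or algorithms:

```python
from dataclasses import dataclass, field

@dataclass
class QuestionTelemetry:
    """Hypothetical per-question interaction record of the kind
    the article describes; not Knewton's real data model."""
    time_spent: float = 0.0   # total seconds on the question
    visits: int = 0           # how many times the student opened it
    hovered_options: list = field(default_factory=list)  # answer choices the mouse lingered over

    def record_visit(self, seconds: float) -> None:
        self.visits += 1
        self.time_spent += seconds

    def record_hover(self, option: str) -> None:
        self.hovered_options.append(option)

    def hesitation(self) -> set:
        """Options the student wavered between (hovered more than once)."""
        return {o for o in self.hovered_options
                if self.hovered_options.count(o) > 1}

# Example: a student revisits a question, waving the mouse between A and C.
q = QuestionTelemetry()
q.record_visit(42.0)
q.record_hover("A"); q.record_hover("C")
q.record_hover("A"); q.record_hover("C")
q.record_visit(15.0)
print(q.visits, q.time_spent, sorted(q.hesitation()))  # 2 57.0 ['A', 'C']
```

Features like these, aggregated over millions of students, are the raw material an adaptive system would feed into its models.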
It’s an approach that, in principle, could make online courses much more powerful. Google and Amazon transformed the Web from a collection of electronic documents into an efficient economic powerhouse by tracking user activity and tweaking design to make pages more useful and consumers more likely to click ads or buy. Promoters of adaptive learning say similar analytic techniques can be applied to teaching.
Knewton’s software uses the data it gathers to try to guide each student through material in the sequence that’s most likely to make it stick. Different students can also be taught the same material in different ways, depending on their previous reactions. When teaching linear equations, Kuntz says, a student might be shown the equation “ax + b = c” and then learn that it can be plotted as a line. For others, it might help to introduce it as a geometric problem first. A student who previously responded well to information expressed visually would see graphs first, not equations. Knewton even uses a model of how fast students forget things to decide when a brush-up lesson is needed.
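A forgetting model of the kind mentioned above is often sketched as an Ebbinghaus-style exponential decay: predicted recall falls off with time since study, and a brush-up is scheduled once it dips below a threshold. The parameters and formula here are a textbook illustration, not Knewton's published model:

```python
import math

def recall_probability(elapsed_days: float, stability_days: float) -> float:
    """Exponential forgetting curve: estimated chance the student
    still remembers material `elapsed_days` after studying it.
    `stability_days` is a hypothetical per-student, per-concept parameter."""
    return math.exp(-elapsed_days / stability_days)

def days_until_review(stability_days: float, threshold: float = 0.7) -> float:
    """Schedule a brush-up lesson when predicted recall would drop to
    `threshold`: solve exp(-t / s) = threshold for t."""
    return -stability_days * math.log(threshold)

# A concept with 5-day stability should be reviewed after about 1.78 days
# if we want predicted recall to stay above 70 percent.
print(round(days_until_review(5.0), 2))  # 1.78
```

Under a model like this, students who respond well to a topic would be assigned a larger stability value, pushing their reviews further apart, while shakier topics come back sooner.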
There is still little evidence to say whether Knewton’s technology actually teaches better. The company cites impressive results from Arizona State, where its math course was introduced in fall 2011: pass rates, for example, have increased by 11 percent. However, Knewton’s technology has not yet been tested in a controlled experiment comparing groups of students who do and do not use the software.
Such validation is important, says Ken Koedinger, a professor at Carnegie Mellon University and director of the Pittsburgh Science of Learning Center, because Knewton has set itself a much harder challenge than any online advertising company. “People say Google does this and Zynga does this, but they are optimizing for a very local thing: for you to come back to the Web page,” he says. For Knewton, he says, “there’s a real danger of optimizing for the wrong thing.” For instance, if students’ lessons are optimized for quicker progress through the course, that might end up just yielding easier courses, not more learning.
Some of Koedinger’s own research was used by Carnegie Learning, a company independent of Carnegie Mellon that provides something of a cautionary tale for Knewton. In 2010, a review by the U.S. Department of Education concluded that despite some positive studies, Carnegie Learning’s “cognitive tutors” had “no discernible effects on mathematics achievement for high school students.”
Knewton is a relatively young company, and controlled trials take time and money. But Richard Clark, a professor of educational psychology and technology at the University of Southern California, says the company’s vagueness about its methods is troubling. “Knewton claims to adjust instruction for each student but does not share with anyone the evidence (if any) that they use to ground the individual adaptations,” he says. “If they are doing solid work, why not publish or at least point to the peer-reviewed studies that are the basis for their approach?”
Even so, neither Koedinger nor Clark disputes the idea that analyzing detailed data on students’ actions could shed valuable light on what makes teaching effective, helping to determine which of dozens of competing educational theories work best. Because most teaching occurs in classrooms, Koedinger says, researchers simply haven’t been able to measure it. “We know that this particular teacher has good [test] scores,” he says, “but we don’t know how they’re doing it.”