Tuesday, July 29, 2008 | A recorded physics lecture plays on a computer screen as Jacob Whitehill’s face erupts into a wide grin. Immediately, the onscreen professor, who is scribbling formulas on a whiteboard, springs into high speed, his voice squeaky and high-pitched, as if Whitehill had pressed fast forward on a remote control. As the smile subsides and Whitehill’s face relaxes, the bearded professor slows with him, his movements and speech returning to a normal pace and pitch, as if Whitehill had pressed play.

But Whitehill isn’t wielding a remote control. Instead, he has a web camera pointed at his face.

A third-year computer science graduate student, Whitehill is working with colleagues to build a new generation of robots that would be effective, responsive teachers. They believe the key is to train the machines to recognize and respond to facial expressions, the way humans do naturally. Whitehill described the demonstration, part of his research at the University of California, San Diego’s Machine Perception Laboratory, as “almost like having a remote control built into your face.”

The research falls under the umbrella of artificial intelligence — the creation of machines that can behave like humans — and Whitehill envisions a not-so-far-away future when robots will replace people as teachers, at least in areas that require a lot of repetition, such as foreign language and math drills. He doesn’t, however, foresee them ever replacing philosophy teachers, for example.

“Mundane subjects or those that try a teacher’s patience would be good for robot teachers,” Whitehill said. “Robots have infinite patience.”

Whitehill’s research builds on automated software developed at the lab that can detect facial expressions. The software is part of the lab’s focus on using robots to improve education and boost robot-human interaction.

“Computers are already powerful enough to sustain useful robots that interact and assist humans in everyday life,” said Javier Movellan, the lab’s director. “Now progress requires a scientific shakedown in goals and methods not unlike the cognitive revolution that occurred 40 years ago.”

Toward that end, Whitehill has demonstrated that humans can use their face as a “remote control” to speed up or slow down a recorded video lecture. When a facial expression indicates that a student is puzzled or confused by the material, the recording slows to a below-average speed. On the other hand, if an expression indicates to the robot that the student understands the material, the lecture speeds up and moves to the next topic more quickly.

Eventually, the technology could be installed in a robot such as RUBI, a three-foot-tall robot with a screen on its stomach that the lab created to teach songs, colors and shapes to preschool students at the university. So far, however, the detection abilities of Whitehill’s program are somewhat limited, he said.

In a recent pilot study, Whitehill programmed the computer so that two simple expressions could be used to control the video player: a smile for fast-forward and a nose wrinkle for rewind. While a smile and a nose wrinkle don’t necessarily indicate comprehension or confusion, respectively, the working system showed that using facial expressions as a remote-control signal to a robot is feasible. Moreover, the study bore out two of the researchers’ major assertions: the facial expressions people make while watching recorded lectures can be used to predict both a student’s preferred viewing speed and how difficult the student finds the lecture at each moment.
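
For the technically curious, a minimal Python sketch of that two-expression control scheme might look like the following. The per-frame scores, function name and threshold are illustrative assumptions, not the lab’s actual code.

```python
# A toy version of the pilot-study control scheme described above:
# smile -> fast-forward, nose wrinkle -> rewind, neutral face -> play.
# The 0.0-1.0 scores stand in for a detector's per-frame outputs;
# the threshold of 0.5 is an assumed value.

def playback_command(smile_score, wrinkle_score, threshold=0.5):
    """Map two per-frame expression scores to a playback command."""
    if smile_score >= threshold and smile_score >= wrinkle_score:
        return "fast_forward"
    if wrinkle_score >= threshold:
        return "rewind"
    return "play"  # relaxed face: normal speed and pitch

print(playback_command(0.9, 0.2))  # fast_forward
print(playback_command(0.1, 0.7))  # rewind
print(playback_command(0.1, 0.1))  # play
```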

“So, in later research, when the computer recognizes that a student is perceiving a lecture to be really hard or perceives that the user wants the robot to slow down, it will accommodate,” Whitehill said.

In some ways, the idea of using computers as teachers is an extension of the growing popularity of distance learning, whereby students take online, audio or video classes, live or recorded, instead of physically attending a traditional classroom. The classes usually rely on video conferencing or internet-based programs, and students have very limited, if any, face-to-face interaction with teachers. The trend’s popularity has spread from colleges to earlier grades.

Robot teachers able to perceive facial expressions could sidestep one of the biggest criticisms of this style of learning: that communication between teachers and students becomes confusing and awkward because teachers have a hard time gauging performance without seeing students’ faces.

“If I’m a student with a robot teacher and I can say, ‘Yeah, speed up, come on, I understand you,’ or ‘Hey, I’m losing you, slow down a second,’ then that’s really good, really helpful,” Whitehill said. “If I’m completely puzzled and yet the robot keeps giving me more information, that’s not going to be very useful.”

The study and accompanying research papers were accepted and presented at two peer-reviewed academic conferences last month. The next step is to improve the robot’s ability to recognize a wider array of expressions. Ultimately, Whitehill said, the research will be personalized, and “user specific models” will be trained to know whether a lecture should be sped up or slowed down based on one particular student’s unique expressions.

“It’s not going to revolutionize the world, but it’s promising,” Whitehill, 28, said.

A student himself, Whitehill had his moment of inspiration while trying to take in a three-hour lecture via video.

“It was very boring,” he said. “I was tuning in and out and I thought how much I wished I could get through it a bit quicker.”

Familiar with the lab’s expression recognition software and the RUBI (Robot Using Bayesian Inference) project, he reasoned that the technology could be used to control playback speeds and instigated the study.

Here’s how it worked: Eight subjects watched an assortment of short university lectures on math, physics and philosophy, among other subjects, while the detector software measured their facial muscle movements in real time. At the same time, the students moved a lever up and down, moment by moment, to indicate how difficult the material was to understand. Whitehill then used the lab’s software to compare the system’s estimates of difficulty with the levels the students themselves reported, and found that on average it was right 40 percent of the time. Statistically speaking, those are significant and promising results, according to university experts.
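
To illustrate that comparison step, here is a small Python sketch that lines up a detector’s moment-by-moment difficulty estimates with the lever readings. The article gives only an accuracy-style figure, so the Pearson correlation below is an assumed stand-in for the paper’s actual metric, and the sample data is invented.

```python
# Sketch of the evaluation step: measure agreement between the
# detector's per-second difficulty estimates and the difficulty
# the student reported with the lever. Pearson correlation is an
# illustrative choice, not necessarily the statistic the paper used.

def pearson(xs, ys):
    """Pearson correlation between two equal-length score sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Invented data: predicted difficulty vs. lever position, per second.
predicted = [0.2, 0.4, 0.7, 0.8, 0.6, 0.3]
reported = [0.1, 0.5, 0.6, 0.9, 0.5, 0.2]
print(round(pearson(predicted, reported), 2))
```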

“Jake’s research is exemplary of the kind of education-oriented research that we hope will become an increasing part of the center’s research output,” said Gary Cottrell, a computer science professor at UCSD and director of the university’s Temporal Dynamics of Learning Center. The center, which is sponsored by the National Science Foundation, is backing Whitehill’s research, he said.

Since the study, Whitehill has used about 20 expressions, such as raised eyebrows, squinted eyes and pursed lips, to train the robot to speed up or slow down, depending on the expression it recognizes. The web cam locks onto the subject’s face, triggering the “face detector” technology. Onscreen, the 20 expressions are assigned frames with bar graphs that indicate when the particular expression is detected. As a smile registers, for example, the smile bar graph rises, similar to the way a hospital vital signs monitor rises and falls.
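
One plausible way to fold that many smoothed expression channels into a single playback speed is sketched below. The channel names, weights, smoothing factor and speed bounds are all illustrative assumptions rather than details of the lab’s software, which trains models instead of relying on hand-set weights.

```python
# Toy controller: smooth per-frame expression scores with an
# exponential moving average (akin to the rising and falling bar
# graphs described above) and combine them into one speed multiplier.
# All names, weights and bounds here are assumptions for illustration.

WEIGHTS = {
    "smile": 0.8,        # signs of ease push the speed up
    "brow_raise": 0.2,
    "squint": -0.5,      # signs of effort pull the speed down
    "lip_purse": -0.3,
}

class SpeedController:
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # smoothing factor
        self.smoothed = {}   # last smoothed score per channel

    def update(self, scores):
        """Fold one frame of detector scores into a speed multiplier."""
        speed = 1.0
        for name, weight in WEIGHTS.items():
            raw = scores.get(name, 0.0)
            prev = self.smoothed.get(name, raw)
            val = self.alpha * raw + (1 - self.alpha) * prev
            self.smoothed[name] = val
            speed += weight * val
        return max(0.5, min(2.0, speed))  # clamp to 0.5x-2.0x playback

ctrl = SpeedController()
print(ctrl.update({"smile": 0.9}))                 # above 1.0: speed up
print(ctrl.update({"squint": 0.8, "smile": 0.0}))  # easing back down
```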

Whitehill and his colleagues believe that eventually advanced robot teachers programmed to respond to a particular student’s unique expressions, gestures and even moods could relieve teachers of perfunctory duties and allow them more time to delve into more nuanced areas of study, such as history and philosophy. Next targets for RUBI, for example, include being able to point, hold hands or hug the children.

Several educators interviewed for this story said they had no real concerns about computerized teachers playing a growing role in the classroom, but added that robots could never replace flesh-and-blood teachers.

Whitehill agreed.

“Only a human teacher can put their heart into it,” he said. “A robot can never really care.”

Please contact Darryn Bennett directly at darryn.bennett@voiceofsandiego.org with your thoughts, ideas, personal stories or tips. Or set the tone of the debate with a letter to the editor.
