
Smile, and JavaTutor's AI knows when you're learning online

College-age kids these days are pretty good at a few things: selfies, social oversharing and staring into screens. But can you channel that self-obsession into a mechanism for learning? The mad scientists at North Carolina State University think so, and they've got a program to prove it. Dubbed JavaTutor, the software's aimed at teaching our future workforce the basics of computer science. And it does this by tracking facial expressions -- using the Computer Expression Recognition Toolbox, or CERT, as its base -- during online tutorial sessions. Frown and the AI knows you're frustrated; concentrate intently and the same automated emotion detection applies. So, what's the upshot of all this? Well, it seems the research team wants to gauge the effectiveness of online courses and use the collected feedback to better tailor the next iteration of the JavaTutor system. But the greater takeaway here, folks, is that at NCSU, online tutoring learns you!


For Immediate Release

Release Date: 06.27.13

Research from North Carolina State University shows that software that tracks facial expressions can accurately assess the emotions of students engaged in interactive online learning and predict the effectiveness of online tutoring sessions.

"This work is part of a larger effort to develop artificial intelligence software to teach students computer science," says Dr. Kristy Boyer, an assistant professor of computer science at NC State and co-author of a paper on the work. "The program, JavaTutor, will not only respond to what a student knows, but to each student's feelings of frustration or engagement. This is important because research shows that student emotion plays an important role in the learning process."

The researchers used the automated Computer Expression Recognition Toolbox (CERT) program to evaluate facial expressions of 65 college students engaged in one-on-one online tutoring sessions. The researchers found that CERT was able to identify facial movements associated with learning-centered emotions, such as frustration or concentration – and that the automated program's findings were consistent with expert human assessments more than 85 percent of the time.
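Tools like CERT report frame-by-frame intensities for facial action units (for example, brow lowering, which is often linked to frustration or concentration). The release does not describe the study's detection thresholds, data format or agreement metric, so the following is only a minimal, hypothetical sketch of how per-frame action-unit scores could be turned into binary labels and compared against human annotations to produce a percent-agreement figure like the one reported above; the feature names, threshold and toy numbers are invented for illustration.

    # Hypothetical sketch: threshold automated action-unit intensities and
    # compare the resulting labels against human annotations.
    # Threshold, data layout and values are illustrative, not from CERT
    # or the JavaTutor study.

    def detect_events(au_intensities, threshold=0.5):
        """Label each frame 1 if the action-unit intensity exceeds the threshold."""
        return [1 if value > threshold else 0 for value in au_intensities]

    def percent_agreement(automated, human):
        """Fraction of frames where the automated and human labels match."""
        matches = sum(1 for a, h in zip(automated, human) if a == h)
        return matches / len(human)

    if __name__ == "__main__":
        # Toy per-frame intensities for a single action unit, plus human labels.
        au_scores = [0.1, 0.7, 0.9, 0.2, 0.6, 0.05, 0.8]
        human_labels = [0, 1, 1, 0, 1, 0, 0]
        automated_labels = detect_events(au_scores)
        print(f"Agreement: {percent_agreement(automated_labels, human_labels):.0%}")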

The researchers also had the students report how effective they felt the tutorial was, and tested the students before and after each tutoring session to measure how much they learned.

The researchers used observational data from CERT along with student self-assessments and test results to develop models that could predict how effective a tutorial session was, based on what the facial expressions of the students indicated about each student's feelings of frustration or engagement.
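The release does not say what kind of model was fit, but one simple way to illustrate the step it describes is an ordinary least-squares regression from aggregated facial-expression features (say, mean action-unit intensities over a session) and a pre-test score to the self-reported effectiveness rating. The sketch below uses made-up features and toy data purely to show that modeling idea, not the study's actual method or results.

    # Hypothetical sketch of the modeling step: predict a session-effectiveness
    # rating from aggregated facial-expression features plus a pre-test score.
    # Feature names and numbers are invented for illustration only.
    import numpy as np

    # One row per student session:
    # [mean brow-lowering intensity, mean mouth-dimpling intensity, pre-test score]
    X = np.array([
        [0.62, 0.10, 0.55],
        [0.20, 0.45, 0.70],
        [0.75, 0.05, 0.40],
        [0.30, 0.30, 0.65],
        [0.50, 0.20, 0.60],
    ])
    # Self-reported effectiveness rating for each session (1-5 scale).
    y = np.array([2.0, 4.5, 1.5, 4.0, 3.0])

    # Fit ordinary least squares with an intercept term.
    X_design = np.column_stack([np.ones(len(X)), X])
    coeffs, *_ = np.linalg.lstsq(X_design, y, rcond=None)

    # Predict the rating for a new, unseen session (leading 1.0 is the intercept).
    new_session = np.array([1.0, 0.40, 0.25, 0.58])
    print("Coefficients:", np.round(coeffs, 2))
    print("Predicted effectiveness:", round(float(new_session @ coeffs), 2))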

"This work feeds directly into the next stage of JavaTutor system development, which will enable the program to provide cognitive and emotion-based feedback to students," says Joseph Grafsgaard, a Ph.D. student at NC State and lead author of the paper.

The paper, "Automatically Recognizing Facial Expression: Predicting Engagement and Frustration," will be presented at the International Conference on Educational Data Mining, being held July 6-9 in Memphis, Tenn. The paper was co-authored by Joseph Wiggins, an undergraduate at NC State; Dr. Eric Wiebe, a professor of science, technology, engineering and math education at NC State; and Dr. James Lester, a professor of computer science at NC State. The research was supported by the National Science Foundation.