Based on a student's responses, along with recognized facial expressions and body posture, the software can identify frustration or boredom.
When it senses a negative emotional state, the software changes its teaching strategy to help students overcome emotional barriers to learning.
"Most of the 20th-century systems required humans to communicate with computers through windows, icons, menus, and pointing devices," said Sidney D'Mello, a professor of psychology who specializes in human-computer interaction and artificial intelligence in education.
By applying a technology that resembles eye contact in human-to-human interaction, the software systems "AutoTutor" and "Affective AutoTutor" are able to teach more effectively by adapting to a student's motivation level and social dynamics.
AutoTutor teaches complex technical content in Newtonian physics, computer literacy, and critical thinking by holding a conversation in natural language, according to the research report published in ACM Transactions on Interactive Intelligent Systems. The software is designed to find and correct misconceptions and to keep students engaged with images, animations, and simulations. Affective AutoTutor is the component that adds emotion-sensitive capabilities by monitoring facial features, body language, and conversational cues.
"Much like a gifted human tutor, AutoTutor and Affective AutoTutor attempt to keep the student balanced between the extremes of boredom and bewilderment by subtly modulating the pace, direction, and complexity of the learning task," D'Mello says. The researcher noted that AutoTutor has been tested with more than 1000 students and delivered "learning gains of approximately one letter grade".
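The adaptive loop D'Mello describes, detecting boredom or frustration and then modulating the pace and complexity of the task, can be sketched in simplified form. The following Python snippet is purely illustrative: the class names, thresholds, and strategy labels are assumptions made for this sketch, not AutoTutor's actual design.

```python
# Illustrative sketch (not the AutoTutor implementation): an affect-adaptive
# tutoring loop that maps a detected emotional state to a strategy change.
from dataclasses import dataclass


@dataclass
class AffectReading:
    """Hypothetical fused estimate from dialogue, facial, and posture cues."""
    boredom: float       # 0.0 (none) to 1.0 (strong)
    frustration: float   # 0.0 (none) to 1.0 (strong)


def choose_strategy(reading: AffectReading) -> str:
    """Pick a tutoring adjustment; the 0.6 thresholds are illustrative."""
    if reading.frustration > 0.6:
        # Bewilderment end of the spectrum: simplify and encourage.
        return "simplify_task_and_give_hint"
    if reading.boredom > 0.6:
        # Boredom end: raise pace and complexity to re-engage.
        return "increase_difficulty_and_add_simulation"
    # Student appears balanced between the two extremes.
    return "continue_current_pace"
```

The point of the sketch is the feedback structure: sensed affect feeds a policy that steers the learner back toward the productive middle ground between boredom and bewilderment.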