An electroencephalogram, or EEG, is a non-invasive brain monitoring test in which electrodes placed on the scalp record electrical signals for analysis by a computer. The EEG has been widely used to study swallowing, classify mental states, and diagnose neuropsychiatric disorders such as neurogenic pain and epilepsy, but some researchers believe it holds untapped potential.
In a paper ("Emotion Recognition With Machine Learning Using EEG Signals") published on the preprint server arXiv.org, a team from Texas Tech University, the University of Tabriz in Iran, and Akrham Hospital describes an AI system that recognizes emotions from EEG readings alone.
"Emotional states are associated with a wide variety of human emotions, thoughts and behaviors; therefore, they influence our ability to act rationally, in cases such as decision making, perception, and human intelligence," they wrote. "In recent years, emotion recognition systems based on EEG signals have been developed as a popular research topic among cognitive researchers."
EEG signals, the team notes, are challenging to analyze because they are nonlinear, somewhat random, and "buried in different noise sources." To reduce the noise, the researchers applied an average mean reference method and used wavelet transforms to decompose the signals into frequency bands for feature extraction.
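The two preprocessing steps can be sketched in a few lines. This is illustrative only: the paper's exact reference scheme and mother wavelet are not specified in the article, so the sketch below assumes a common average reference and a single-level Haar wavelet.

```python
def average_reference(channels):
    """Re-reference each channel by subtracting the mean across all
    channels at every time point (a common average reference)."""
    n = len(channels)
    refs = [sum(sample) / n for sample in zip(*channels)]
    return [[x - r for x, r in zip(ch, refs)] for ch in channels]

def haar_dwt(signal):
    """One level of a Haar discrete wavelet transform: split the signal
    into a low-frequency (approximation) and a high-frequency (detail)
    sub-band, which can then serve as features."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

# Two toy 4-sample "EEG channels" (made-up values)
eeg = [[1.0, 2.0, 3.0, 4.0], [3.0, 2.0, 1.0, 0.0]]
referenced = average_reference(eeg)
approx, detail = haar_dwt(referenced[0])
```

Real pipelines would use a dedicated library and deeper decompositions, but the shape of the computation is the same: remove a shared reference, then split each channel into frequency sub-bands.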
The researchers obtained DEAP, an annotated corpus for emotion analysis using physiological signals, to train their emotion classifiers. It includes EEG data from 32 participants who watched 40 one-minute music videos and rated each on a scale of 1 to 9 in several categories, including valence (a given video's intrinsic attractiveness or averseness), arousal (the intensity of the physiological response it provoked), dominance, and liking. Ratings above 4.5 were labeled "high," while ratings below 4.5 were labeled "low."
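The labeling rule described above amounts to thresholding each 1-to-9 self-assessment rating at 4.5. A minimal sketch (the handling of a rating of exactly 4.5 is an assumption, since the article only covers values above and below it):

```python
def binarize(rating, threshold=4.5):
    """Map a 1-9 self-assessment rating to a binary class label."""
    return "high" if rating > threshold else "low"

# Made-up example ratings for one category, e.g. valence
ratings = [2.0, 4.5, 7.3, 9.0]
labels = [binarize(r) for r in ratings]
```

This turns the regression-style ratings into a two-class problem, which is what the classifiers in the next step are trained on.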
Using these data, the paper's authors trained three types of classifiers to distinguish between emotions: a k-nearest neighbors algorithm, a support vector machine, and an artificial neural network. All three were fed EEG signals from 10 electrode channels near the left and right frontal lobes, regions closely associated with positive and negative emotions. The best-performing classifier of the three achieved 91.3 percent accuracy for arousal and 91.1 percent accuracy for valence, both in the beta frequency band.
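Of the three classifiers, k-nearest neighbors is the simplest to show concretely. The sketch below uses made-up 2-D feature vectors rather than real EEG features, and is not the paper's implementation, only an illustration of the voting idea:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k closest training
    points. `train` is a list of (feature_vector, label) pairs."""
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy training set: two "low" points near the origin, three "high" points
train = [([0.0, 0.0], "low"), ([0.1, 0.2], "low"),
         ([1.0, 1.0], "high"), ([0.9, 1.1], "high"), ([1.2, 0.8], "high")]
pred = knn_predict(train, [1.0, 0.9])  # its nearest neighbors are "high"
```

In the paper's setting the feature vectors would instead be statistics of the wavelet sub-bands from the 10 frontal channels, with "high"/"low" arousal or valence as the labels.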
The researchers suggest that ensemble learning, an AI paradigm in which multiple machine learning systems are combined to produce a single prediction, could further enhance the models' performance. But they note that even the current accuracy is higher than that of existing algorithms applied to the DEAP data set.
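One of the simplest forms of the ensemble idea mentioned above is majority voting across the individual classifiers. The per-model predictions below are stand-in values, not the paper's actual outputs:

```python
from collections import Counter

def vote(predictions):
    """Return the label predicted by the most classifiers."""
    return Counter(predictions).most_common(1)[0][0]

# Imaginary predictions from the three models for one EEG trial
knn_pred, svm_pred, ann_pred = "high", "low", "high"
ensemble_pred = vote([knn_pred, svm_pred, ann_pred])
```

Because the models make different kinds of errors, the combined vote can correct a single model's mistake, which is why ensembles often outperform their best member.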
"Emotion recognition studies using EEG signals improve brain-computer interface (BCI) systems as an effective subject for clinical applications and human social interactions," the researchers say. "Systems such as these can be used to investigate emotional states when considering natural aspects of emotions to highlight therapeutic agents for mental disorders such as autism spectrum disorder (ASD), attention deficit hyperactivity disorder (ADHD), and anxiety disorder."