Artificial intelligence on the doorstep of your soul: how far can machines go in understanding human emotions?
The group behind Stable Diffusion's training data plans to open-source AI that recognizes emotions.
In 2019, Amazon updated its Alexa assistant with a feature that let it detect when a user was probably annoyed. For example, if Alexa was playing the wrong song and the user said “No, Alexa” in an irritated tone, the assistant might apologize and ask for clarification.
Now the group that created one of the datasets used to train the Stable Diffusion text-to-image model wants to put similar emotion recognition capabilities in the hands of every developer, for free.
This week, the non-profit organization LAION, which creates datasets for training generative AI models such as Stable Diffusion, announced the Open Empathic project. Its goal is to “equip open AI systems with empathy and emotional intelligence,” the group said in a statement.
The LAION team wants to fill a gap in the open-source community, where emotion AI has been largely ignored. Through Open Empathic, LAION is enlisting volunteers to contribute audio recordings to a database that can be used to build AI that understands human emotions.
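For readers curious what tone-based emotion recognition of this sort can look like under the hood, here is a minimal, purely illustrative sketch in Python. It is not LAION's or Amazon's pipeline: the synthetic waveforms, the calm/annoyed labels, and the MFCC-plus-logistic-regression setup are all assumptions made for the example; a real system would be trained on labeled human recordings like the ones Open Empathic aims to collect.

```python
# Illustrative sketch only: classify short audio clips as "calm" vs. "annoyed"
# from tone features. Synthetic data stands in for real labeled recordings.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

SR = 16_000  # assumed sample rate in Hz


def clip_features(waveform: np.ndarray, sr: int = SR) -> np.ndarray:
    """Summarize a clip as the mean and std of its MFCC coefficients."""
    mfcc = librosa.feature.mfcc(y=waveform, sr=sr, n_mfcc=13)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])


def synthetic_clip(pitch_hz: float, jitter: float, seconds: float = 1.0) -> np.ndarray:
    """Stand-in for a real recording: a noisy tone whose pitch and jitter we vary."""
    t = np.linspace(0, seconds, int(SR * seconds), endpoint=False)
    rng = np.random.default_rng()
    return np.sin(2 * np.pi * pitch_hz * t) + jitter * rng.standard_normal(t.size)


# Hypothetical two-class setup: "calm" clips have a lower pitch and less noise,
# "annoyed" clips a higher pitch and more noise.
clips = [synthetic_clip(120, 0.05) for _ in range(50)] + \
        [synthetic_clip(260, 0.40) for _ in range(50)]
labels = np.array([0] * 50 + [1] * 50)  # 0 = calm, 1 = annoyed

X = np.stack([clip_features(c) for c in clips])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

The toy model separates the two synthetic classes easily; the hard part, and the part Open Empathic's crowdsourced recordings are meant to address, is getting labeled data that reflects how real people actually express emotion.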
LAION, which stands for “Large-scale Artificial Intelligence Open Network,” was founded in early 2021 by Christoph Schumann, a teacher from Germany, and several members of a Discord server for AI enthusiasts. LAION's main mission is to democratize AI research and development.
Beyond Amazon's efforts with Alexa, many companies are exploring AI that can recognize emotions. Most emotion recognition systems, however, rest on a shaky scientific foundation, and the central problem is bias: models trained on narrow or unrepresentative data tend to misread emotional expression across cultures, accents, and demographic groups.
The LAION team envisions useful applications of the technology in robotics, psychology, vocational training, education, and even gaming. However, they are also aware of the risks and challenges associated with using AI to understand emotions.
LAION is placing its faith in an open development process and welcomes outside researchers to scrutinize the work.
“We invite researchers to poke around, suggest changes, and identify problems,” said LAION co-founder Robert Kaczmarczyk. “And just as Wikipedia thrives on community input, Open Empathic is fueled by community engagement, ensuring transparency and security.”
Transparent? Sure. Safe? Time will tell.