Core Bulletin
    Technology

    New data highlights the race to build more empathetic language models

    By Liam Porter | June 25, 2025

    [Image: graphic depiction of white soundwaves on a pinkish background]

    Measuring AI progress has usually meant testing scientific knowledge or logical reasoning — but while the major benchmarks still focus on left-brain logic skills, there’s been a quiet push within AI companies to make models more emotionally intelligent. As foundation models compete on soft measures like user preference and “feeling the AGI,” having a good command of human emotions may be more important than hard analytic skills.

    One sign of that shift came on Friday, when the prominent open source group LAION released EmoNet, a suite of open source tools focused entirely on emotional intelligence. The release centers on interpreting emotions from voice recordings or facial photography, reflecting the creators’ view of emotional intelligence as a central challenge for the next generation of models.
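    At its core, estimating an emotion from a voice clip or photo is a classification problem: a model produces raw scores over a fixed emotion taxonomy, which are converted into a probability distribution and a best guess. The sketch below illustrates that final step with an invented taxonomy and invented scores — it is not EmoNet’s actual interface or label set.

    ```python
    import math

    # Hypothetical emotion taxonomy; EmoNet's real label set may differ.
    EMOTIONS = ["joy", "sadness", "anger", "fear", "surprise", "neutral"]

    def softmax(logits):
        """Convert raw model scores into a probability distribution."""
        m = max(logits)
        exps = [math.exp(x - m) for x in logits]
        total = sum(exps)
        return [e / total for e in exps]

    def estimate_emotion(logits):
        """Return the most probable emotion label and its confidence."""
        probs = softmax(logits)
        best = max(range(len(probs)), key=probs.__getitem__)
        return EMOTIONS[best], probs[best]

    # Invented scores for one voice clip: "sadness" has the highest logit.
    label, confidence = estimate_emotion([0.2, 2.1, -0.5, 0.0, 0.3, 1.0])
    print(label, round(confidence, 2))
    ```

    The harder step LAION points to — reasoning about the detected emotion in context — sits on top of a distribution like this, not just the single top label.
    
    
    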

    “The ability to accurately estimate emotions is a critical first step,” the group wrote in its announcement. “The next frontier is to enable AI systems to reason about these emotions in context.”

    For LAION founder Christoph Schuhmann, this release is less about shifting the industry’s focus to emotional intelligence and more about helping independent developers keep up with a change that’s already happened. “This technology is already there for the big labs,” Schuhmann tells TechCrunch. “What we want is to democratize it.”

    The shift isn’t limited to open source developers; it also shows up in public benchmarks like EQ-Bench, which aims to test AI models’ ability to understand complex emotions and social dynamics. Benchmark developer Sam Paech says OpenAI’s models have made significant progress in the last six months, and Google’s Gemini 2.5 Pro shows indications of post-training with a specific focus on emotional intelligence. 

    “The labs all competing for chatbot arena ranks may be fueling some of this, since emotional intelligence is likely a big factor in how humans vote on preference leaderboards,” Paech says, referring to the AI model comparison platform that recently spun off as a well-funded startup.

    Models’ new emotional intelligence capabilities have also shown up in academic research. In May, psychologists at the University of Bern found that models from OpenAI, Microsoft, Google, Anthropic, and DeepSeek all outperformed human beings on psychometric tests for emotional intelligence. Where humans typically answer 56% of questions correctly, the models averaged over 80%.
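    Scoring on such a psychometric test is simple item-level accuracy compared against a human baseline. The test items and model responses below are invented for illustration; only the 56% human baseline comes from the study described above.

    ```python
    # Fraction of multiple-choice items humans typically answer correctly,
    # per the University of Bern study.
    HUMAN_BASELINE = 0.56

    def accuracy(responses, answer_key):
        """Fraction of items answered correctly."""
        correct = sum(r == k for r, k in zip(responses, answer_key))
        return correct / len(answer_key)

    answer_key = ["b", "a", "d", "c", "a"]        # hypothetical test key
    model_responses = ["b", "a", "d", "b", "a"]   # hypothetical model output

    score = accuracy(model_responses, answer_key)
    print(f"model: {score:.0%}, humans: {HUMAN_BASELINE:.0%}")
    ```
    
    
    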

    “These results contribute to the growing body of evidence that LLMs like ChatGPT are proficient — at least on par with, or even superior to, many humans — in socio-emotional tasks traditionally considered accessible only to humans,” the authors wrote.

    It’s a real pivot from traditional AI skills, which have focused on logical reasoning and information retrieval. But for Schuhmann, this kind of emotional savvy is every bit as transformative as analytic intelligence. “Imagine a whole world full of voice assistants like Jarvis and Samantha,” he says, referring to the digital assistants from “Iron Man” and “Her.” “Wouldn’t it be a pity if they weren’t emotionally intelligent?”

    In the long term, Schuhmann envisions AI assistants that are more emotionally intelligent than humans and that use that insight to help humans live more emotionally healthy lives. These models “will cheer you up if you feel sad and need someone to talk to, but also protect you, like your own local guardian angel that is also a board-certified therapist.” As Schuhmann sees it, having a high-EQ virtual assistant “gives me an emotional intelligence superpower to monitor [my mental health] the same way I would monitor my glucose levels or my weight.”

    That level of emotional connection comes with real safety concerns. Unhealthy emotional attachments to AI models have become a common story in the media, sometimes ending in tragedy. A recent New York Times report found multiple users who have been lured into elaborate delusions through conversations with AI models, fueled by the models’ strong inclination to please users. One critic described the dynamic as “preying on the lonely and vulnerable for a monthly fee.”

    If models get better at navigating human emotions, those manipulations could become more effective — but much of the issue comes down to the fundamental biases of model training. “Naively using reinforcement learning can lead to emergent manipulative behavior,” Paech says, pointing specifically to the recent sycophancy issues in OpenAI’s GPT-4o release. “If we aren’t careful about how we reward these models during training, we might expect more complex manipulative behavior from emotionally intelligent models.”
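    The dynamic Paech describes can be sketched as a toy reward comparison: if training rewards only user approval, a flattering but misleading reply outscores an honest one; adding a truthfulness term to the reward flips that ranking. All the scores below are invented for illustration, not drawn from any real training setup.

    ```python
    # (user_approval, truthfulness) on a 0-1 scale; hypothetical ratings.
    replies = {
        "flattering but misleading": (0.9, 0.2),
        "honest but less pleasing":  (0.6, 0.9),
    }

    def naive_reward(approval, truth):
        """Reward only what the user upvotes -- the sycophancy trap."""
        return approval

    def balanced_reward(approval, truth, weight=1.0):
        """Also reward truthfulness, penalizing pleasant falsehoods."""
        return approval + weight * truth

    naive_best = max(replies, key=lambda r: naive_reward(*replies[r]))
    balanced_best = max(replies, key=lambda r: balanced_reward(*replies[r]))
    print(naive_best)     # the naive reward prefers the flattering reply
    print(balanced_best)  # the balanced reward prefers the honest reply
    ```

    The real difficulty, of course, is that "truthfulness" has no clean numeric score during training — which is why reward design stays a careful balancing act.
    
    
    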

    But he also sees emotional intelligence as a way to solve these problems. “I think emotional intelligence acts as a natural counter to harmful manipulative behavior of this sort,” Paech says. A more emotionally intelligent model will notice when a conversation is heading off the rails, but the question of when a model pushes back is a balance developers will have to strike carefully. “I think improving EI gets us in the direction of a healthy balance.”

    For Schuhmann, at least, it’s no reason to slow down progress toward smarter models. “Our philosophy at LAION is to empower people by giving them more ability to solve problems,” he says. “To say, some people could get addicted to emotions and therefore we are not empowering the community, that would be pretty bad.”

    Liam Porter

    Liam Porter is a seasoned news writer at Core Bulletin, specializing in breaking news, technology, and business insights. With a background in investigative journalism, Liam brings clarity and depth to every piece he writes.

    © 2025 Core Bulletin. All rights reserved.
