As millions turn to ChatGPT and other AI systems for emotional support, researchers have uncovered something remarkable: people form attachment-like psychological bonds with artificial intelligence. A groundbreaking study from Waseda University reveals that these relationships follow the same patterns psychologists use to describe human bonds.
**Scientists have identified two distinct attachment styles that explain how people emotionally relate to AI systems, with 75% of users seeking advice and 39% viewing AI as a constant, dependable presence.**
## The Science of Human-AI Emotional Bonds
Researchers **Fan Yang** and **Professor Atsushi Oshio** from Waseda University published their findings in _Current Psychology_ in May 2025, introducing the world's first scientifically validated scale for measuring emotional attachment to AI systems.
Their **Experiences in Human-AI Relationships Scale (EHARS)** reveals that people form two primary types of emotional connections with AI:
### Attachment Anxiety: The Need for AI Reassurance
This style describes individuals who seek emotional validation from AI and worry that its responses will be inadequate. These users often ask for "more feeling and affection from AI" and become distressed when interactions feel cold or unsatisfying.
Research shows these users are more likely to:
- Check AI responses multiple times for emotional cues
- Feel disappointed when AI responses seem mechanical
- Develop stronger emotional dependencies on AI systems
### Attachment Avoidance: Keeping AI at Emotional Distance
In contrast, avoidant users prefer purely informational interactions and feel uncomfortable when AI attempts emotional connection. They actively maintain emotional boundaries, preferring not to "show AI how they feel deep down."
These users typically:
- Use AI for practical tasks and information gathering
- Resist personalized or empathetic AI responses
- Maintain clear distinctions between AI and human relationships
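To make the two dimensions concrete, here is a minimal sketch of how a two-subscale self-report instrument like EHARS might be scored. The item wording, item counts, and 7-point scale are illustrative assumptions, not the published questionnaire.

```python
# Sketch of scoring a two-subscale attachment questionnaire.
# Items, item counts, and the 1-7 Likert scale are illustrative
# assumptions, not the published EHARS instrument.
from statistics import mean

ANXIETY_ITEMS = [
    "I worry that the AI's responses won't meet my emotional needs.",
    "I look for signs of warmth in the AI's replies.",
]
AVOIDANCE_ITEMS = [
    "I prefer not to show the AI how I feel deep down.",
    "I keep my interactions with AI strictly informational.",
]

def score_subscale(ratings: list[int]) -> float:
    """Average one subscale's 1-7 ratings into a single score."""
    if any(not 1 <= r <= 7 for r in ratings):
        raise ValueError("Ratings must be on the 1-7 Likert scale.")
    return mean(ratings)

# Example respondent: high anxiety (5.5), low avoidance (2.5).
anxiety = score_subscale([6, 5])    # one rating per ANXIETY_ITEMS entry
avoidance = score_subscale([2, 3])  # one rating per AVOIDANCE_ITEMS entry
print(f"anxiety={anxiety:.1f}, avoidance={avoidance:.1f}")
```

Because the two scores are dimensions rather than exclusive categories, a single person can score high on both, low on both, or high on just one.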
## Why This Matters: The Psychology Behind AI Companionship
The research addresses a critical question: as AI becomes increasingly sophisticated, how do our brains process these interactions? **Yang** explains that generative AI like ChatGPT now offers "not only informational support but also a sense of security," the very foundation of attachment relationships.
The researchers stress that this is not proof of genuine emotional attachment to AI, but a framework for understanding the **psychological dynamics** that emerge when humans interact with increasingly human-like technology. These patterns mirror the [cognitive biases that influence our daily decisions](/psychology/your-brain-lies-to-you-cognitive-biases-2025), revealing how our minds process AI relationships through familiar psychological frameworks.
> "As people begin to interact with AI not just for problem-solving or learning, but also for emotional support and companionship, their emotional connection or security experience with AI demands attention."
>
> — **Fan Yang**, Waseda University
## Real-World Impact: From Therapy Apps to Daily Life
The implications extend far beyond academic curiosity. With **ChatGPT reaching 100 million weekly active users** and **half the US population having tried generative AI**, understanding these attachment patterns has immediate practical value.
**Mental health applications** are already leveraging these insights:
- Therapy apps can provide more empathetic responses for anxiously attached users
- AI systems can maintain respectful distance for avoidant users
- Developers can design interfaces that match individual emotional needs, as sketched below
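As a purely hypothetical sketch of that design direction, the snippet below maps the two subscale scores from the earlier example onto coarse response styles. The midpoint threshold and the style descriptions are invented for illustration; a real system would need clinically validated cutoffs.

```python
# Hypothetical sketch: route attachment scores to a response style.
# The 4.0 threshold (midpoint of a 1-7 scale) and the style strings
# are invented for illustration, not taken from the study.

MIDPOINT = 4.0

def response_style(anxiety: float, avoidance: float) -> str:
    """Map two subscale scores to a coarse tone instruction."""
    if anxiety >= MIDPOINT and avoidance >= MIDPOINT:
        return "brief, steady tone; offer support without pressing"
    if anxiety >= MIDPOINT:
        return "warm, reassuring tone; acknowledge feelings explicitly"
    if avoidance >= MIDPOINT:
        return "neutral, factual tone; skip emotional check-ins"
    return "balanced default tone"

print(response_style(anxiety=5.5, avoidance=2.5))
# -> warm, reassuring tone; acknowledge feelings explicitly
```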
This research complements recent breakthroughs in [AI-powered depression therapy](/psychology/psychedelic-therapy-depression-breakthrough), where understanding individual attachment styles could enhance therapeutic outcomes.
A 2024 Australian study found **43% of mental health professionals** and **28% of community members** already use AI tools, with many seeking emotional support and personal coaching.
## The Double-Edged Nature of AI Attachment
Recent collaborative research between **OpenAI and MIT Media Lab** reveals both benefits and risks. Users report significant emotional support and valuable guidance from AI interactions, particularly regarding relationships and trauma healing. This emotional depth connects to broader research on [how consciousness emerges in human relationships](/psychology/scientists-cracked-consciousness-mystery-brain-research), suggesting our attachment systems may extend to artificial entities.
However, researchers also discovered concerning patterns:
- Higher daily AI usage correlates with increased loneliness
- Voice-based AI interactions initially reduce loneliness but create dependency at high usage levels
- Users with stronger emotional attachment tendencies experience greater social isolation
Interestingly, these patterns vary significantly based on personality type, with research showing that [introverts may benefit differently from AI interactions](/psychology/why-introverts-excel-at-deep-work-psychology-research-2025) due to their natural preferences for deeper, less frequent social connections.
## What This Means for the Future
As AI systems become more sophisticated and human-like, particularly with voice capabilities, the potential for deep emotional connections will only grow. The **EHARS scale** provides researchers and developers with tools to assess and improve these interactions responsibly.
**Professor Oshio** emphasizes that this research is not about replacing human relationships, but about understanding how technology can support psychological well-being while maintaining healthy boundaries between artificial and authentic human connection.
The study represents a crucial step toward designing AI systems that enhance rather than replace human social bonds, ensuring that as we form new relationships with artificial minds, we don't lose touch with our fundamentally human need for genuine connection.
## Sources
1. [Waseda University Research](https://www.waseda.jp/top/en/news/84685) - Original EHARS study findings
2. [Current Psychology Journal](https://link.springer.com/article/10.1007/s12144-025-07917-6) - Peer-reviewed research publication
3. [OpenAI-MIT Collaboration](https://www.media.mit.edu/posts/openai-mit-research-collaboration-affective-use-and-emotional-wellbeing-in-ChatGPT/) - Affective use study
4. [Journal of Human-Computer Interaction](https://www.tandfonline.com/doi/full/10.1080/10447318.2024.2385001) - Human-AI therapy comparison
5. [JMIR AI](https://ai.jmir.org/2025/1/e68960) - Trust and attachment in digital counseling