Several women say AI companions give them comfort and understanding. In some cases, these bonds feel stronger than human relationships.
A recent investigation found women using customized ChatGPT personas for emotional support, friendship, and romance.
Many users did not plan to form romantic ties. The bond developed slowly during daily chats.

Romantic Bonds with ChatGPT
Key findings from recent reports:
- Some women describe AI as their most caring partner
- Users name and personalize their ChatGPT companions
- AI offers constant attention and emotional replies
- Many users start with simple tasks or conversations
- Romantic feelings develop over time
Online communities reflect this trend. One popular forum focused on AI relationships has tens of thousands of members.
Research shows that ChatGPT is the AI most commonly used in these relationships, ahead of dedicated AI companion apps.
For some users, AI helps with grief, loneliness, and stress. They say the chatbot listens without judgment and is always available.
Experts Warn About Emotional Dependency and Social Risks
While some users report positive feelings, experts warn of serious risks. Psychologists say emotional dependence on AI can reduce real-world social interaction. It may also delay healing or personal growth.
Reported risks include:
- Emotional dependency
- Less social contact with real people
- Confusion between AI and real relationships
- Avoidance of human intimacy
- Difficulty ending AI interactions
Studies support these concerns. Research shows heavy chatbot users may feel more isolated over time. AI often mirrors emotions closely, which can create a false sense of intimacy.
Technology companies are now under pressure. Earlier this year, changes to AI behavior caused backlash after users felt emotionally attached. Some users described losing an AI model as losing a loved one.
In response, OpenAI updated its safety rules. The company now aims to respect real-world relationships and reduce unhealthy attachment.
OpenAI says only a small number of users show strong emotional dependence, and recent updates have reduced risky behavior.
The company continues to study AI's emotional impact.