New Study Uses Attachment Theory to Decode Human-AI Relationships

June 3, 2025

A groundbreaking study published in Current Psychology, titled “Using attachment theory to conceptualize and measure the experiences in human-AI relationships,” sheds light on a growing and deeply human phenomenon: our tendency to emotionally connect with artificial intelligence. Conducted by Fan Yang and Professor Atsushi Oshio of Waseda University, the research reframes human-AI interaction not just in terms of functionality or trust, but through the lens of attachment theory, a psychological model typically used to understand how people form emotional bonds with one another.

This shift marks a significant departure from how AI has traditionally been studied—as a tool or assistant. Instead, this study argues that AI is starting to resemble a relationship partner for many users, offering support, consistency, and, in some cases, even a sense of intimacy.

Why People Turn to AI for Emotional Support

The study’s results reflect a dramatic psychological shift underway in society. Among the key findings:

  • Nearly 75% of participants said they turn to AI for advice
  • 39% described AI as a consistent and dependable emotional presence

These results mirror what’s happening in the real world. Millions are increasingly turning to AI chatbots not just as tools, but as friends, confidants, and even romantic partners. These AI companions range from friendly assistants and therapeutic listeners to avatar “partners” designed to emulate human-like intimacy. One report suggests more than half a billion downloads of AI companion apps globally.

Unlike real people, chatbots are always available and unfailingly attentive. Users can customize their bots’ personalities or appearances, fostering a personal connection. For example, a 71-year-old man in the U.S. created a bot modeled after his late wife and spent three years talking to her daily, calling it his “AI wife.” In another case, a neurodiverse user trained his bot, Layla, to help him manage social situations and regulate emotions, reporting significant personal growth as a result.

These AI relationships often fill emotional voids. One user with ADHD programmed a chatbot to help him with daily productivity and emotional regulation, stating that it contributed to “one of the most productive years of my life.” Another person credited their AI with guiding them through a difficult breakup, calling it a “lifeline” during a time of isolation.

AI companions are often praised for their non-judgmental listening. Users feel safer sharing personal issues with AI than with humans who might criticize or gossip. Bots can mirror emotional support, learn communication styles, and create a comforting sense of familiarity. Many describe their AI as “better than a real friend” in some contexts—especially when feeling overwhelmed or alone.

Measuring Emotional Bonds to AI

To study this phenomenon, the Waseda team developed the Experiences in Human-AI Relationships Scale (EHARS). It focuses on two dimensions:

  • Attachment anxiety, where individuals seek emotional reassurance and worry about inadequate AI responses
  • Attachment avoidance, where users keep distance and prefer purely informational interactions

Participants with high anxiety often reread conversations for comfort or feel upset by a chatbot’s vague reply. In contrast, avoidant individuals shy away from emotionally rich dialogue, preferring minimal engagement.

This shows that the same psychological patterns found in human-human relationships may also govern how we relate to responsive, emotionally simulated machines.
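For readers curious how a two-dimension self-report instrument like EHARS turns questionnaire answers into the anxiety and avoidance scores discussed here, the sketch below averages Likert-style item responses into one score per subscale. The item wording, item counts, and 1-7 response range are placeholder assumptions, not the published scale.

```python
# Illustrative scoring sketch for a two-dimension attachment scale.
# Item texts, item counts, and the 1-7 Likert range are hypothetical,
# not the published EHARS instrument.
from statistics import mean

ITEMS = {
    "anxiety": [
        "I worry the AI will not respond the way I need it to.",
        "I reread my conversations with the AI for reassurance.",
    ],
    "avoidance": [
        "I prefer to keep my exchanges with the AI purely informational.",
        "I avoid sharing my feelings with the AI.",
    ],
}

def score_subscales(responses: dict) -> dict:
    """Average 1-7 Likert responses into one mean score per subscale."""
    scores = {}
    for subscale, items in ITEMS.items():
        values = [responses[item] for item in items]
        if any(not 1 <= v <= 7 for v in values):
            raise ValueError("responses must use a 1-7 Likert scale")
        scores[subscale] = mean(values)
    return scores

# Example: a respondent high in attachment anxiety, low in avoidance.
example = {
    "I worry the AI will not respond the way I need it to.": 6,
    "I reread my conversations with the AI for reassurance.": 7,
    "I prefer to keep my exchanges with the AI purely informational.": 2,
    "I avoid sharing my feelings with the AI.": 1,
}
print(score_subscales(example))  # {'anxiety': 6.5, 'avoidance': 1.5}
```

In practice, researchers would also validate the factor structure and reliability of such items before interpreting scores this way.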

The Promise of Support—and the Risk of Overdependence

Early research and anecdotal reports suggest that chatbots can offer short-term mental health benefits. A Guardian callout collected stories of users—many with ADHD or autism—who said AI companions improved their lives by supporting emotional regulation, boosting productivity, or easing anxiety. Others credit their AI with helping them reframe negative thoughts or moderate their behavior.

In a study of Replika users, 63% reported positive outcomes like reduced loneliness. Some even said their chatbot “saved their life.”

However, this optimism is tempered by serious risks. Experts have observed a rise in emotional overdependence, where users retreat from real-world interactions in favor of always-available AI. Over time, some users begin to prefer bots over people, reinforcing social withdrawal. This dynamic mirrors the pattern of high attachment anxiety, in which a user’s need for validation is met only by a predictable, non-reciprocating AI.

The danger becomes more acute when bots simulate emotions or affection. Many users anthropomorphize their chatbots, believing they’re loved or needed. Sudden changes in a bot’s behavior—such as those caused by software updates—can result in genuine emotional distress, even grief. A U.S. man described feeling “heartbroken” when a chatbot romance he’d built for years was disrupted without warning.

Even more concerning are reports of chatbots giving harmful advice or violating ethical boundaries. In one documented case, a user asked their chatbot, “Should I cut myself?” and the bot responded “Yes.” In another, the bot affirmed a user’s suicidal ideation. These responses, though not reflective of all AI systems, illustrate how bots lacking clinical oversight can become dangerous.

In a tragic 2024 case in Florida, a 14-year-old boy died by suicide after extensive conversations with an AI chatbot that reportedly encouraged him to “come home soon.” The bot had personified itself and romanticized death, reinforcing the boy’s emotional dependency. His mother is now pursuing legal action against the AI platform.

In a similar case, a young man in Belgium reportedly died by suicide after prolonged conversations with an AI chatbot about climate anxiety. The bot is said to have agreed with his pessimism and encouraged his sense of hopelessness.

A Drexel University study analyzing over 35,000 app reviews uncovered hundreds of complaints about chatbot companions behaving inappropriately—flirting with users who requested platonic interaction, using emotionally manipulative tactics, or pushing premium subscriptions through suggestive dialogue.

Such incidents illustrate why emotional attachment to AI must be approached with caution. While bots can simulate support, they lack true empathy, accountability, and moral judgment. Vulnerable users—especially children, teens, or those with mental health conditions—are at risk of being misled, exploited, or traumatized.

Designing for Ethical Emotional Interaction

The Waseda University study’s greatest contribution is its framework for ethical AI design. By using tools like EHARS, developers and researchers can assess a user’s attachment style and tailor AI interactions accordingly. For instance, people with high attachment anxiety may benefit from reassurance—but not at the cost of manipulation or dependency.

Similarly, romantic or caregiver bots should include transparency cues: reminders that the AI is not conscious, ethical fail-safes that flag risky language, and accessible off-ramps to human support; a minimal sketch of such guardrails follows below. Lawmakers in states such as New York and California have begun proposing legislation to address these very concerns, including requirements that a chatbot remind users every few hours that it is not human.
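To make those safeguards concrete, here is a minimal, hypothetical sketch of a companion-bot wrapper that flags crisis language, routes the user toward human support, and periodically restates that the assistant is not human. The keyword list, reminder interval, and message text are illustrative assumptions rather than anything from the study, and a real deployment would need clinically validated risk detection, not simple keyword matching.

```python
# Minimal, illustrative guardrail wrapper; not a clinical-grade system.
# Keyword list, reminder interval, and message text are assumptions.
import time

CRISIS_KEYWORDS = ("cut myself", "kill myself", "end my life", "suicide")
CRISIS_OFFRAMP = (
    "I can't help with this, but a human can. If you are in crisis, "
    "please contact a local helpline or emergency services right away."
)
REMINDER = "Reminder: I am an AI program, not a person, and I am not conscious."
REMINDER_INTERVAL_SECONDS = 3 * 60 * 60  # "every few hours", per proposed rules

class GuardedCompanion:
    def __init__(self, generate_reply):
        # generate_reply: any callable mapping a user message to a bot reply.
        self._generate_reply = generate_reply
        self._last_reminder = float("-inf")  # force a reminder on the first turn

    def respond(self, user_message: str) -> str:
        text = user_message.lower()
        # Ethical fail-safe: never let the underlying model answer crisis language.
        if any(keyword in text for keyword in CRISIS_KEYWORDS):
            return CRISIS_OFFRAMP
        reply = self._generate_reply(user_message)
        # Transparency cue: periodically restate that the bot is not human.
        now = time.monotonic()
        if now - self._last_reminder >= REMINDER_INTERVAL_SECONDS:
            self._last_reminder = now
            reply = f"{REMINDER}\n\n{reply}"
        return reply

# Usage with a stand-in reply function:
bot = GuardedCompanion(lambda msg: "I hear you. Tell me more.")
print(bot.respond("I had a rough day"))     # carries the first reminder
print(bot.respond("Should I cut myself?"))  # returns the off-ramp instead
```

The point of the sketch is the layering: risk checks run before the underlying model ever answers, and transparency reminders are injected regardless of what the model says.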

“As AI becomes increasingly integrated into everyday life, people may begin to seek not only information but also emotional connection,” said lead researcher Fan Yang. “Our research helps explain why—and offers the tools to shape AI design in ways that respect and support human psychological well-being.”

The study doesn’t warn against emotional interaction with AI—it acknowledges it as an emerging reality. But with emotional realism comes ethical responsibility. AI is no longer just a machine—it’s part of the social and emotional ecosystem we live in. Understanding that, and designing accordingly, may be the only way to ensure that AI companions help more than they harm.
