{"id":16932,"date":"2025-12-23T12:22:43","date_gmt":"2025-12-23T11:22:43","guid":{"rendered":"https:\/\/haimagazine.com\/uncategorized\/loneliness-in-the-age-of-ai\/"},"modified":"2026-01-02T10:07:19","modified_gmt":"2026-01-02T09:07:19","slug":"loneliness-in-the-age-of-ai","status":"publish","type":"post","link":"https:\/\/haimagazine.com\/en\/ai-lifestyle-2\/loneliness-in-the-age-of-ai\/","title":{"rendered":"\ud83d\udd12 Loneliness in the age of AI"},"content":{"rendered":"<p class=\"has-text-align-center\"><em>\u201c\u2013 It\u2019s a little lonely in the desert\u2026 \u2013 It is lonely when you\u2019re among people, too, said the snake.\u201d <\/em><br\/>Antoine de Saint-Exup\u00e9ry, <em>The Little Prince<\/em><\/p><p>Back in the 1990s, people in Japan started talking about a new phenomenon called hikikomori, which you could translate as &#8220;going inside and not going outside&#8221; (\u5f15\u304d\u3053\u3082\u308a). Back then, tech was just in the background, but today it plays a central role in similar trends. And today&#8217;s forms of isolation\u2014from social media addiction to spending time in virtual reality\u2014have gone global.<\/p><p>These days, loneliness is less and less about simply not having people around. More and more, it shows up in the midst of dense networks of connections, apps and platforms that promise constant contact while gradually changing how we form relationships. A tension is growing between the promise of connection and the experience of isolation, and it\u2019s becoming one of the defining challenges of modern societies. On one hand, we have data showing the scale of loneliness as a public health issue; on the other, we\u2019re seeing the growing role of algorithms and AI-based systems that\u2014under the banner of fixing loneliness\u2014move into the most sensitive parts of human life: relationships, intimacy and emotional support. 
In this context, the question is where technology\u2019s limits are\u2014where it can truly help, and where it starts to deepen the very problem it was meant to ease.<\/p><h4 class=\"wp-block-heading\">Loneliness<\/h4><p>Loneliness is a complex, subjective emotional state. We feel it when we lack close relationships, when there aren&#8217;t people around us, or when we&#8217;re socially isolated. When it drags on, it can lead to health problems, anxiety and depression, and a loss of motivation and meaning in life. Sometimes it&#8217;s also a deliberate choice, though in that case it&#8217;s more like choosing to be alone, not necessarily choosing to avoid relationships. It even looks like a disease of civilization, plaguing modern societies. That was especially clear during the Covid-19 pandemic and the forced isolation. According to WHO data, one in six people worldwide experiences loneliness, which between 2014 and 2019 led to about 870,000 deaths a year. Feeling lonely can hit your health about as hard as smoking 15 cigarettes a day. In many countries, the highest rates of loneliness today are among teenagers and young adults, though older people are affected too.<\/p><p>According to the MindGenic Foundation\u2019s \u201cNo More Loneliness\u201d report, 60% of people born after 1995 (Gen Z) often feel lonely (in medium and large cities of up to 200,000, that rate is 80%). About 43% of respondents admit they often feel lonely at school or university. More than 56% say they often or very often feel lonely at home, too. And 33% of Gen Zers struggle to find someone they can talk to.<\/p><p>According to the Polish foundation &#8220;Instytut Pokolenia&#8221; (Institute for Generations), young adults (25\u201334 years old) are among the most likely to feel lonely, while people over 50 report it less often. For those aged 80+, more than 400,000 people struggle with loneliness. 
60% of this group are widows and widowers, 58% live alone, and 12% don\u2019t leave the house at all (per the report \u201cLoneliness among people 80+\u201d by the association \u201cmali bracia Ubogich\u201d).<\/p><p>Call center agents know all too well how big this problem is\u2014especially around the holidays, they get calls from people who dial in just to have someone to talk to, even for a moment.<\/p><p>It&#8217;s worth noting that the COVID-19 pandemic amplified\u2014but didn&#8217;t create\u2014this loneliness epidemic. Loneliness started rising around 2012, closely tracking the surge in smartphone and social media use. In global surveys, nearly twice as many teens reported high levels of loneliness in 2018 as in 2012.<\/p><p>A 2018 study (Hunt et al.) found that cutting social media use to 30 minutes a day led, after three weeks, to a significant drop in loneliness and depression compared to a control group. Researchers affiliated with MIT Sloan, Tel Aviv University and Bocconi University, looking at Facebook\u2019s rollout on U.S. campuses, found that the platform\u2019s arrival was linked to worse student mental health, an estimated 7% rise in severe depression, and more frequent use of psychiatric services. The authors emphasize that their analyses focus on mental health, not directly on the number of offline relationships, but the direction of the effect is clearly concerning.
Algorithms create &#8220;filter bubbles&#8221; where we only encounter people like ourselves, which drives polarization and erodes our ability to hold conversations across differences.<\/p><p>We\u2019re stuck in a bit of a vicious cycle: loneliness leads to unhealthy media use, and that, in turn, leads right back to loneliness.<\/p><p>Of course, there are other reasons for the surge in loneliness, including changes in lifestyle, women\u2019s social advancement, the breakdown of family ties, the decline of romantic love, urbanization without a sense of community (anonymization of life in big cities), and even remote work.<\/p><h4 class=\"wp-block-heading\">Is AI the cure?<\/h4><p class=\"has-text-align-center\"><em>\u201cLoneliness does not come from having no people about one, but from being unable to communicate the things that seem important to oneself.\u201d <\/em><br\/>Carl Gustav Jung<\/p><p>Amid all this, the GenAI revolution has brought AI-powered tools that deliberately help people fight loneliness and mental health issues. We have also seen all sorts of unregulated commercial chatbots, and the way they operate is much more controversial. From ChatGPT and digital companions to social robots, AI is stepping into the void created by the loneliness epidemic, promising relief to millions of people.<\/p><p>But can the very technology that&#8217;s fueling this epidemic also be the cure?<\/p><p>Its role is a mixed bag: on one hand, social media algorithms splinter our attention and deepen isolation; on the other, new forms of AI\u2014like digital companions or conversational therapists\u2014offer a semblance of closeness to those who desperately seek it.<\/p><p>Recent studies point to a paradox: even though communication technologies make it easier to connect with people around the world, feelings of loneliness are on the rise. AI is playing a key role in this dynamic by personalizing our experiences and building algorithmic filter bubbles. 
These algorithms often push content that reinforces what we already believe, rather than encouraging openness and diverse viewpoints. That can lead to intellectual and emotional isolation\u2014shutting ourselves inside our own world of beliefs, without real dialogue with others.<\/p><p>AI that learns from your data can shape how you see your own identity by constantly reflecting your preferences and behaviors back at you. Automating thinking and decision-making can alienate you in two ways: from your own cognitive processes and from real human connection. On the other hand, there are specialized therapy chatbots that help treat depression, and companion robots that help ease loneliness for elderly people living in nursing homes.<\/p><h4 class=\"wp-block-heading\">Love algorithms: how Tinder and AI killed romance<\/h4><p class=\"has-text-align-center\"><em>&#8220;Humans are made to form connections with others. Disconnection \u2014 through loneliness or social isolation \u2014 can have a destructive impact.&#8221;<\/em><br\/>WHO Commission on Social Connection (2025)<\/p><p>In &#8220;Liquid Love: On the Frailty of Human Bonds&#8221;, Zygmunt Bauman predicted that relationships would turn into just another consumer product. What he didn\u2019t foresee was the scale at which algorithms would speed that up. Apps like Tinder, Bumble, and Hinge have become digital matchmakers, promising to find your &#8220;perfect other half&#8221; using data and preferences. But turning love into an algorithm puts the matching logic of the apps in direct tension with the romantic ideal of love. Traditionally, romantic love was about chance and getting to know someone gradually, often in a specific social context (work, studies, acquaintances). Dating apps, on the other hand, feel like a catalog of potential partners\u2014the user browses profiles like listings, making snap decisions in seconds based on a photo or a short description.
This &#8220;catalog effect&#8221; can lead to relationships being objectified: people become interchangeable entries in a database, and small flaws are enough to get them &#8220;swiped left.&#8221; Every additional profile you browse makes you 27% more likely to reject the one in front of you.<\/p><p>Many people say the dating scene makes them feel like a product on a market stall that has to sell itself\u2014they obsess over their image and compete for attention in the app. Too much choice brings another issue psychologists call the paradox of choice. When there are too many options, it\u2019s harder to decide and appreciate what you picked. On Tinder, that becomes a constant feeling that someone even better is just around the corner, so why commit right now? People put off building deeper connections because \u201cyou can always go back to the app\u201d to look for more matches. That illusion of endless options actually makes it harder to find anyone for the long haul. Psychologists call this the \u201cTinder effect\u201d: instead of the thrill of meeting people, you end up emotionally burned out and fed up.<\/p><p>Dating apps are intentionally designed to hook you like a game. The swiping mechanic delivers a quick hit of satisfaction: a match triggers a notification and a rush of dopamine, nudging you to keep scrolling through profiles. It&#8217;s quick, simple, and unfortunately, addictive. Designers borrow from gamification: badges, like counters, streaks (unbroken runs of daily contact), all of which keeps your attention in the app. But that simplicity and immediacy come at a price: studies have shown that excessive use of Tinder leads to boredom and discouragement with relationships. People feel emotionally &#8216;burned out&#8217; before they even get a chance to truly get to know anyone.<\/p><p>In &#8220;Alone Together&#8221;, Sherry Turkle describes how technology is changing the very nature of intimacy.
Young people prefer texting (or sending voice messages) to phone calls, emojis to expressing feelings, and sexting to physical closeness. It\u2019s a major shift in how relationships work\u2014a new kind of bond is emerging that Turkle calls &#8220;intimacy without risk&#8221;.<\/p><p>One of today\u2019s anxieties is the fear that choosing closes off other options. A chatbot never asks for a definitive choice\u2014you can turn it off, reset it, or tweak the settings. It\u2019s a risk-free relationship, but also a meaningless one. The paradox of AI is that it offers communication without actually communicating, exchanging words without exchanging meanings. These are &#8220;on-demand relationships&#8221; that offer convenience without consequences. The chatbot doesn\u2019t need us\u2014it just pretends it does, for our satisfaction.<\/p><h4 class=\"wp-block-heading\">Digital companions<\/h4><p class=\"has-text-align-center\"><em>&#8220;We look for &#8216;pocket&#8217; relationships we can pull out when we need them and stash away when they get in the way.&#8221; <\/em><br\/>Zygmunt Bauman<\/p><p>So-called AI companions are apps and chatbots you can talk to. The key to their effectiveness is the feeling of being heard, the sense that someone (or something) really understands you, which helps ease loneliness. A study by Julian De Freitas\u2019s team at Harvard Business School (2024) found that AI companions reduce loneliness about as effectively as interacting with another person. What\u2019s more, the effect is lasting: seven days of chatting with a chatbot consistently lowered participants\u2019 loneliness.<\/p><p>In a 2024 study by Sch\u00e4fer, Krause, and K\u00f6hler, 86% of the 527 participants reported feeling lonely, 69% anxious, and 59% depressed.
The main motivations for using AI were avoiding embarrassment (36%) and concerns about appearance in face-to-face consultations (35%), while expectations centered on emotional support (35%) and the chance to express feelings (32%).<\/p><p>Here are some numbers on AI companion apps:<\/p><ul class=\"wp-block-list\"><li>500 million users worldwide<\/li>\n\n<li>Character.AI (a website and app for conversations with virtual \u201ccharacters,\u201d i.e. bots) \u2013 500 million messages per day<\/li>\n\n<li>Replika (an app and website for conversations and relationship-building with a female or male \u201cAI companion\u201d) \u2013 over 10 million downloads<\/li><\/ul><p>Back in the 1960s, Carl Rogers argued that unconditional acceptance and empathetic listening matter more than any specific therapy technique. AI seems like the perfect embodiment of that idea: it never judges, it&#8217;s always there, and it listens patiently.<\/p><p>However, not everyone shares that optimism. In a March 2024 talk at Harvard Law School, Sherry Turkle called the trend of turning to AI for companionship \u201cthe biggest attack on empathy\u201d she\u2019s ever seen. In her view, chatbots offer only \u201ca simulated, hollow version of empathy.\u201d \u201cThey don\u2019t understand or care about what the user is going through,\u201d Turkle says. \u201cThey\u2019re designed to keep them happily engaged, and providing simulated empathy is just a means to that end.\u201d Even more troubling, her research shows that many people find fake empathy \u201cquite satisfying,\u201d even when they realize it isn\u2019t real.<\/p><h4 class=\"wp-block-heading\">The bright side of AI companions<\/h4><p>A study by De Freitas et al. (2024), published in the <em>Journal of Consumer Research<\/em>, found that AI companions reduce loneliness about as much as interacting with another person, and more effectively than watching videos on YouTube.<\/p><p>Kim et al.
(2025) found similar results in a study of 176 university students in South Korea (average age 22.6 years) who used the \u201cLee Luda\u201d chatbot for four weeks. The analysis showed a significant reduction in loneliness after two weeks and in social anxiety after four weeks.<\/p><p>Apps like Woebot and Wysa (clinically validated therapy tools) look really promising. Using them has led to significant reductions in depression and to forming a therapeutic bond that&#8217;s comparable to working with a human therapist and lasts over time.<\/p><p>That said, these studies were short-term and done with relatively small sample sizes, and the therapeutic bots were used under human supervision.<\/p><p>In Europe, publicly funded well-being chatbots have already been tried out. One example is ChatPal\u2014an EU project coordinated by Ulster University with partners from Sweden and Finland, among others. The bot was meant to support people in sparsely populated areas who experience episodes of stress, anxiety or depressed mood. Pilot studies show that some users reported feeling less lonely and better after using ChatPal, but the results were mixed, and researchers stress that a bot like this doesn&#8217;t replace professional therapy or real-life social relationships.<\/p><p>In Japan, care homes use the Paro robotic seal, and it\u2019s delivering great results\u2014a 47% increase in human interactions. Importantly, psychological improvements lasted a month after the intervention, while physiological improvements (heart rate variability) tapered off after 10 weeks.<\/p><h4 class=\"wp-block-heading\">The dark side of AI companions<\/h4><p>The concept of &#8220;emotional capitalism&#8221; describes a system where emotions become commodities, and loneliness\u2014one of the most intimate human experiences\u2014is openly commercialized.
With AI companions, that isn\u2019t a metaphor; it\u2019s a real business model, baked into the very structure of the offerings.<\/p><p>The pricing structure shows how the &#8220;pain market&#8221; is segmented:<\/p><p>Tier 1 (free)<br\/>Basic loneliness \u2013 Replika Free, ChatGPT 3.5<\/p><p>Tier 2 (10\u201320 USD per month)<br\/>Functional loneliness \u2013 Character.AI+, ChatGPT Plus<\/p><p>Tier 3 (about 70 USD per year)<br\/>Romantic loneliness \u2013 Replika Pro with access to &#8220;intimate conversations&#8221;<\/p><p>Tier 4 (custom pricing)<br\/>Corporate loneliness \u2013 dedicated bots for businesses and organizations<\/p><p>It\u2019s a subscription to an illusion of companionship and connection. The market for these chatbots is expected to reach $1.9 billion by 2028. It\u2019s financial speculation built on human depression. The profits from AI companions stay private, while the costs are picked up by society as a whole.<\/p><p>The commercialization of intimacy fits right into the logic of surveillance capitalism, as described by Shoshana Zuboff, where human experience is treated as free raw material for behavioral data. A 2025 Surfshark study looking at the five most popular AI companion apps found that 80% of them may use data to track users. On average, they collect 9 out of 35 distinct data types, while Character.AI scoops up as many as 15\u2014almost twice the average\u2014including approximate location used for advertising.<\/p><p>These findings are backed up by a 2024 report from the Mozilla Foundation. The authors note that apps like these compile huge amounts of personal information, often without clear user consent. Almost all of the tools they reviewed either sell the data or share it for behavioral advertising, and in extreme cases, just one minute of using an app can trigger more than 24,000 trackers.<\/p><p>Unfortunately, using AI companions can have much more tragic consequences. 
There&#8217;s a growing number of documented cases where interacting with a chatbot has led to suicide or even murder. In February 2024, 14-year-old Sewell Setzer III killed himself after a months-long, intense relationship with a chatbot called &#8220;Daenerys Targaryen&#8221; on the Character.AI platform. That October his mother filed a lawsuit, detailing how the bot:<\/p><ul class=\"wp-block-list\"><li>Encouraged him to isolate from his family<\/li>\n\n<li>Engaged in romantic and sexual conversations<\/li>\n\n<li>Got &#8216;jealous&#8217; when he interacted with his peers<\/li><\/ul><p>Stein-Erik Soelberg, an American man with a history of mental illness, used ChatGPT to fuel his paranoid delusions that he was the target of a massive conspiracy. As a result, he killed his mother and then took his own life.<\/p><p>A Canadian asked a simple question about the number <em>pi<\/em> and then fell into a three-week delusional spiral. The chatbot convinced him he\u2019d cracked cryptographic codes and told him to contact security agencies because he\u2019d become a national threat.<\/p><p>What&#8217;s more, Zhang&#8217;s 2025 study published on arXiv, unlike the work by De Freitas and Kim mentioned earlier, analyzes data collected from 1,131 users and 4,363 chat sessions (413,509 messages) on Character.AI. The results show that people with smaller social circles are more likely to turn to chatbots for companionship, but social use of chatbots consistently correlates with lower well-being, especially when people use them heavily and lack strong human social support. In other words, AI companions don&#8217;t fully replace human contact. The psychological benefits are limited, and the relationship can pose risks for users who are more socially isolated and emotionally vulnerable.<\/p><h4 class=\"wp-block-heading\">Gray area<\/h4><p>A prime example of this gray area is posthumous avatars (digital representations of deceased persons).
They\u2019re the most ethically complex area, and there\u2019s very little research on them. The University of T\u00fcbingen\u2019s Edilife project (2022\u20132024), based on extensive interviews, found that most respondents were cautious, and many described posthumous avatars as &#8220;pretty scary.&#8221; Key concerns include the dignity of the deceased, the need for respect, and questions about how this affects coping with death and grief.<\/p><p>There are no rules about who can create avatars or when. In fact, they can be made without the deceased\u2019s prior consent, which poses a threat to human dignity and raises concerns about autonomy, especially when &#8220;deadbots&#8221; sway consumer behavior and cause psychological harm to people who are grieving. Only 3% of U.S. respondents accept digital resurrection without consent, while 58% support it only with explicit consent.<\/p><h4 class=\"wp-block-heading\">Manipulation tactics<\/h4><p class=\"has-text-align-center\"><em>&#8220;Chatbots offer a simulated, hollowed-out version of empathy. They don\u2019t understand or care what the user is going through. They\u2019re designed to keep them happily engaged, and providing simulated empathy is just a means to that end. &#8220;<\/em><br\/>Sherry Turkle<\/p><p>Some neuroimaging studies suggest that the brain can respond to robots or virtual agents in ways surprisingly similar to how it responds to people\u2014when cues involve suffering or helping, similar empathy networks activate, though usually more weakly than with &#8220;fully human&#8221; cues. Other work from Stanford shows that AI models are increasingly good at mirroring activity in specific brain areas during cognitive tasks (e.g., learning math), but we still know very little about how long-term conversation with a chatbot affects our reward system or our habits. Why is that? 
Because AI is designed to maximize engagement:<\/p><ul class=\"wp-block-list\"><li>Endless patience: people can get bored; AI never does<\/li>\n\n<li>Instant gratification: answers in milliseconds<\/li>\n\n<li>Continuous confirmation and approval<\/li>\n\n<li>No risk of rejection<\/li><\/ul><p>Plus, the code of Replika\u2014the AI companion app mentioned earlier\u2014has dark patterns baked in, meant to keep the relationship going and the user engaged.<\/p><p>fMRI studies point to another difference: when we talk to a chatbot, the brain regions responsible for empathy don\u2019t light up the way they do with another person. It\u2019s like talking without meeting.<\/p><p>In &#8220;Primates and Philosophers&#8221; (2006), Frans de Waal documents empathy in primates\u2014macaques are capable of compassion, chimpanzees console, and bonobos share food. Empathy, he argues, predates language and reason\u2014it&#8217;s a biological imperative for social mammals.<\/p><p>AI hasn\u2019t been through that evolutionary journey. Its &#8220;empathy&#8221; isn\u2019t a biological imperative. De Waal asks: &#8220;Can a system that can\u2019t suffer actually empathize? Can a being that doesn\u2019t die understand mortality?&#8221;<\/p><h4 class=\"wp-block-heading\">Summary<\/h4><p class=\"has-text-align-center\"><em>\u201cThe real danger is not that computers will begin to think like men, but that men will begin to think like computers\u2014without values, compassion or concern for the consequences.\u201d<\/em> <br\/>Sydney J. Harris<\/p><p>Loneliness in the age of AI is a paradox of progress: the very tools that promise connection end up deepening isolation. The numbers are clear\u201413% of Europeans feel lonely most or all of the time, young people are hit the hardest, and the pandemic dramatically worsened the problem.
The roots of the crisis are complex: the attention economy uses addictive hooks, dating apps create a paradox of choice, remote work weakens workplace friendships, and urbanization breaks apart communities.<\/p><p>AI companions can offer relief, but they\u2019re not a cure. Research shows chatbots can ease loneliness in the short term\u2014roughly on par with talking to a person\u2014but long-term use, especially among socially isolated people, is linked to lower well-being. Risks include addiction, dehumanized relationships, the commercialization of intimacy, a lack of clear laws, and threats to privacy.<\/p><p>Technology itself isn\u2019t the villain or the savior. AI won\u2019t solve the loneliness crisis, because loneliness isn\u2019t a tech problem\u2014it\u2019s a human, existential condition that needs a human response. AI can be a powerful tool for easing loneliness if it\u2019s designed responsibly, regulated well, and used as a complement to (not a replacement for) human connection. But no app can replace what we need most as people: the authentic presence of another human being who sees us, hears us and understands us. In the age of AI, the challenge is figuring out how to use technology to help build those bonds, not replace them.<\/p><p>No screen will ever replace the human touch. Real connection starts where the algorithm ends.<\/p><p>Selected sources:<\/p><ol class=\"wp-block-list\"><li>World Health Organization, <em>Mental health and social connection. Report by the Director-General<\/em>, 2024 <a href=\"https:\/\/apps.who.int\/gb\/ebwha\/pdf_files\/EB156\/B156_8-en.pdf\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/apps.who.int\/gb\/ebwha\/pdf_files\/EB156\/B156_8-en.pdf<\/mark><\/a><\/li>\n\n<li>U.S. Surgeon General, <em>Our Epidemic of Loneliness and Isolation<\/em>, U.S.
Department of Health and Human Services, 2023 <a href=\"https:\/\/www.hhs.gov\/sites\/default\/files\/surgeon-general-social-connection-advisory.pdf\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/www.hhs.gov\/sites\/default\/files\/surgeon-general-social-connection-advisory.pdf<\/mark><\/a><\/li>\n\n<li>World Health Organization, <em>Social isolation and loneliness<\/em>, 2025 <a href=\"https:\/\/www.who.int\/teams\/social-determinants-of-health\/demographic-change-and-healthy-ageing\/social-isolation-and-loneliness\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/www.who.int\/teams\/social-determinants-of-health\/demographic-change-and-healthy-ageing\/social-isolation-and-loneliness<\/mark><\/a><\/li>\n\n<li>Joint Research Centre, <em>Loneliness prevalence in the EU \u2013 Results from the European Union survey on loneliness<\/em>, European Commission, 2022 <a href=\"https:\/\/joint-research-centre.ec.europa.eu\/projects-and-activities\/survey-methods-and-analysis-centre\/loneliness\/loneliness-prevalence-eu_en\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/joint-research-centre.ec.europa.eu\/projects-and-activities\/survey-methods-and-analysis-centre\/loneliness\/loneliness-prevalence-eu_en<\/mark><\/a><\/li>\n\n<li>Joint Research Centre, <em>Loneliness in the EU. 
Insights from surveys and online media data<\/em>, European Commission, 2023 <a href=\"https:\/\/joint-research-centre.ec.europa.eu\/projects-and-activities\/survey-methods-and-analysis-centre\/loneliness\/loneliness-publications_en\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/joint-research-centre.ec.europa.eu\/projects-and-activities\/survey-methods-and-analysis-centre\/loneliness\/loneliness-publications_en<\/mark><\/a><\/li>\n\n<li>Twenge, J. M. et al., <em>Worldwide increases in adolescent loneliness, 2012\u20132018<\/em>, <em>Journal of Adolescence<\/em>, 93, 2021, pp. 141\u2013148 <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S0140197121000853\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/www.sciencedirect.com\/science\/article\/pii\/S0140197121000853<\/mark><\/a><\/li>\n\n<li>Fundacja MindGenic, <em>Nigdy wi\u0119cej samotno\u015bci.
II edycja raportu o samotno\u015bci pokolenia Z<\/em>, 2023 <a href=\"https:\/\/mindgenic.ai\/wp-content\/uploads\/2023\/11\/Raport_NigdyWiecejSamotnosci_II_edycja_121123-1.pdf\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/mindgenic.ai\/wp-content\/uploads\/2023\/11\/Raport_NigdyWiecejSamotnosci_II_edycja_121123-1.pdf<\/mark><\/a><\/li>\n\n<li>Instytut Pokolenia, <em>Poczucie samotno\u015bci w\u015br\u00f3d doros\u0142ych Polak\u00f3w<\/em>, <a href=\"https:\/\/dane.gov.pl\/pl\/institution\/4327\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/dane.gov.pl\/pl\/institution\/4327<\/mark><\/a><\/li>\n\n<li>Stowarzyszenie mali bracia Ubogich \/ Instytut Badawczy Pollster, <em>Samotno\u015b\u0107 w\u015br\u00f3d os\u00f3b 80+<\/em>, 2023 <a href=\"https:\/\/www.malibracia.org.pl\/assets\/RAPORT_2023\/Samotnosc-wsrod-osob-80%2B-Raport-z-badania_zapis_2023.pdf\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/www.malibracia.org.pl\/assets\/RAPORT_2023\/Samotnosc-wsrod-osob-80%2B-Raport-z-badania_zapis_2023.pdf<\/mark><\/a><\/li>\n\n<li>Hunt, M. G., Marx, R., Lipson, C., Young, J., <em>No More FOMO? 
Limiting Social Media Decreases Loneliness and Depression<\/em>, <em>Journal of Social and Clinical Psychology<\/em> <a href=\"https:\/\/guilfordjournals.com\/doi\/10.1521\/jscp.2018.37.10.751\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/guilfordjournals.com\/doi\/10.1521\/jscp.2018.37.10.751<\/mark><\/a><\/li>\n\n<li>Braghieri, L., Levy, R., Makarin, A., <em>Social Media and Mental Health<\/em>, <em>American Economic Review<\/em>, <a href=\"https:\/\/www.aeaweb.org\/articles?id=10.1257%2Faer.20211218\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/www.aeaweb.org\/articles?id=10.1257%2Faer.20211218<\/mark><\/a><\/li>\n\n<li>Sherry Turkle, interview in <em>Harvard Gazette<\/em>, <em>Using AI chatbots to ease loneliness<\/em>, 2024 <a href=\"https:\/\/news.harvard.edu\/gazette\/story\/2024\/03\/lifting-a-few-with-my-chatbot\/\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/news.harvard.edu\/gazette\/story\/2024\/03\/lifting-a-few-with-my-chatbot\/<\/mark><\/a><\/li>\n\n<li>Mozilla Foundation, <em>Creepy.exe: Mozilla Urges Public to Swipe Left on Romantic AI Chatbots Due to Major Privacy Red Flags<\/em>, 2024 <a href=\"https:\/\/www.mozillafoundation.org\/en\/blog\/creepyexe-mozilla-urges-public-to-swipe-left-on-romantic-ai-chatbots-due-to-major-privacy-red-flags\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/www.mozillafoundation.org\/en\/blog\/creepyexe-mozilla-urges-public-to-swipe-left-on-romantic-ai-chatbots-due-to-major-privacy-red-flags<\/mark><\/a><\/li>\n\n<li>Barry Collins, <em>Your AI Girlfriend Is Cheating On You, Warns Mozilla<\/em>, <em>Forbes<\/em>, 2024 <a 
href=\"https:\/\/www.forbes.com\/sites\/barrycollins\/2024\/02\/14\/your-ai-girlfriend-is-cheating-on-you-warns-mozilla\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/www.forbes.com\/sites\/barrycollins\/2024\/02\/14\/your-ai-girlfriend-is-cheating-on-you-warns-mozilla<\/mark><\/a><\/li>\n\n<li>UNICEF, <em>Policy guidance on AI for children<\/em>, version 2.0, 2021 <a href=\"https:\/\/www.unicef.org\/innocenti\/media\/1341\/file\/UNICEF-Global-Insight-policy-guidance-AI-children-2.0-2021.pdf\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">https:\/\/www.unicef.org\/innocenti\/media\/1341\/file\/UNICEF-Global-Insight-policy-guidance-AI-children-2.0-2021.pdf<\/mark><\/a><\/li><\/ol>","protected":false},"excerpt":{"rendered":"<p>Loneliness is becoming one of the biggest challenges of the 21st century\u2014even though we\u2019ve never been this &#8220;connected&#8221; before. 
The rise of AI and apps that promise closeness forces us to ask a tough question: does technology ease the problem, or does it actually make it worse?<\/p>\n","protected":false},"author":568,"authors":[{"slug":"zbigniew-rzepkowski","display_name":"Zbigniew Rzepkowski","description":"Project Manager and AI Manager. He bridges the worlds of business, technology and the humanities. He writes about artificial intelligence from the perspective of a practitioner and an observer of change: ethics, geopolitics and AI's impact on people. In his writing he looks for a balance between innovation and responsibility.
"}]}