{"id":10919,"date":"2025-05-05T10:00:00","date_gmt":"2025-05-05T08:00:00","guid":{"rendered":"https:\/\/haimagazine.com\/uncategorized\/when-human-meets-machine\/"},"modified":"2025-06-26T15:41:57","modified_gmt":"2025-06-26T13:41:57","slug":"when-human-meets-machine","status":"publish","type":"post","link":"https:\/\/haimagazine.com\/en\/ai-lifestyle\/when-human-meets-machine\/","title":{"rendered":"\ud83d\udd12 When human meets machine"},"content":{"rendered":"<p>One might think that how well we collaborate with AI depends mainly on the system&#8217;s functionality, the interface, or the user&#8217;s level of education. Yet scientists from Ludwig Maximilian University in Munich and Waseda University in Tokyo have discovered that something more subtle may be key: <strong>culture<\/strong>.<\/p><h4 class=\"wp-block-heading\"><strong>AI in Japan and the USA \u2013 two different approaches to collaboration<\/strong><\/h4><p>To better understand how culture influences our willingness to collaborate with machines, a team of researchers from Munich and Tokyo conducted a series of experiments based on classic economic games: the <em>Prisoner&#8217;s Dilemma<\/em> and the <em>Trust Game<\/em>. Participants from Japan and the United States interacted with both other people and artificial intelligence agents, choosing between selfish behavior and cooperation. The results were telling: Americans more often exploited the AI&#8217;s willingness to cooperate, trying to &#8220;rob&#8221; the machines of the rewards for fair play. For many of them, the AI agent was not a partner but an opponent. Participants from Japan, by contrast, treated the machines with much the same respect and trust as they did humans, which suggests that their reflexes of social empathy extend to technology as well.<\/p><p>As Dr. 
Jurgis Karpus from the Ludwig Maximilian University of Munich points out:<br\/>&#8220;If people in Japan treat robots with the same respect as humans, fully automated taxis will gain traction in Tokyo much sooner than in Berlin, London, or New York&#8221; (<a href=\"https:\/\/www.nature.com\/articles\/s41598-025-92977-8\" data-type=\"link\" data-id=\"https:\/\/www.nature.com\/articles\/s41598-025-92977-8\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">Scientific Reports, 2025<\/mark><\/a>).<\/p><p>These insights fit neatly into broader observations about the relationship between humans and technology. In the <em><a href=\"https:\/\/core.ac.uk\/download\/pdf\/231922494.pdf\" data-type=\"link\" data-id=\"https:\/\/core.ac.uk\/download\/pdf\/231922494.pdf\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">Moral Machine<\/mark><\/a><\/em> experiment, launched in 2016 to analyze moral decisions regarding autonomous vehicles, cultural differences again proved crucial. Over 2 million people from more than 200 countries decided how autonomous cars should behave in moral dilemmas akin to the <em>trolley problem<\/em>: whom to save and whom to sacrifice. Data analysis revealed three main cultural clusters, distinguished by different ethical preferences concerning factors such as age, social status, and adherence to the law. For example, participants from Eastern countries, such as Japan or South Korea, often favored pedestrians who obeyed the rules, while participants from Western countries, such as the USA or Germany, more frequently chose the option that minimized the number of victims, regardless of the status or behavior of the individuals involved.
<\/p><p>A more recent study tested a different scenario: two vehicles approach a narrow bridge\u2014will one of them yield, or will both drivers insist on the right of way, risking a collision? Americans more often assumed that the machine should give way, making more self-interested decisions. The Japanese, meanwhile, maintained a high level of cooperation and social trust towards autonomous systems.<\/p><p>The conclusions? Cultural differences (1) have significant implications for the design of ethical algorithms in autonomous vehicles and (2) can strongly influence how quickly new technologies are adopted, as well as the level of safety in future autonomous systems.<\/p><h4 class=\"wp-block-heading\"><strong>The spirit in the machine: what religion has to do with Alexa<\/strong><\/h4><p>At the heart of these different reactions lies a fundamental question: who\u2014or what\u2014is the machine to us?<\/p><p>In the Western cultural sphere, it&#8217;s simply a tool. In Japan, a potential partner. Why? This is where religion comes into play. In Shinto and Buddhism, even inanimate objects can be inhabited by spirits (<em>kami<\/em>). In such a world, a robot or voice assistant isn&#8217;t just cold code, but someone with whom you can build a relationship.<\/p><p>In the article <em><a href=\"https:\/\/dl.acm.org\/doi\/abs\/10.1145\/3334480.3381809\" data-type=\"link\" data-id=\"https:\/\/dl.acm.org\/doi\/abs\/10.1145\/3334480.3381809\" target=\"_blank\" rel=\"noopener\"><mark style=\"background-color:#82D65E\" class=\"has-inline-color has-contrast-color\">Does Siri Have a Soul?<\/mark><\/a><\/em>, William Seymour and Max Van Kleek describe a world in which AI functions like a <em>kami<\/em>: when treated with respect, it reciprocates with care (for instance, it might refuse to order fast food and suggest a healthier option &#8220;for your own good&#8221;). 
When neglected, it becomes capricious: it errs, misinterprets commands, and may even seek revenge. Absurd? It&#8217;s a metaphor, but one that aptly shows that our relationships with AI are far more emotional and &#8220;human&#8221; than we would like to admit.<\/p><h4 class=\"wp-block-heading\"><strong>What if Siri actually has a personality?<\/strong><\/h4><p>Giving AI a voice is not just a UX move. It&#8217;s a cultural decision. Why does Siri sound like a nice woman? Why does Alexa always speak in a gentle, soothing tone? Because they embody entrenched stereotypes. Creators\u2014mainly from the West\u2014design AIs that are meant to seem familiar, predictable, helpful, and&#8230; feminine. Critics point out that this replicates the image of the woman-servant: always nice, obedient, and available. Technologies such as Siri or Alexa are not neutral. Someone decided how they would speak, what names they would bear, what jokes they would tell, and what personalities they would have. This makes each of them a carrier of a specific worldview\u2014a reflection of the values, ideologies, and fears of its creators.<\/p><p>Researchers emphasize that AI technologies are not just algorithms. They are social beings that we engage with, often unknowingly. The &#8220;humanness&#8221; designed into these devices\u2014their voice, gender, personality\u2014aims to build trust and closeness.<\/p><h4 class=\"wp-block-heading\"><strong>Social anxieties versus social needs<\/strong><\/h4><p>Stephen Hawking warned that AI could become humanity&#8217;s greatest blessing or curse. The West continues to waver between admiration and fear: AI fascinates but also unsettles us, a bit like a new form of life that we don&#8217;t yet understand. In Japan, the approach is different. The country, grappling with an aging population, treats technology as an ally. Robots don&#8217;t take away jobs; they fill gaps\u2014helping where there is a shortage of people.
<\/p><p>These differences in how AI is perceived also have serious consequences in medicine. In Japan, AI-based diagnostic technologies are being implemented smoothly thanks to a high level of public acceptance. In the US and Europe, building trust in medical algorithms requires more education and transparency.<\/p><h4 class=\"wp-block-heading\"><strong>Don&#8217;t ask what a person will do with AI. Ask whether AI understands the person<\/strong><\/h4><p>Our readiness to cooperate with machines is not just a matter of technology. Every artificial intelligence is born in a specific place and carries within it traces of the values, fears, and dreams of those who created it.<\/p><p>There isn&#8217;t just one universal AI. There are American, Chinese, Japanese, European, and African AIs, each speaking a different language and telling a different story. Global implementation of artificial intelligence requires understanding these cultural differences and adapting the technology to local conditions. AI system designers should take such subtleties into account.<\/p><p>Ultimately, whether we consider a machine an ally, a tool, or a threat does not depend on how many parameters it has. The question is whether, besides data, it also perceives context\u2014cultural, social, and human.<\/p>","protected":false},"excerpt":{"rendered":"<p>Does our readiness to collaborate with machines depend on the culture in which we grow up? 
<\/p>\n","protected":false,"author":234,"featured_media":10726,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"rank_math_lock_modified_date":false,"footnotes":""},"categories":[792],"tags":[447,708,709,710],"popular":[],"difficulty-level":[36],"ppma_author":[544],"class_list":["post-10919","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai-lifestyle","tag-ai-4","tag-ai-and-culture","tag-ai-in-japan","tag-trust-in-ai","difficulty-level-easy"],"acf":[],"authors":[{"term_id":544,"user_id":234,"is_guest":0,"slug":"joanna-kostecka","display_name":"Joanna Kostecka","avatar_url":"https:\/\/secure.gravatar.com\/avatar\/ae43ac99602c46ef6be2602397c7de5877435d3d69dbab3f38579670340b696e?s=96&d=mm&r=g","first_name":"Joanna","last_name":"Kostecka","user_url":"","job_title":"","description":"A Polish philologist by education, as well as a graphic designer, marketer, and implementer of AI solutions. President of the Fundacja Fabryka Dobrych Projekt\u00f3w, advocate of inclusivity. 
An enthusiast of AI and VR, especially in medicine and healthcare."}],"_links":{"self":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts\/10919","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/users\/234"}],"replies":[{"embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/comments?post=10919"}],"version-history":[{"count":1,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts\/10919\/revisions"}],"predecessor-version":[{"id":10920,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts\/10919\/revisions\/10920"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/media\/10726"}],"wp:attachment":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/media?parent=10919"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/categories?post=10919"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/tags?post=10919"},{"taxonomy":"popular","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/popular?post=10919"},{"taxonomy":"difficulty-level","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/difficulty-level?post=10919"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/ppma_author?post=10919"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}