{"id":18373,"date":"2026-04-30T12:23:47","date_gmt":"2026-04-30T10:23:47","guid":{"rendered":"https:\/\/haimagazine.com\/uncategorized\/whose-is-your-twin\/"},"modified":"2026-05-08T13:12:45","modified_gmt":"2026-05-08T11:12:45","slug":"whose-is-your-twin","status":"publish","type":"post","link":"https:\/\/haimagazine.com\/en\/hai-premium-2\/whose-is-your-twin\/","title":{"rendered":"\ud83d\udd12 Whose is your twin?"},"content":{"rendered":"<p>After a few minutes, the system shows an elevated risk of a cardiovascular event in the coming years. The doctor recommends treatment and a lifestyle change. You leave the doctor&#8217;s office with a prescription and a question: who actually made this decision?<\/p><p>This is no longer pure science fiction. Natalia Trayanova&#8217;s team at Johns Hopkins University has been developing cardiac digital twins for years, used to simulate procedures and support ablation planning. In the literature, such models are described as tools for predicting disease progression and testing intervention options before applying them to a specific patient.<\/p><p>At the same time, the very concept of the medical digital twin is evolving. In an article published in The Lancet Digital Health, five core components of such a system were proposed: the patient, a data connection, an &#8220;in silico&#8221; patient model, an interface, and a mechanism for synchronizing the twin with real-world data. This is important because it shows that it&#8217;s not about a single algorithm, but rather an entire technological ecosystem.<\/p><p>It sounds promising. More personalized medicine, better prognoses, fewer decisions based solely on population-level averages. The problem starts when we ask not only about effectiveness, but also about control over the model. Who owns your digital twin?<\/p><p>In the United States, the issue of medical data ownership has long been complex. 
Put simply, the patient has broad rights to access and control the use of information, but the physical or institutional medium of the medical record often remains under the control of the provider or the healthcare system. This distinction matters. If a digital twin model is created from data derived from multiple sources, the question of ownership and control becomes even more difficult.<\/p><p>And this is exactly where a legal gap emerges. In ethical and legal analyses concerning digital twins, authors point out that legal precedents remain underdeveloped, and issues of ownership, control and benefit-sharing are, in practice, far from resolved. Proposals are emerging\u2014data trusts, patient data cooperatives, benefit-sharing mechanisms\u2014but they don&#8217;t yet constitute a uniform, widely implemented regulatory system.<\/p><p>And now the less pleasant part. Let\u2019s imagine that other institutions want to make use of the predictions generated by such a model. An insurer sees a long-term risk of type 2 diabetes and raises the premium. An employer receives a &#8220;risk profile&#8221; concerning mental health or burnout and starts treating that employee more cautiously in personnel decisions\u2014which is, de facto, discrimination. In litigation, one party tries to use a health simulation as evidence of predicted life expectancy. Some of these scenarios sound extreme today. But they\u2019re not absurd. They follow from the logic of predictive profiling. In Poland, you can already get a discount on insurance policies for sharing real-time data about your driving style, which makes you wonder when analogous surcharges will start to be applied.<\/p><p>In a similar context, it&#8217;s worth remembering the Cambridge Analytica scandal. It revealed the extent of abuses related to data collection and political microtargeting. 
However, it didn&#8217;t conclusively establish how effective psychographic influence on voting behavior actually was, as this remains the subject of ongoing academic debate. As a warning about profiling infrastructure, this example remains highly relevant. A medical digital twin would, in this regard, be a source of data far deeper than social media click history.<\/p><p>There&#8217;s also the matter of liability. Who is liable if a physician uses the recommendations of a system based on a digital twin and the patient is harmed? The physician, the software manufacturer, the hospital, the data integrator? Bioethical and legal literature has for years indicated that the attribution of responsibility (especially legal responsibility) remains an open question and is becoming increasingly ambiguous as systems grow more complex. This isn&#8217;t a technical detail, but the core of trust in medicine.<\/p><p>The European Union is trying to keep up. The AI Act classifies many AI systems used in healthcare as high-risk systems and imposes requirements regarding risk management, data quality, documentation, human oversight and transparency toward users. This is a real step forward. At the same time, the AI Act is a horizontal regulation. It doesn&#8217;t address in detail all the issues specific to medical digital twins, especially where they combine clinical, behavioral and environmental data into a single predictive model.<\/p><p>What do people think? In a study published in npj Digital Medicine, the majority of Swiss respondents expressed interest in the use of digital twins in medicine, but at the same time strongly opposed mandating their use. In other words, acceptance of the benefits doesn&#8217;t mean agreeing to coercion. Patients want effective medicine, but they also want to retain control.<\/p><p>The crux of the problem is simple. Your digital twin is not you. 
It\u2019s a model \u2014 useful, sometimes very accurate, sometimes fallible \u2014 built on data and assumptions. But decisions based on it will affect you, not the model. If the model gets it wrong, if it overestimates risk, if the biases embedded in its data disadvantage a particular group of patients, a person will bear the consequences.<\/p><p>So the most important question today is not: can we build ever better digital twins? We can, and we will do so increasingly efficiently. The question is: who controls their use, who bears liability for damages, and what rights does the patient retain with respect to a model built from their life?<\/p><p>The answers to these questions will determine whether the digital twin becomes a tool of precision medicine or yet another mechanism of predictive control.<\/p>","protected":false},"excerpt":{"rendered":"<p>Imagine a scene from the near future. You go to the doctor with chest pain. Instead of immediately referring you for additional tests, the doctor runs a simulation on your \u201cdigital twin\u201d \u2014 a virtual model of your body powered by data from medical records, imaging studies, laboratory results, genetic information and wearable devices.<\/p>\n","protected":false},"author":247,"featured_media":18301,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"rank_math_lock_modified_date":false,"footnotes":""},"categories":[796,803,805],"tags":[],"popular":[],"difficulty-level":[38],"ppma_author":[614],"class_list":["post-18373","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-hai-premium-2","category-it-and-technology","category-law-and-ethics","difficulty-level-medium"],"acf":[],"authors":[{"term_id":614,"user_id":247,"is_guest":0,"slug":"prof-dr-hab-dariusz-jemielniak","display_name":"prof. dr hab. 
Dariusz Jemielniak","avatar_url":{"url":"https:\/\/haimagazine.com\/wp-content\/uploads\/2025\/03\/maxresdefault-1-e1742292469999.jpg","url2x":"https:\/\/haimagazine.com\/wp-content\/uploads\/2025\/03\/maxresdefault-1-e1742292469999.jpg"},"first_name":"Dariusz","last_name":"Jemielniak","user_url":"","job_title":"","description":"Professor of management at Kozminski University, where he heads the MINDS (Management in Networked and Digital Societies) department. He also works as a faculty associate at the Berkman-Klein Center for Internet and Society at Harvard. Vice President of the Polish Academy of Sciences. Member of the CampusAI Program Board."}],"_links":{"self":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts\/18373","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/users\/247"}],"replies":[{"embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/comments?post=18373"}],"version-history":[{"count":1,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts\/18373\/revisions"}],"predecessor-version":[{"id":18374,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts\/18373\/revisions\/18374"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/media\/18301"}],"wp:attachment":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/media?parent=18373"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/categories?post=18373"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/tags?post=18373"},{"taxonomy":"popular","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/popular?post=18373"},{"taxonomy":"difficulty-level","embeddab
le":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/difficulty-level?post=18373"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/ppma_author?post=18373"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}