{"id":16417,"date":"2025-11-25T17:17:26","date_gmt":"2025-11-25T16:17:26","guid":{"rendered":"https:\/\/haimagazine.com\/uncategorized\/the-faceless-minister-a-political-fantasy-with-ai-in-the-lead-role\/"},"modified":"2025-11-28T11:27:54","modified_gmt":"2025-11-28T10:27:54","slug":"the-faceless-minister-a-political-fantasy-with-ai-in-the-lead-role","status":"publish","type":"post","link":"https:\/\/haimagazine.com\/en\/law-and-ethics\/the-faceless-minister-a-political-fantasy-with-ai-in-the-lead-role\/","title":{"rendered":"\ud83d\udd12 The faceless minister. A political fantasy with AI in the lead role"},"content":{"rendered":"<p>Diella was already a familiar face to the citizens of Albania as the virtual assistant on the e-Albania platform. She answered questions about documents, appointments and forms \u2014 patiently, without any emotion and without coffee breaks. She became a symbol of the digital modernization of administration.<\/p><p>On September 11, 2025, Prime Minister Edi Rama announced that Diella would take up the role of Minister of State for Artificial Intelligence. The decision was unprecedented: for the first time, an artificial intelligence system was given the status of a government member. Artificial intelligence was to be a symbol of the honesty that politicians lacked. Rama argued that Diella would put an end to bribes and informal arrangements in public tenders.<\/p><p>However, the next day, President Bajram Begaj signed a decree that didn\u2019t give Diella any real powers. All the authority and responsibility stayed with the PM. So the algorithm had a title, but not a portfolio of powers.<\/p><p>It\u2019s worth mentioning that a month later, Albania&#8217;s PM announced that Diella was &#8220;pregnant&#8221;: her &#8220;children&#8221; being 83 AI assistants who would participate in parliamentary sessions. 
They&#8217;re supposed to observe the proceedings and advise lawmakers on how to respond to specific legislative proposals. Diella herself said in one of her speeches that her &#8220;children&#8221; are not intended to replace officials but to &#8220;enhance their capacity to serve&#8221; and help with repetitive tasks.<\/p><p>This shows how the role of AI within Albania&#8217;s authorities is evolving \u2014 it&#8217;s no longer about taking action, but about giving advice based on collected data.<\/p><h4 class=\"wp-block-heading\"><strong>Algorithms don&#8217;t take an oath<\/strong><\/h4><p>The legal issue here is that Diella doesn&#8217;t take an oath, isn&#8217;t answerable to parliament and can&#8217;t be held accountable. As a result, she bypasses the core mechanisms of institutional oversight. In a democracy, a minister needs to have a name, a surname and a face you can see, especially when things go south. Handing this role over to an algorithm isn&#8217;t a brave experiment, but an act of political irresponsibility.<\/p><p>Experts have a term for it: algorithmic governance. Although the term sounds modern, in practice, well-known historical pathologies can lurk behind it. The first of these is algorithmic bias. It&#8217;s not about creating new stereotypes but digitally reinforcing those that have been around in society for ages. Algorithms learn from historical data, reproducing and solidifying racial, gender or economic biases, which then take the form of seemingly impartial decisions.<\/p><p>The second phenomenon is algorithmic corruption. It&#8217;s a modern form of abuse in which data and algorithm design become the instruments of manipulation. Just by tweaking a few parameters or criteria, the system can &#8220;objectively&#8221; point to a preferred outcome. Everything looks right on the screen, the process seems transparent, but the result was decided in advance. 
It&#8217;s a new form of corruption, and that&#8217;s exactly why overseeing AI systems is becoming crucial.<\/p><h4 class=\"wp-block-heading\"><strong>New technology, old patterns<\/strong><\/h4><p>In theory, the principle of such oversight is already defined by the EU AI Act, which came into effect in 2024. According to this regulation, artificial intelligence systems used by public authorities must operate under real human supervision.<\/p><p>Even though public procurement isn&#8217;t directly listed as a high-risk area, its importance for the transparency of public life means it follows the same logic.<\/p><p>Systems used for evaluating bids, awarding contracts, or verifying contractors directly impact how public funds are managed and whether businesses are treated equally. Under EU law, these areas are seen as particularly sensitive to errors, misuse and discrimination.<\/p><p>For Albania, which aspires to join the European Union, the AI Act is not just a benchmark but also a political guidepost. The country has committed to gradually implementing European standards for transparency, data protection and the responsible use of artificial intelligence. And the AI Act&#8217;s logic is clear: no decision affecting citizens&#8217; rights or finances should be made fully automatically. That&#8217;s the theory, anyway.<\/p><p>The practice looks different.<\/p><h4 class=\"wp-block-heading\"><strong>Between promise and reality<\/strong><\/h4><p>Formally, responsibility for Diella&#8217;s operation lies with the prime minister, under Article 2 of the presidential decree of September 12, 2025. In reality, though, the algorithm analyzes offers, verifies documents and prepares the recommendations that shape the final decisions. People oversee the system but can&#8217;t see exactly how its mechanism works. 
And when the code and training data remain confidential, even the most clearly defined responsibility becomes purely theoretical.<\/p><p>This doesn&#8217;t mean AI should be completely kept away from power. In Brazil, analytical systems have exposed hundreds of suspicious contracts. In Europe, the Digiwhist project helps detect abuses in procurement. But it&#8217;s important to note that in these cases, AI acted as a detective, not as a minister.<\/p><p>In a democracy, technology can support decision-making processes, but it shouldn&#8217;t take over. The responsibility for decisions \u2014 be it political, administrative or social \u2014 must always remain with the people.<\/p>","protected":false},"excerpt":{"rendered":"<p>Would you trust an algorithm if it promised to operate transparently, legally and without bias? Albania just found out that a faceless minister won&#8217;t solve the corruption problem.<\/p>\n","protected":false},"author":452,"featured_media":16338,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"rank_math_lock_modified_date":false,"footnotes":""},"categories":[805],"tags":[],"popular":[],"difficulty-level":[],"ppma_author":[843],"class_list":["post-16417","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-law-and-ethics"],"acf":[],"authors":[{"term_id":843,"user_id":452,"is_guest":0,"slug":"ewa-siciak","display_name":"Ewa Siciak","avatar_url":"https:\/\/secure.gravatar.com\/avatar\/ebee7b40b99f8697a441e9c4e843181f1ee192815264acf918eb0ef697dfbe3e?s=96&d=mm&r=g","first_name":"Ewa","last_name":"Siciak","user_url":"","job_title":"","description":"A public administration specialist and an enthusiast of simple AI solutions for small businesses. 
She researches and tests technologies that genuinely lighten the load for entrepreneurs."}],"_links":{"self":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts\/16417","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/users\/452"}],"replies":[{"embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/comments?post=16417"}],"version-history":[{"count":1,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts\/16417\/revisions"}],"predecessor-version":[{"id":16418,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/posts\/16417\/revisions\/16418"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/media\/16338"}],"wp:attachment":[{"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/media?parent=16417"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/categories?post=16417"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/tags?post=16417"},{"taxonomy":"popular","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/popular?post=16417"},{"taxonomy":"difficulty-level","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/difficulty-level?post=16417"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/haimagazine.com\/en\/wp-json\/wp\/v2\/ppma_author?post=16417"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}