Current trends on biometric recognition – Official Blog of UNIO

Maria Inês Costa (PhD Candidate at the School of Law of the University of Minho. FCT research scholarship holder – UI/BD/154522/2023)

In Portugal, more than 300,000 people have already “sold” their iris scan to the Worldcoin Foundation, which in return offers them cryptocurrency. In March 2024, the Portuguese data protection authority (hereinafter, the CNPD) decided to suspend the company’s collection of iris and facial biometric data for 90 days in order to protect the right to the protection of personal data, especially of minors, following in the footsteps of Spain, which also temporarily banned the company’s activities for privacy reasons.[1]

In a statement, the CNPD explains that the company has already been informed of this temporary suspension, which will last until the investigation is completed and a final decision is made on the matter. The adoption of this urgent provisional measure comes in the wake of “dozens of reports” received by the CNPD in the last month, which describe the collection of data from minors without the authorisation of their parents or other legal representatives, as well as deficiencies in the information provided to data subjects and the impossibility of deleting data or revoking consent.[2] In the CNPD’s press release, one can read that “[g]iven the current circumstances, in which there is unlawful processing of the biometric data of minors, combined with potential infringements of other GDPR rules, the CNPD considered that the risk to citizens’ fundamental rights is high, justifying an urgent intervention to prevent serious or irreparable harm.”[3]

In its detailed suspension decision, one is provided with important information, namely the fact that, through complaints, it was revealed that some data subjects only became aware of the risks involved in the processing of their data due to the media exposure of the matter, and that these risks were never properly explained to them; moreover, they were allegedly not provided with information on the processing carried out, namely on the data actually collected and for what purposes, nor on how to exercise the rights provided for in the law on the protection of personal data; and, as reported by the media, there are a number of citizens who authorise this data collection and subsequent processing because they are economically vulnerable and/or are not fully aware of the aims and implications of their participation in the Worldcoin project.[4]

And what is this project all about? Co-founded by OpenAI CEO Sam Altman, the Worldcoin Foundation’s white paper entitled “A New Identity and Financial Network”[5] outlines the goals of scanning people’s iris and facial biometrics. According to the document, “[i]f successful, Worldcoin could considerably increase economic opportunity, scale a reliable solution for distinguishing humans from AI online while preserving privacy, enable global democratic processes, and show a potential path to AI-funded UBI.[6] Worldcoin consists of a privacy-preserving digital identity network (World ID) built on proof of personhood and, where laws allow, a digital currency (WLD).”

For the company, in a world where AI is becoming more and more powerful, there is an urgent need for “proof of personhood”, and the most viable way to issue it is through custom biometric hardware – the Orb. The Orb captures high-quality iris images with more than an order of magnitude higher resolution compared to iris recognition standards, through which a “World ID” is created – a “digital identity solution enabling users to prove their uniqueness and humanity anonymously […]”,[7] according to Worldcoin.

Although the aim of Worldcoin considers the actions carried out are preserving of privateness, current complaints and bans supply a contrasting perspective. Certainly, Eileen Guo and Adi Renaldi from the MIT Know-how Evaluate printed, in April 2022, a protracted article exposing most of the firm’s weaknesses and challenges it presents. As an example, they interviewed Iyus Ruswandi, a neighborhood Indonesian who was tempted to “promote” his iris to Worldcoin Indonesia, in December 2021. The representatives would acquire the scans, in return for “free money (usually native foreign money in addition to Worldcoin tokens) to Airpods to guarantees of future wealth […] What they weren’t offering was a lot data on their actual intentions.”[8] Within the writer’s interview with Ruswandi, he said that representatives even had to assist residents arrange emails and go browsing to the online, main him to replicate on why Worldcoin was focusing on low-income communities within the first place, relatively than crypto fans or communities.[9]

From the accounts gathered so far, it is possible to witness how this practice has affected vulnerable communities, from populations who do not have access to the most up-to-date digital literacy, to children, who cannot validly consent to such a practice. But there is also an inequality of information between those who have sold their irises and the company, simply because the latter has apparently not been fully transparent about its operations. In this context, it is relevant to refer to recital 20 of the EU AI Act, which determines that “[i]n order to obtain the greatest benefits from AI systems while protecting fundamental rights, health and safety and to enable democratic control, AI literacy should equip providers, deployers and affected persons with the necessary notions to make informed decisions regarding AI systems. Those notions may vary with regard to the relevant context and can include […], in the case of affected persons, the knowledge necessary to understand how decisions taken with the assistance of AI will have an impact on them […]”.[10] (Author’s bold). And although this reference to digital literacy in this recital involves different groups of people, it is true that Article 4 of the AI Act states that “[p]roviders and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf […]”, putting the emphasis on those who work closely with the technology.

The scenario examined in this text is particularly worrying, especially as we are dealing with biometric data, which “can allow the authentication, identification or categorisation of natural persons and the recognition of emotions of natural persons”.[11] As discussed in the European Parliament’s 2021 study “Biometric Recognition and Behavioural Detection”, to uniquely identify natural persons, “strong”[12] biometric identifiers have to be captured, converted into digital data and ultimately into a standardised template. These identifiers can be captured by appropriate physical scanners, with the active conscious cooperation of the individual, remotely without such cooperation, or with the help of other existing data. Thus, capturing biometric identifiers means converting a person’s unique physical characteristics into digital data, leading to the “datafication” of individuals.

Because the features that uniquely identify a person are part of that person’s body, their collection and use interfere with personal autonomy and dignity, the report stresses. Once a biometric template has been created and stored in a reference database, anyone in possession of that template can identify and locate that person anywhere in the world, putting that person at serious risk of being tracked and monitored. It can also be used to identify the individual for a vast number of purposes and situations.[13] In fact, while the risk of fraud and the difficulties posed by poor data quality or missing data are reduced by models that use “strong” biometrics, these “also increase ethical concerns, as they allow more efficient public surveillance and can be used for the creation of elaborate profiles”.[14]

According to Article 5(1)(h) of the AI Act, the use of “real-time” remote biometric identification systems[15] in publicly accessible spaces for law enforcement purposes is prohibited, unless and to the extent that such use is strictly necessary for purposes defined in the Regulation.[16] As regards the use of “post” remote biometric identification systems (in deferred time),[17] this is considered a high-risk practice, and not prohibited like the one described above, although the outcome is the same – massive identification of subjects without their consent or knowledge, something which is intrusive in nature. Much criticism has been directed at the fact that the latter is not also prohibited, and that the former includes exceptions to its prohibition.[18] This highlights how the practice of biometric identification carries a very high risk of threatening fundamental rights and safeguards,[19] and ultimately democracy itself.

Now, when we consider the operations of companies whose main activity is biometric identification, and how questionable the purposes for which all this sensitive data will be used can be, we step onto very dangerous territory. In this regard, it is relevant to consider Alfonso Ballesteros’ insights in his article “Digitocracy: Ruling and Being Ruled”: “[d]igitocracy[20] seems to be a new form of government […] a new way to rule an unprecedented number of people smartly and efficiently. […] Rulers are no more modern technocratic humanists than mere rational entrepreneurs seeking to make money. They are postmodern entrepreneurs [who] have been able to hybridise their economic interests with new postmodern ideas; namely, those that blur the distinctions between artefacts and humans, and a declared pretension to be acting for the good of humanity.”[21]

Nowadays, this is a matter for the most careful consideration, as without sufficiently strong safeguards we will be increasingly subject to vested interests making use of our most sensitive information, and that could lead down a path of no return. Thus, in the face of an offer of “proof of personhood”, we should be concerned as to whether this is weakening our very own autonomy and dignity – in essence, our humanness – or whether it will, on the contrary, enhance and protect our life in coexistence with technology.

[1] See Elizabeth Howcroft, “Portugal orders Sam Altman’s Worldcoin to halt data collection”, Reuters, 26 March 2024, See also Expresso, “Worldcoin: Comissão de Proteção de Dados suspende recolha de dados da íris”, 26 March 2024,

[2] CNPD, “CNPD suspende recolha de dados biométricos”, 26 March 2024,

[3] The full text of the press release is available at:

[4] CNPD, “DELIBERAÇÃO/2024/137”, AVG/2023/1205, 4,

[5] Worldcoin Foundation, “A New Identity and Financial Network”, Worldcoin Whitepaper,

[6] UBI stands for universal basic income.

[7] Worldcoin Foundation, “World ID – The protocol to bring privacy-preserving global proof of personhood to the internet”,

[8] Eileen Guo and Adi Renaldi, “Humans and technology – Deception, exploited workers, and cash handouts: how Worldcoin recruited its first half a million test users”, MIT Technology Review, 6 April 2022,

[9] Eileen Guo and Adi Renaldi, “Humans and technology – Deception, exploited workers, and cash handouts: how Worldcoin recruited its first half a million test users”.

[10] European Parliament legislative resolution of 13 March 2024 on the proposal for a regulation of the European Parliament and of the Council laying down harmonised rules on Artificial Intelligence (Artificial Intelligence Act) and amending certain Union Legislative Acts, P9_TA(2024)0138 (COM(2021)0206 – C9-0146/2021 – 2021/0106(COD)), (Hereinafter, EU AI Act or AI Act).

[11] EU AI Act, Recital 14.

[12] These are, according to the study, fingerprint, iris, or retina.

[13] European Parliament, “Biometric Recognition and Behavioural Detection – Assessing the ethical aspects of biometric recognition and behavioural detection techniques with a focus on their current and future use in public spaces”, Study requested by the JURI and PETI committees, Policy Department for Citizens’ Rights and Constitutional Affairs, Directorate-General for Internal Policies, PE 696.968, August 2021, 44,

[14] European Parliament, “Biometric Recognition and Behavioural Detection – Assessing the ethical aspects of biometric recognition and behavioural detection techniques with a focus on their current and future use in public spaces”, 14.

[15] EU AI Act, Recital 17: “[…] ‘Real-time’ systems involve the use of ‘live’ or ‘near-live’ material, such as video footage, generated by a camera or other device with similar functionality. […]”

[16] EU AI Act, Recital 33: “Those situations involve the search for certain victims of crime including missing persons; certain threats to the life or to the physical safety of natural persons or of a terrorist attack; and the localisation or identification of perpetrators or suspects of the criminal offences listed in an annex to this Regulation, where those criminal offences are punishable by a custodial sentence or a detention order for a maximum period of at least four years in the Member State concerned in accordance with the law of that Member State. Such a threshold for the custodial sentence or detention order in accordance with national law contributes to ensuring that the offence should be serious enough to potentially justify the use of ‘real-time’ remote biometric identification systems.”

[17] EU AI Act, Recital 17: “[…] In the case of ‘post’ systems, in contrast, the biometric data have already been captured and the comparison and identification occur only after a significant delay. This involves material, such as pictures or video footage generated by closed circuit television cameras or private devices, which has been generated before the use of the system in respect of the natural persons concerned.”

[18] See Patrick Breyer, Sergey Lagodinsky and Kim van Sparrentak, “Protecting privacy: biometric mass surveillance and the AI Act”, The Greens/EFA in the European Parliament, 6 March 2024,

[19] For instance, “[…] AI systems identifying or inferring emotions or intentions of natural persons on the basis of their biometric data may lead to discriminatory outcomes and can be intrusive to the rights and freedoms of the concerned persons. Considering the imbalance of power in the context of work or education, combined with the intrusive nature of these systems, such systems could lead to detrimental or unfavourable treatment of certain natural persons or whole groups thereof.” – Recital 44, EU AI Act.

[20] “Digitalisation as a form of government”.

[21] Alfonso Ballesteros, “Digitocracy: Ruling and Being Ruled”, Philosophies 5, 9 (2020): 11,

Image credit: by Wojtek Pacześ on
