Worldcoin is making reality look a lot like Black Mirror

Source: Cointelegraph

OpenAI’s sister company Worldcoin wants you to buy its coin — and possibly unleash a tempest of problems across the world in the process.

Scanning your iris to become a “verified human” in exchange for digital currency sounds like a Black Mirror episode. But this is not the story arc of a dystopian science fiction show — it’s happening now with one of the latest projects in Web3.

The launch of Worldcoin on Optimism has left many wondering whether this project is subverting Web3’s promise of decentralization to build just the opposite. And yet, more than 2 million people in underserved areas have already signed up to share their biometric data with Worldcoin in exchange for 25 WLD, worth less than $100 at the time of writing.

This is not just strange — it also presents serious privacy risks and creates a honeypot for bad actors. Moreover, there is an argument to be made that it could even interfere with the sovereignty of foreign countries.

Why would we need Worldcoin in the first place?

Worldcoin was founded to address the expected negative externalities of its sister company, OpenAI — the creator of ChatGPT and other popular AI products. One hand is solving the problems the other hand is creating.

In the words of its founders: “If successful, we believe Worldcoin could drastically increase economic opportunity, scale a reliable solution for distinguishing humans from AI online while preserving privacy, enable global democratic processes, and eventually show a potential path to AI-funded UBI [universal basic income].”

The problem with Worldcoin

Despite the ambition and promise to safeguard privacy, a whole new set of problems arises from the fact that this is being done by a single, currently centralized company. The irony is not lost on ChatGPT. Some of its answers when prompted “What are the risks in having one company own biometric data for individuals in underdeveloped countries?” include:

  • Privacy violations
  • Security breaches
  • Surveillance and sovereignty

Ethereum co-founder Vitalik Buterin has echoed some of these concerns as well.

Having one company own biometric data for individuals in underdeveloped countries poses serious risks to those individuals. At a broader societal scale, the risks grow larger still when coupled with UBI payments to foreign citizens.

Privacy violations

Biometric data such as iris scans is highly sensitive and unique to each individual. It can reveal information such as sex, ethnicity and, perhaps, medical conditions. If a single company controls this data, there is a high risk of privacy violations, as it can be used to track and monitor individuals without their consent.

Related: The world could be facing a dark future thanks to CBDCs

Who’s to say that the company would not exploit the biometric data for commercial gains, such as targeted advertising or selling the data to other entities? Isn’t that diametrically opposed to what we have been trying to achieve for the last few years?

Security breaches

Centralizing biometric data also puts it at a higher risk of being targeted by hackers and cybercriminals. In security circles, a large store of attractive data held by a single entity is known as a "honeypot": a target so valuable that it should be assumed it will eventually be hacked.

Related: CBDCs will lead to absolute government control

A data breach at this scale could lead to severe consequences, including identity theft, fraud, and unauthorized access to the personal information of millions of people.

Surveillance and sovereignty

Governments could also subpoena this data and obtain citizens' personal information without a warrant; there are fewer protections once you have sold your data to a third party. A corrupt government may use this data to manipulate behaviors, limit dissent and suppress opposition, essentially turning underdeveloped regions into surveillance states.

Moreover, if the company operates across borders, it could wield undue power and influence over governments and societies. Financially supporting a large number of foreign citizens under a universal basic income model could ultimately reduce the autonomy and sovereignty of a country's democratic processes.

When visiting Worldcoin's Orbs to scan their irises, registrants are given a promotional sticker that reads "Verified Human." There is a slight discomfort in being labeled simply a human here, not a person.

In the context of selling your identity for a few bucks to a cryptocurrency project with ties to AI development, it almost sounds like a Freudian slip. It’s as if personhood is a forgotten idea, and now we’re just humans in a massive database of biometric data.

Sometimes reality really is stranger than fiction.

Matthew Niemerg is a co-founder and board member of the Aleph Zero Foundation. He holds a Ph.D. in Mathematics from Colorado State University and currently serves as an expert on the EU Blockchain Observatory and Forum. He is also a co-founder of Cardinal Cryptography.

This article is for general information purposes and is not intended to be and should not be taken as legal or investment advice. The views, thoughts and opinions expressed here are the author’s alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.
