Chris Wylie, the data analyst and whistleblower behind the Cambridge Analytica/Facebook revelations that sparked a massive overhaul of privacy legislation across the globe, believes society is not ready to deal with the potential implications of the metaverse.
In conversation with Cheetah Digital’s CMO Richard Jones, Wylie compared the metaverse to physical constructions such as skyscrapers and airplanes to argue that digital worlds should be architected with ‘fire exits’ and other protections for users.
Wylie said: “The metaverse should have a building code of sorts, protecting its users, their privacy, and, importantly, their mental health. In the current sort of direction we’re headed, I imagine 10 years from now, you’ll come home and sit down to watch TV while your TV watches you. The TV will be having a conversation with the appliances in the kitchen. What could the TV present to you, to convince you to buy something? And in another room, Facebook will be watching your kids play. And your self-driving car is deciding on the time you get to work.
“In that situation, there are a lot of things that, in isolation, seem mundane. A smart fridge, a smart TV or a smart car on its own doesn’t seem insidious. It’s when you network these things and put them into a system that is capable of watching you, thinking about you and creating plans and intentions for you that results in something really profound in terms of human agency. For the first time in human history, we are constructing environments around us that think.”
He added: “As the metaverse continues to unfold, what does it mean to be a person who’s in an environment where everything around them suddenly has intentions — and we can’t see what those intentions are?”
Beyond intentions, Wylie believes we need to consider what kind of effect the metaverse will have on human development. What will it mean for people who have grown up in an environment where everything they consume has been working diligently to turn them into a consumer?
He said: “It’s through your experience in life, dabbling and random happenstance, that you grow and develop as a person. But what happens when, all of a sudden, the environment decides to get involved – classifying you, influencing your every move, and ultimately grooming you into the ideal consumer?
“When we are looking at some of the consequences of algorithmic harm, from mental health and mental wellbeing — particularly in young, developing men and women — to social cohesion across the globe, where actual harm is stemming from these systems, it’s critical that we address these consequences before the metaverse becomes mainstream.”
“A big part of the issue is that we are not framing the conversation around those who are responsible — the engineers and architects. When you look at how we regulate other products of technology and other products of engineering, whether that’s in aerospace, civil engineering, or pharmaceuticals, there are safety standards. There are inspections. We need to start scrutinising the technological constructions on the internet to ensure that there are regulatory frameworks in place to create a safer environment.”
The Metaverse, Marketing and Future of Privacy special talk track is available to view.