The metaverse, which will merge physical reality with the digital world by way of virtual reality, augmented reality and the Internet, is coming. But the question is — are we ready?
According to Chris Wylie, the Cambridge Analytica whistleblower whose revelations spurred a massive overhaul of privacy practices and legislation across the globe, the answer to that question is a resounding no.
In fact, it might be best to start pumping the brakes on the metaverse now, before it infiltrates our day-to-day lives, he warns.
Speaking recently with Cheetah Digital’s CMO Richard Jones about the metaverse, marketing and the future of privacy, Chris likened the metaverse to a physical structure like a skyscraper or an airplane. He paints a picture of how digital worlds should be architected with so-called ‘fire exits’ and other protections.
“It should have a building code of sorts, protecting its users, their privacy, and, importantly, their mental health,” he says.
Because even though the metaverse is supposedly being built to let humans roam freely, the reality is that it could be designed to present its users with a particular narrative that slants their reality instead.
Richard agrees it’s something that we as citizens, not just as marketers, need to consider as we, our children, and their children’s children get deeper into a digital society.
“We need to ask questions — ‘Is there a need for a regulatory body or some layer of societal protections?’ — as the metaverse rolls out and touches all aspects of our lives,” Richard adds.
Technology is growing more sophisticated with each passing year, and the ways it can collect data to power our lives are near-infinite. Who would have ever thought that one day, we would be able to walk into a grocery store, skip the checkout and still pay for our goods? Yet here we are: that day has arrived, thanks to sophisticated camera technology that can trace our every move.
While this is mind-blowing stuff, Chris believes that we’re only scratching the surface. “In the current sort of direction we’re headed, I imagine 10 years from now, you’ll come home and sit down to watch TV while your TV watches you,” Chris says.
“The TV will be having a conversation with the appliances in the kitchen. What could the TV present to you, to convince you to buy something? And in another room, Facebook will be watching your kids play. And your self-driving car is deciding on the time you get to work.”
While that future sounds ‘creepy’, with today’s capabilities it’s entirely possible. “In that situation, there are a lot of things that, in isolation, seem mundane. A smart fridge, a smart TV or a smart car on its own doesn’t seem insidious,” he adds.
“It’s when you network these things and put them into a system that is capable of watching you, thinking about you, and creating plans and intentions for you that results in something really profound in terms of human agency. For the first time in human history, we are constructing environments around us that think.”
Nature versus technology
The modern landscape has us living in a world that’s full of natural delights as well as potentially threatening forces of technology, Chris points out. “We evolved as a species where nature might affect us. There might be a lion on the savannah that will chase us, or an antelope that we want to eat; but nature itself doesn’t have intentions for us.”
“As the metaverse continues to unfold, what does it mean to be a person who’s in an environment where everything around them suddenly has intentions — and we can’t see what those intentions are?”
Beyond intentions, Chris says it’s important to consider what kind of effect the metaverse will have on human development. What will it mean for people who have grown up in an environment where everything they consume has been working diligently to turn them into a consumer?
People have largely become who they are by navigating freely through their world. “It’s through your experience in life, dabbling, and random happenstance, that allows you to grow and develop as a person. But what happens when, all of a sudden, the environment decides to get involved – classifying you, influencing your every move, and ultimately grooming you into the ideal consumer?”
The shocking realisation in all of this, Chris says, is that privacy is only one small piece of the puzzle. The matter at hand is much larger and more complex than the data that’s collected and the manner in which it was collected.
“When we are looking at some of the consequences of algorithmic harm, whether that’s mental health and mental wellbeing — particularly in young, developing men and women — to social cohesion across the globe where actual harm is stemming from these systems, it’s critical that we address these consequences prior to the metaverse becoming mainstream,” Chris says.
Constructing a new paradigm
When there are more safety regulations for a toaster in a kitchen than there are for a platform that touches a billion people, Chris says, it’s time for change. The best way to realise that change, he says, is to start at the source.
“A big part of the issue is that we are not framing the conversation around those who are responsible — the engineers and architects. The things that are causing harm are the products of architecture and engineering,” he says.
“When you look at how we regulate other products of technology and engineering — whether that’s in aerospace, civil engineering, pharmaceuticals, etc. — there are safety standards. There are inspections. We need to start scrutinising the technological constructions on the internet to ensure that there are regulatory frameworks in place to create a safer environment.”
While today many use smart devices periodically for tracking health statistics, communicating with loved ones and entertainment, Chris says that in the coming decades these devices might have a stronger hold on their users, becoming the only way they can interact with modern society.
“Imagine 10 or 20 years from now where the internet evolves into the metaverse, where you can’t participate in society without entering this augmented reality. And then imagine an institution like Fox News taking over people’s reality — not just what they watch on TV, but literally, what they see,” he says.
Even more, Chris questions what happens when people begin personalising their experiences to suit their preferences. For example, racists could eliminate people of colour from their view. Or people could create a society where they walk down the street and no longer see homeless people; they no longer see larger societal problems.
“What happens to a society when we no longer fully understand what’s happening around us, and the only people who do are those in charge of augmenting it?” he asks. “That’s a really important question, and it’s not farfetched.”
At Face(book) value
Despite the attention-grabbing algorithms of Facebook (now called Meta), which encourage one-sided views and fuel disinformation, marketers continue to pump large sums of money into its platform every year. But is it worth it? Are marketers correct to believe that Facebook’s data is truly as valuable as they think it is?
According to Chris, absolutely. “From a purely functional standpoint, yes. It’s incredibly valuable data,” he admits.
However, marketers using that data to create personalised adverts isn’t the problem. The problem, he says, is a bit more involved than that. “There’s a difference between personalised advertising and creating an entire ecosystem using that logic,” Chris points out.
“When you look at the news feed that Facebook and other social media platforms provide to users, it extends this logic that originated for advertising, showing content that is relevant to you.
“It’s not just the basic things that make your ads more efficient, and also less annoying for the people receiving them. It extends to, ‘You should only see things that engage you the most, full stop, in all information that you consume,’ to the point where the only information that you consume is the thing that usually makes you really angry because that’s what’s going to make you click on stuff. And that’s different from marketing.”
Doing the right thing
To support marketers and advertisers in their role within this modern, tech-enabled society, Chris offers a word of advice: Don’t trust a wolf in sheep’s clothing.
“The tool that you’re using, that you love so much, is probably one of your biggest threats. While advertisers cringe at the idea of regulations that limit what they’re allowed to do, in the long run, regulations might be beneficial for the viability of the industry,” he says.
“Don’t get dragged down by bad practices within an industry that is behaving badly. There is a substantial loss of trust in platforms like Facebook because it continuously doesn’t listen to consumers. It doesn’t respect consumers. Aligning yourself with that industry could backfire in the long run.”
Chris advises asking yourself these two questions as a simple rule of thumb when considering privacy and the misuse of data in advertising:
1. If you’re using personal data, would you be comfortable asking a random person on the street the questions needed to create that database?
2. Would somebody reasonably expect to have their data used in that way?
If the answer is yes to both, then it’s probably OK.
In summary, metaverse or not, it doesn’t really matter where you engage your customers. Zero-party data, loyalty, and thinking through the ‘value exchange’ of how you engage with a customer are all vital components — and they will only grow more important as the metaverse rolls out and we move into this ‘brave new era’ of privacy.