
Prominent figures from the realms of Big Tech, Big Media and government gathered last month at the Athens Democracy Forum, where they shared their visions for what the metaverse may become — and how democratic institutions and educational systems can be transformed with the “aid” of new technology.

Organized under the aegis of The New York Times and marking its 10th anniversary, this year’s forum, held in Athens, Greece, included speakers such as former United Nations (UN) Secretary-General Ban Ki-moon, European Commission President Ursula von der Leyen, Ukrainian President Volodymyr Zelensky and Jeffrey Sachs, director of Columbia University’s Center for Sustainable Development.

With Meta, Microsoft, the UN Democracy Fund and the Council of Europe among the key sponsors for this year’s gathering, many of the panel discussions and presentations unsurprisingly focused on the intersection between technology and politics.

Among the key topics of discussion were the metaverse — what it is and how it may evolve in the future — and how it, along with other new technologies, might “transform” the educational system and democratic institutions, which the participants adamantly claimed are failing or outdated.

Participants also discussed the “threat” of social media “fake news,” “misinformation,” “disinformation” and “echo chambers” that are out of the reach of traditional news media institutions and sources of information, and what can be done to curb this.

According to the forum’s introductory statement, while the forum “can celebrate” its tenth anniversary, “democracy cannot” — necessitating “the brightest — and often youngest — minds to come up with a prescription for a new breed of democracy” and “bold action” to combat “malignant threats.”

Panelists: metaverse at forefront of ‘the battle of democracy’

Promising “incredible applications that have enormous potential for society,” the metaverse — and the potential “solutions” it may provide for politics and education — dominated discussions at the forum’s “Rethinking Tech: Life In and Out of the Metaverse” panel.

Participants included:

  • Carola Los Arcos Nagore, politics & government outreach manager, Southern Europe, Meta
  • Nanna-Louise Linde, vice president, European Government Affairs, Microsoft
  • Kyle Bozentko, executive director, Center for New Democratic Processes
  • Esther O’Callaghan, founder & CEO, hundo.xyz
  • Patricia Cohen, global economics correspondent, The New York Times, who moderated the panel

Nagore opened the discussion by describing the metaverse as “the next generation of the Internet,” as an “immersive internet” that is “going to be more like being inside the Internet rather than just experiencing it on two dimensions.”

This, however, does not imply that the metaverse is a “gaming platform,” said Nagore. Instead, “it’s obviously much more than that.”

The “much more,” Nagore said, included “real strong applications” being developed in “fields like education and healthcare” and “incredibly positive applications” with a “tremendous” impact on education.

She stated:

“Creators can create on their own … what they want, without having to stick to old paradigms. And I think this is especially true on immersive learning.

“We see that you can learn differently … in a much more efficient way just by using different tools.”

Learning “differently,” according to Nagore, includes the use of virtual reality applications because “foundational skills like spatial awareness or the ability to abstract yourself from physicality cannot be developed, as you know, in school traditionally.” Nagore said these are “basic skills” that are “so important … for society.”

Nagore’s statements closely mirrored discussions at the World Economic Forum’s (WEF) annual meeting in Davos, Switzerland, in late May 2022, as well as articles produced by the WEF that tout the benefits of the metaverse, virtual reality and artificial intelligence (AI) as a means of making education more economically efficient, potentially eliminating the need for brick-and-mortar schools.

O’Callaghan described “the future of learning and work” as “the battle of democracy.” Turning the focus to political institutions, O’Callaghan said young people are “digital native, but they are also increasingly some of the most disenfranchised from democracy.”

What can be done about this? According to O’Callaghan, an “effective mechanism for rebuilding some of democracy” is “decentralization.” This refers to “bring[ing] together communities who are disenfranchised, disparate, into more immersive digital spaces” like the metaverse, where “important debate can start to be had again.”

O’Callaghan did not clarify why such spaces must necessarily be digital, or why such objectives could not have been accomplished until now. Instead, she described a “kind of battle royale between centralized platforms and [a] more open metaverse” that is “more transparent, more self-sovereign data ownership, accountability, education and training.”

In this metaverse, one would “own” their personal data, said O’Callaghan.

As previously reported by The Defender, technology and defense contractors who developed digital “vaccine passports” have also claimed such applications enable users to “take control” of their own data.

For instance, on its website, the Thales Group proudly promotes its “smart health card” and Digital ID Wallet technology. Amidst utopian language claiming “we’re ready for change” and “putting citizens in control,” the Digital ID Wallet promises the public the ability to “access the rights and services to which we are entitled.”

For Nagore, the metaverse and its potential benefits represent “a fairly logical progression of how communication technologies have evolved and are continuing to evolve over the years,” guided by a quest to make “communication as lifelike as possible in each step of the way, in each step of the chain.”

Linde, in turn, placed the metaverse within the broader context of “solving” the challenges the world faces today.

Linde said:

“We are challenged by a war. We are recovering from a pandemic. We have challenges around climate, [the] energy crisis. Our democracies are threatened.

“And I think technology has a big role to play in trying to find solutions to some of these challenges … to protect our values and protect our democracies.”

For Linde, the perceived “benefits” of the metaverse can be realized only by “cleaning up” the current internet first.

“I think we should make sure that we clean up our problems in the old Internet before we transfer them also to the metaverse: privacy, disinformation. I think that’s really crucial,” she said.

Despite calling for “cleaning up” purported “disinformation,” Linde remarked in the same breath that “inclusiveness” is “going to be key that we make sure that the rules that the technology we build and the rules that we put in place … that they make sure that we embrace every person on the planet.”

Nagore, in turn, described training AI algorithms “to make sure that it lives up to democratic values and inclusiveness.”

O’Callaghan expressed concerns over the spread of “misinformation and extreme ideas” on current social media, stating:

“When the idea of kind of pushing down control is the essence of democratic thought and as well as this idea … of ownership … we’ve also seen how that can result in kind of insulated communities where there is a lot of misinformation and extreme ideas that get circulated in kind of an echo chamber and actually end up being very insidious.

“How do we make sure we nudge them in the direction you are talking about and not in another way?”

As previously reported by The Defender, the concept of “nudging,” arising from the field of behavioral psychology, has been employed by governments and public health officials to, for instance, “encourage” certain behaviors, such as adherence to COVID-19-related restrictions.

Bozentko, in looking ahead at the next steps of the metaverse, said, “It doesn’t have to look and feel just like the internet as we’ve experienced it in the past.”

“It doesn’t necessarily have to just replicate the same experiences of echo chambers and other things that have happened, because the possibilities are so much broader and more limitless,” Bozentko said.

How can this happen? “Secure” verification of online identities, for instance, via a “much more verified process, which I think blockchain offers, is one of the more interesting use cases that I see,” O’Callaghan said.

Cohen, addressing facial recognition as potentially “a great tool for security,” but also “a great means of controlling the population,” prompted a discussion during which the panelists agreed “regulation” is necessary.

Such regulation can, in Cohen’s words, “address these threats that technology represents, with regulation and with agreements across governments.”

According to Bozentko, regulation also presents “a unique situation, where applying particular engagement tools might provide an opportunity to really proactively address some of what we perceive as the harms of social media, of technology, as it has impacted democracy in recent years.”

This, said Bozentko, necessitates “really finding a way to get in front of and build a framework both for metaverse experiences internally and frameworks externally that involve the public meaningfully in shaping these issues,” adding, “The public should be involved in shaping what it looks like and how it feels and how it’s governed.”

How can this be done? Bozentko referred to “public-private partnerships around artificial intelligence and data-sharing platforms and monetization,” which “has to be done carefully and it has to be done independently so that its results and feedback aren’t subject to skepticism and criticisms of bias.”

“The role that citizens and the public can play is to be brought into those conversations early so that their input and insights can be meaningfully incorporated into the earlier stages of development rather than trying to retrofit, which we’re already seeing isn’t working,” Bozentko added.

In terms of accomplishing this, Nagore said that “time is really on our side because … the full materialization of this [the metaverse] is still years ahead.”

As previously reported by The Defender, a May 2022 article on the WEF website proposed Facebook’s “Oversight Board” as an example of a “real-world governance model” that can be applied to governance in the metaverse.

However, some panelists promoted self-regulation as a model to follow. Linde, for instance, said, “Regulators are calling upon us [private companies] to help with this challenge,” adding that “increasingly, big companies like Microsoft and also others have self-regulated and put [in] some ambitious targets.”

On data protection, however, panelists were much more vague in their statements.

For Bozentko, data protection “really comes down to a case-by-case basis of individuals, how comfortable you are, how familiar you are with what you’re using and how you aim to use it.”

Nagore, in turn, said she “would agree, obviously, on informed consent,” but “also on the fact that this may look different for everybody.”

‘Online echo chambers’ are ‘diminishing mainstream media,’ journalists say

Another forum panel, “Rethinking Media: Truthsayers and Naysayers,” facilitated a discussion that attacked head-on purported “misinformation” and “disinformation” — and social media’s role in disseminating them.

Participants included:

  • Anna Romandash, journalist, Ukraine
  • Stephen King, CEO, Luminate
  • Khadija Patel, chair, International Press Institute
  • Donald Martin, media consultant and former editor of The Herald, Scotland
  • Steven Erlanger, chief diplomatic correspondent, Europe, The New York Times, who moderated the discussion

The session opened with statements from university students about the importance of digital governance.

For instance, Chui Kai Shun of Lingnan University in Hong Kong stated that “Taiwan provides a great example of incorporating the tech community into information warfare,” by monitoring “trends of this information” and injecting “dispatches from independent news sources right before this information spreads widely.”

How government-promoted news can nevertheless be classified as “independent” was not clarified.

Erlanger, addressing the spread of so-called “fake news” on the internet, referred to a phrase he attributed to former New York Sen. Daniel Patrick Moynihan, “you may be entitled to your own opinions, but you’re not entitled to your own facts,” and to a statement made to him by a French minister that, “The problem is democracy depends on the distinction between truth and lies.”

King said social media companies have had “a very distorting effect on democratic discourse” over the past 10 years, as social media “can drive polarization, can develop echo chambers, which we have to be very conscious of in terms of how people consume and get their information.”

As an example, King referred to “videos which are out on Facebook and on YouTube which are spreading mis- and disinformation and which are not being taken down.”

King did not mention the numerous instances of social media censorship on those — and other — platforms in recent years.

For Martin, while “fake news is not new,” it is “this scale that’s unprecedented,” adding that “it’s really frightening how quickly ‘fake news’ gains traction and acceptance, and that’s thanks largely to social media algorithms.”

“Fake news” needs to be “debunked within about 30 minutes, before it has traction,” he said.

Martin also appeared to blame the public at large for its choice of what news to consume. He said:

“I don’t think that the public know the extent of misinformation and disinformation.

“And often, I would say [they are] an unquestioning audience. They seem to be happy to be trapped in their own echo chambers.

“I don’t think many of them know how and why ‘fake news’ is generated and by whom. And worryingly, I don’t know that a lot of them care. And I don’t think they understand, many of them, the damaging influence of unregulated social media. And we need the public to help drive the change that’s required.”

The impact of social media on more traditional news sources also was addressed. In Martin’s view, the “overuse of social media” is “a major concern,” “because what it’s actually doing is diminishing mainstream media.”

Romandash expressed a similar view, stating that “the majority of the people who are not social media experts, who may not have strong digital literacy skills, the majority of those people will be just lost.”

“They wouldn’t be able to tell if something is true or fake,” she added, “which means that it’s very easy for truth to just be ignored or forgotten.”

This results in a “battle of narratives,” according to Romandash.

Such a “battle” occurs, according to Erlanger, because “social media doesn’t really have editors.” Instead, “people put out stuff, there’s no one mediating, no one asking the question, ‘Is this true? Is it not true? How do you back it up?”

For Erlanger, the “freedom” that social media companies have enjoyed since their inception “doesn’t work anymore.” He used the utilities industry as an analogy:

“In the beginning, the social media companies argued they were like utilities, right? I mean, they were just the highway over which these cars traveled and they weren’t responsible for the cars. They just provided the highway.

“It’s like the electric company just provides electricity. What you do with your electricity. Hey, that’s all up to you. That’s freedom. But that doesn’t work anymore. And of course, we regulate electric companies and we regulate utilities.”

Erlanger expressed a need for this sort of regulation on social media, but questioned how it can be done with billions of users, with “half the world … somehow attached to Facebook.”

“How does a government actually begin to do this seriously?” he questioned.

Romandash said social media companies claim they have “too much data, too many users,” which makes it “too difficult to monitor” their platforms. She described this as “ironic, because they actually have more resources than a lot of governments and they could definitely do more.”

Romandash, similarly to Bozentko’s proposal for more public-private partnerships, put forth the possibility of partnerships “between legal NGOs [non-governmental organizations] and journalists, where different organizations basically share your expertise.”

She said that in Ukraine, for instance, “journalists even are training prosecutors on how to work digitally, how to verify information online.”

For Erlanger, though, another quote — this one from Mark Twain — describes the “problem” of “fake news” spreading online: “A lie is already halfway around the world before the truth puts on its trousers.” And here, he blamed public tastes, in part.

“So part of our problem is regulation,” he said. “Part of it is just human nature and the love of a good tale.”

‘Deliberative Democracy’ panel: Elections aren’t how people want to express themselves anymore

Members of the “Harnessing the Tools of Deliberative Democracy” panel were similarly dismissive about the democratic process.

Participants included:

  • Leonidas Christopoulos, secretary general of Digital Governance and Simplification of Procedures, Greek Ministry of Digital Governance
  • Dawn Nakagawa, executive vice president, Berggruen Institute
  • Noa Landau, deputy editor-in-chief, Haaretz, who moderated the discussion

Nakagawa told the panel, “People are actually very politically engaged by not showing up at elections” as “that’s not how they want to express themselves politically. That’s not a form of engagement that they think suits their aspiration.”

Nakagawa did not address the possibility that at least some voters feel disenchanted with their choices, not necessarily with the electoral procedure itself.

Digital tools, however, allow people to change the manner in which they participate in the political system, Nakagawa suggested.

As a result, she argued, “It’s up to us to have the courage to actually change our political system in a way that allows them to participate in the ways that they’ve come to know. And that is about digital tools.”

“I believe, of course, we want people to come together in person and so on as well. We don’t want to completely discount that,” she said.

Similarly, Christopoulos expressed concerns over “issues having to do with fake news, with populism, with the rise of populism and a general, a negative, let’s say, stance towards the democratic process that we’ve seen throughout the last years.”

This can be overcome, Christopoulos argued, by, “at least in Europe, trying to regulate the whole process, trying to be able to find the mechanisms to be able to support valid dissemination of information in order to support the democratic process.”

The panel did not address how “valid dissemination of information” would be determined.

“We need to have a reskilling and upskilling agenda for citizens and employees,” Christopoulos added, mirroring a 2020 U.K. government advertising campaign suggesting that ballerinas and other arts and culture industry workers who found themselves out of work due to COVID-19 restrictions could “reskill” to find tech jobs.

This statement also resembles the “learn to code” meme that has circulated among largely “alt-right” social media accounts, which mocked out-of-work journalists, suggesting they obtain coding skills to find employment.