"When our environment is made up of information that polarises, that enrages, it leads to a loss of trust in each other. This version of Facebook is tearing our society apart and causing violence in the world."
This is one of the recent accusations made by whistleblower Frances Haugen. Last autumn, she leaked tens of thousands of pages of internal studies, presentations and memos revealing what is going on inside Facebook, in order to get the company more strictly regulated and force it to change its algorithms.

Jennifer Hansen, Berlin
At the beginning of this year, another former Facebook employee entered the stage: Brandon Silverman. He had quit Facebook in October, after working there since it acquired his start-up, CrowdTangle, in 2016.

Start-up tracked hyperpartisan content

CrowdTangle tracked the content that draws attention on Facebook and evolved into the most important window into what was actually happening on the platform. Its data increasingly told a story Facebook did not like – revealing, for example, the extent to which Facebook users engaged with hyperpartisan right-wing politics and misleading health information.

Currently, Facebook's and Instagram's algorithms reward interaction in order to keep users on the platform. As extreme opinions generate more interaction, they are automatically promoted by the algorithms. To encourage interactions, content is shown to audiences holding the opposite position, based on psychological profiles of users created for this purpose. And ads with extreme perspectives are cheaper because they generate higher engagement.

Facebook knows about the problems

Facebook knows about the problems caused by its algorithms, but ignores warnings from its own employees about the risks of their design decisions. According to Brandon Silverman, Facebook is even trying to get rid of the teams and tools that show the real picture. There are technical solutions to intercept hate, agitation and false statements, but for profit reasons they are not implemented. Instead, Facebook continues to expose vulnerable communities around the world to a cocktail of dangerous content.

Platforms don't depict reality

The current design does not allow for constructive dialogue on and through the platform, but increases polarization and division. In addition, Facebook and similar platforms do not depict reality; they paint a rather distorted picture of society. This “false polarization” happens because more centrist users tend not to engage in online discussions around heated subjects.
Looking at the arguments of the more extreme users who dominate the discussion, social media users on both sides conclude that the others are more extreme than they actually are.

In the midst of this scandal, Facebook rebranded the company as Meta. The aim is to build the metaverse, the extension of the internet into three-dimensional virtual reality spaces. Zuckerberg emphasizes that the metaverse would facilitate “the most important experience of all: connecting with people.” That is something critics say Meta does not manage too well currently – at least not when it comes to finding common ground and joint solutions to the big challenges of our time.

Design tweaks are needed

The design of the current platforms has real-world effects – and so will the design of the future metaverse. So, what can be done so that Facebook and the future metaverse become part of the solution instead of part of the problem? Pushes for immediate design tweaks at the major platforms are important, but certainly not enough. Efforts at multiple points of leverage are needed to have an impact on this systemic challenge. This systemic point of view is promoted, for example, by the Center for Humane Technology with its framework of leverage points for intervening in the tech ecosystem.

One of the leverage points they have identified is regulation, something Mr. Silverman has spent the weeks since his exit working on with a bipartisan group of U.S. senators. Among other things, they have been working on how to force social media platforms to be more transparent.

Help platforms live up to their promise

“I think figuring out ways to both help and, in some cases, force, large platforms to be more transparent with news and civic content as it’s in the process of being disseminated can ultimately help make social platforms better homes for public discourse — and in a lot of ways, help them live up to a lot of their original promise,” he told the New York Times in a recent interview.
Jan Böhmermann (a German satirist responsible for the weekly late-night show Magazin Royale) goes one step further in his song: “Expropriate and split up Facebook! Put it under state control! Socialize the Social Network! For the sake of all!”
Exaggerated for the sake of entertainment, there might be some truth in this proposition as well: rethinking business models – and the goals businesses should ultimately pursue – is another leverage point identified by the Center for Humane Technology. The role companies play in society is currently being discussed not only for tech companies, but more broadly. Voices are growing louder saying that we are in the midst of a fundamental transformation of the economy.

There are several movements worldwide currently working on the different leverage points. If they continue gaining attention, support, speed and power, the tech ecosystem could really be converted into one that makes positive contributions to societies worldwide – one that enables thriving 21st-century digital societies.

More on Facebook, tech platforms, society and solutions

The Facebook defense: it’s up to you to make decisions on FB…

Here Nick Clegg, VP Global Affairs at Meta and former deputy prime minister of the UK, sums up the criticism and makes the company’s case in a recent blog post: “…ultimately, content ranking is a dynamic partnership between people and algorithms. On Facebook, it takes two to tango.”

But systemic thinkers say Facebook is more powerful than cultures, markets and governments

Daniel Schmachtenberger of the Consilience Project discusses the negative impact of tech and social media companies with Tristan Harris from the Center for Humane Technology. The infotech platforms, they argue, have captured culture, markets and governments: “…you can’t have a regulatory body regulate something that is both more powerful than it and growing faster than it.” “…Big Tech is directly decreasing the coherence of the government’s capacity to govern it.”
Try systems thinking?

Fika’s Pierre Golbach recommends a free systems thinking class by Acumen Academy. “It’s well guided and you don’t need previous knowledge. Just dive right in, ideally with a group of unlike-minded people to enrich your discussions,” says Pierre.