
How do you solve a problem like Facebook?

Author: GFMD Communications | 1 December 2021

Facebook has been repeatedly criticized for its role in propagating false claims and conspiracy theories and for allowing dangerous misinformation to spread, especially during the global Covid-19 pandemic, without responding adequately. Is this problem more than purely technical? Can Facebook fix itself, or must regulators step in? These were the questions raised during the webinar “How do you solve a problem like Facebook?”, held by the Bureau of Investigative Journalism and the Ethical Journalism Network on November 30.

To clarify what counts as misinformation, Sophie Zhang, data scientist and Facebook whistleblower, explained the difference between misinformation and inauthenticity with an example about cats. If a person goes on social media and says “Cats are the same species as dogs”, it doesn’t matter who the person is: what they are saying is wrong. With inauthenticity, by contrast, it matters who the person is. If someone sets up a fake account and writes “Cats are adorable”, Facebook will take it down not because it disagrees with the message but because the account itself is fake.

And yet, as Sophie Zhang pointed out, inauthenticity is sometimes a lifeline for people whose freedom of speech no one else defends, so they use inauthentic voices to protect themselves. According to the data scientist, Facebook naturally prioritizes the wrong area: the network’s algorithms usually catch only those fake accounts that become particularly popular with a wide audience, while the most successful fake accounts are often the ones that avoid attracting too much attention. Disinformation researcher Nina Jankowicz also emphasized what anonymity means for freedom of speech. “If we remove anonymity in authoritarian, autocratic countries, what we would essentially be doing is allowing countries like Russia to crack down on people for having inconvenient opinions,” she said.

Sophie also criticized Facebook’s inability to act quickly, as in its response to fake accounts set up to promote candidates during presidential races in the USA. Will Moy, CEO of Full Fact, the UK’s independent fact-checking organization that collaborates with Facebook, looked at the issue from two angles. On the one hand, Facebook is doing more than many of its competitors: it has, for instance, a network of independent fact-checking partners all over the world, and during the pandemic it was among the first social media platforms to recruit medical misinformation specialists. On the other hand, Moy remarked, nobody could credibly say that Facebook is doing enough on misinformation. During the pandemic, under political pressure, the company took down content in ways that clearly harmed freedom of speech. So while Facebook is not doing enough, it is also doing too much.

“Companies are overreaching in what they are taking down, and not only during the pandemic. They are making decisions that should be made by democratically elected governments. There is a business reason for it; it isn’t a purely technical problem,” said Will Moy.

One of the questions from the audience was: “What can we do as users to help?” During the “infodemic”, a lot of misinformation is shared with the originally good intention of helping family and friends when in fact it may only cause further harm. To prevent this, the CEO of Full Fact said, it is important to think carefully before sharing any information and to ask yourself three questions:

  1. Where is the information from?
  2. What’s missing?
  3. How do you feel about it?

Such a responsible approach will stop a user from unwittingly spreading false and harmful information through social media channels, and that alone is a great help. Turning to the interaction between users and social media platforms, Imran Ahmed, CEO of the Centre for Countering Digital Hate, highlighted the importance of transforming the way we perceive social media and recognizing the real cost of using it:

“We need to start to change the way we see these platforms. These aren’t free speech platforms, they aren’t platforms that are there for the public good, they are commercial.”

The panel ended with a round-up of policy proposals from the speakers. Since social media fragments audiences into separate groups, making it easier for propaganda and misinformation to spread, it is crucial to “build a shared reality to make democracy happen”. The speakers suggested requiring Facebook to make its public policy more transparent, in particular regarding data access for researchers, and gradually building a trustworthy body of law to regulate important decisions made by big tech companies.

Secure The Future Of Journalism – Donate To GFMD 

We believe that independent media and professional journalism are essential pillars of democracy, human rights, and sustainable development.

Support GFMD’s global mission to strengthen journalism where it matters most.

Your contribution helps us:

  • Advocate for press freedom – Shape global media policy and defend journalists’ rights
  • Assist journalists in crisis – Provide emergency support and resources
  • Coordinate global media networks – Connect and support our 200+ member organisations
  • Promote public-interest media – Ensure journalism serves communities, not corporations
  • Produce actionable research – Generate evidence-based insights that guide media policy and strengthen media sustainability

We don’t accept Big Tech funding, and less than 1% of global donors support our work. That’s why your support is vital.

Every contribution, big or small, helps!

If you want to support the future of independent journalism, donate via PayPal here:

Donate
