Background
The Tech and Journalism Crisis and Emergency Mechanism (T&JM), an initiative launched by GFMD, seeks to strengthen content and account moderation systems by establishing an emergency and crisis mechanism for public interest journalistic organisations. This provides safeguards for online freedom of expression, which is also an important component of responses to disinformation.
By holding this consultation, GFMD and UCLA ITLP aimed to engage local and international civil society groups, journalism and media organisations, and companies to discuss:
- Crisis and emergency protocols, case escalation criteria, and functioning escalation channels;
- Processes and criteria for identification of credible and trusted journalism actors online, their communities and representative groups;
- Key elements and processes for establishing a voluntary multistakeholder emergency and crisis mechanism.
Experiences of journalism and media organisations in Ukraine and the region
As a pilot project for possible broader application, the T&JM is initially focused on Ukraine and the neighbouring countries, targeting small and medium-sized media, community, and investigative journalism organisations.
In the first segment of the consultation, representatives of Ukrainian media and journalism support organisations shared their experiences with the moderation of content about the war and human rights violations in the country since Russia’s invasion in February 2022.
In Ukraine and other conflict zones, journalism and media organisations face unique challenges when it comes to content moderation on social media platforms. Platforms seem to lack the capacity to deal with content moderation challenges effectively and promptly, and media organisations frequently experience delays and a lack of follow-up after flagging moderation issues.
Olga Myrovych (Lviv Media Forum) and Tetiana Avdieieva (Digital Security Lab Ukraine) shared their perspectives on the experiences of journalism and media organisations in the region.
Lviv Media Forum’s CEO Olga Myrovych spoke about efforts to reach out to both Google and Meta to address content moderation on the platforms during the first months of Russia’s invasion of Ukraine. According to her, it used to take seven to ten days to obtain a response from the platforms. This delay presents a significant problem because, especially during wartime, news content becomes outdated much more quickly.
Some platforms remove graphic content related to war and potential violations of human rights. As a result, media outlets often practise self-censorship to avoid being banned or having their accounts suspended. There is a need to ensure that content documenting war crimes and human rights violations is not lost.
“As platforms try to create a joyful, happy environment on social networks, content related to war does not fit this ideal environment,” said Olga Myrovych.
Navigating these challenges requires a collaborative effort between social media platforms, media organisations, and local actors to establish clear protocols and support for fair and effective content moderation practices. By understanding key actors and sources of information, companies can better respond to crises and ensure that valuable information is preserved.
Multistakeholder mechanisms that involve local communities also play an important role in providing information on the trustworthiness of media outlets and, therefore, their content. Additionally, the mechanism should encourage stronger coordination among different platforms.
Crisis and emergency protocols, escalation channels, and mechanisms for identification and verification
Crises can be defined in various ways and include events such as natural disasters. What distinguishes an ongoing war or invasion is that communication issues peak at different times over its course.
Many existing initiatives dealing with content moderation in crisis situations aim to establish automated moderation that relieves individuals of the burden of reviewing and dealing with every single piece of content.
During the T&JM consultation, it was pointed out several times that companies should have a cross-functional team that can work across issues and areas, both externally and internally, and that is able to make decisions on content issues as well as on the directions a crisis may take.
Processes and criteria for identification of credible and trusted journalism actors online
Certifying content can be extremely challenging; it requires carefully designed protocols and frameworks that protect freedom of expression while tackling hate speech and illegal content.
Further problems may arise because, as EJN’s Program Specialist Danica Ilic highlighted:
“The regulation of hate speech and illegal content on platforms is sometimes beyond the capacity of industry mechanisms because many issues stem from structural problems in society related to misogyny, homophobia, or racism”.
To address this, EJN recommends that media outlets undertake ethical audits as a self-assessment procedure to evaluate their ethical standards and governance.
Danica Ilic (EJN) and Thibaut Bruttin (RSF) contributed to the discussion about identifying credible and trusted journalism actors online.
During a crisis, it is difficult for media outlets to help formulate frameworks or mechanisms. To address this, Reporters Without Borders (RSF) and the Journalism Trust Initiative (JTI) propose defining a framework in the early stages of a crisis, or even before one occurs. All stakeholders involved should agree to adopt a ready-to-use framework as soon as media outlets find themselves in a crisis situation. A network of nominating sources, such as national press freedom organisations or councils, should be used to nominate the media outlets to be covered by this emergency protocol.
Because certifying individual pieces of content is challenging, both the solutions presented by Ilic and Bruttin highlight the importance of focusing on the media outlet itself and its practices.
Key elements of establishing a voluntary multistakeholder mechanism
During the last segment of the consultation, ICFJ’s Deputy Vice President and Global Director of Research Julie Posetti stressed the importance of developing an early warning system with a multistakeholder approach, as advocated in a report by ICFJ and UNESCO. Such a mechanism should be integrated into the UN Plan of Action on the Safety of Journalists and its national plans of action, as the problem directly affects press freedom and the safety of journalists.
Another key element identified for establishing the mechanism is a more effective data-gathering process. This would provide digital platforms with evidence that content moderation problems are not isolated cases but recurring issues, encouraging them to take more serious action. Strong cooperation among governments, civil society, and tech platforms (whose non-responsiveness is often the main cause of content moderation issues) is crucial for the development of such a mechanism.