3. Fostering societal resilience to disinformation

Countering disinformation and strengthening information integrity require concerted efforts to build societal resilience. Broadly, resilience is about addressing the root causes of crises while strengthening the capacities and resources of a system to cope with risks, stresses, and shocks (OECD, 2023[1]). Applied to tackling disinformation and strengthening information integrity, resilience refers to a society’s ability to understand, resist and recover from threats within the information space. Indeed, several countries have situated societal resilience to information threats as part of building a comprehensive or total defence system, in which every individual and organisation should play a role, including as checks and balances in the overall information ecosystem.

Individuals therefore need skills and knowledge to navigate the information space effectively and responsibly. Government investments in digital, media and information literacy – and efforts to help ensure private companies actively contribute to societal resilience efforts – are important means to prepare and inoculate people against false and misleading content. According to PISA (Programme for International Student Assessment) results, in 2018 only 47% of 15-year-old students across OECD countries reported that they were taught at school how to detect whether information is subjective or biased (OECD, 2021[2]). A person who can navigate the information space responsibly will likely be better able to assess critically the content they encounter, to find higher-quality sources, to identify biases, and to make well-informed decisions.

Furthermore, developing a public communication function that is removed from politicised goals, serves as a source of accurate and relevant information, and is responsive to citizens in the service of the common good is an important tool for building societal resilience. More broadly, the value of access to information as a key safeguard for democracy has become more evident in recent years. Various crises, ranging from financial to health to defence, have increased the need and demand for accurate information from government itself (OECD, 2022[3]).

At the same time, fostering resilience to disinformation will require governments to strengthen public engagement mechanisms on topics related to information integrity as part of the larger undertaking to reinforce democracy and build trust. Engagement with the public and non-governmental stakeholders should ultimately be guided by efforts to protect and strengthen civic space to foster more open, transparent, and accountable governance (OECD, 2022[3]). Expanding research and understanding of the information space (namely, the convergence of the public, communication technologies, amplification algorithms, and content), and ensuring the findings inform the policymaking process, will also be essential contributions (Wanless and Shapiro, 2022[4]). Governments should therefore focus on expanding the competencies, resources, and reach of efforts in this space to facilitate participation and understanding across all segments of society.

Together, these efforts constitute what is often referred to as a whole-of-society approach. That said, an effective whole-of-society approach also requires protecting the human rights of those targeted by disinformation. It also requires promoting civic education, as well as clarifying processes, expected outcomes, and mechanisms both to mitigate potential risks and to take full advantage of the opportunities to engage with the public and non-governmental stakeholders. For example, the Netherlands' 2022 government-wide strategy for effectively tackling disinformation explicitly mentions the role of civil society and academics in fighting disinformation (Government of Netherlands, 2022[5]). Similarly, the 2023 Latvian counter-disinformation programme stresses the importance of government co-operation with stakeholders across society. The 2022 updated EU Code of Practice on Disinformation, furthermore, defines stronger and formalised roles for the fact-checking community, and the EU Digital Services Act creates obligations for online platforms and search engines to co-operate with fact-checkers in the framework of the Code of Practice (European Union, 2022[6]).

To reinforce societal resilience against the risks of mis- and disinformation and implement a whole-of-society approach, government efforts should focus on:

  • Strengthening media, information, and digital literacy skills

  • Helping ensure the public is well informed via proactive public communication efforts removed from politicised goals, and

  • Strengthening public participation in information integrity efforts and building understanding of the information space.

A long-term and systemic effort to build societal resilience to the challenges posed by disinformation involves building media, digital, and information literacy to help ensure the public can participate in the information environment with a discerning view and critical approach. There are several definitions of what media, digital, and information literacy include. For example, the EU's Audiovisual Media Services Directive (AVMSD) stipulates that media literacy refers to skills, knowledge and understanding that allow citizens to use media effectively and safely. Beyond learning about specific tools, technologies, and threats, media literacy more broadly aims to equip individuals with the critical thinking skills required to exercise judgment, analyse complex realities, and recognise the difference between opinion and fact (European Union, 2018[7]). The UK's independent communications regulator, Ofcom, defines media literacy as "the ability to use, understand and create media and communications in a variety of contexts" (Ofcom, 2023[8]). UNESCO defines media and information literacy (MIL) as an effort to "empower people to think critically about information and use of digital tools. It helps people make informed choices about how they participate in peace building, equality, freedom of expression, dialogue, access to information, and sustainable development" (UNESCO, 2023[9]). Digital literacy, furthermore, focuses on the competencies needed to live and work in a society where communication and access to information increasingly take place through digital technologies (OECD, 2022[10]).

A comprehensive understanding of media, information, and digital literacy focuses on the public's skills related to accessing, analysing, evaluating, and creating content in a variety of contexts (Hill, 2022[11]). This range of skills includes both understanding the creation and distribution process, as well as developing the ability to take a critical approach to evaluating information reliability. Governments largely recognise the importance of building media and information literacy skills. Within Europe, the EU's Audiovisual Media Services Directive (AVMSD) (European Union, 2018[7]), which governs EU-wide co-ordination of national legislation on all audio-visual media, includes specific provisions requiring Member States to promote media literacy skills and to report on these actions, and obliges media service providers and video-sharing platforms to promote the development of media literacy and raise awareness of available media and digital literacy tools (European Commission, 2023[12]). The European Regulators Group for Audiovisual Media Services, furthermore, is tasked with exchanging experience and best practices on the application of the regulatory framework for audiovisual media services, including on accessibility and media literacy. As of 2022, 18 states in the United States had passed legislation requiring education agencies to develop and include media literacy curricula in schools (Media Literacy Now, 2022[13]).

Taken together, governments should prioritise the following elements when considering how media and information literacy initiatives best fit into broader efforts to build societal resilience:

  • Media and information literacy initiatives should be seen as part of a wider effort to reinforce information integrity, including by incorporating such efforts into official curricula and reaching individuals of all ages in relevant efforts

  • Pro-active public communication efforts, or “pre-bunking,” can be useful media and information literacy tools to help build societal resilience

  • The impact of media and information literacy activities should be assessed and measured.

Media, information, and digital literacy initiatives largely aim to give people the tools to make conscious choices online, identify what is trustworthy, and understand platforms' systems in order to use them for their own benefit (Forum on Information and Democracy, 2023[14]). Media and information literacy should be part of a larger approach to building digital literacy (for example, by addressing how algorithmic recommendation systems and generative AI work) and civic education (for example, by teaching the importance of democratic principles and processes), reaching school-aged individuals as well as adults and seniors.

Ultimately, media literacy initiatives are most relevant to the extent that they reinforce broader objectives related to strengthening information integrity. For example, Portugal's National Plan for Media Literacy highlights that media literacy is a fundamental element for the defence of freedom of expression and information, and is essential to enabling democratic participation and the "realisation of economic, social, cultural and environmental rights" (Government of Portugal, 2017[15]). A notable component of Finland's approach, furthermore, is that its focus on media literacy has long been perceived as part of a wider effort to build societal resilience to disinformation. Media education initiatives have been present in Finnish schools since the 1950s, and the country has focused its media education efforts on promoting people's willingness and ability to consume, use and share information in a responsible way, and, ultimately, contribute to citizens' active participation in society (see Box 3.1).

In some OECD Member countries, media and information literacy is centrally co-ordinated, for example by the National Audio-visual Institute, KAVI, in Finland; the Centre de liaison de l'enseignement et des médias d'information, CLEMI, in France (see Box ‎3.2); or the National Media Regulatory Authority (ALIA) in Luxembourg, which co-ordinates media literacy activities with relevant national and European stakeholders. In Portugal, the Regulatory Authority for the Media has helped facilitate media literacy by mapping the range of existing interventions to promote and develop this space in the country (Portuguese Regulatory Authority for the Media, 2023[17]). In other countries, the responsibilities are spread across different institutions, such as ministries of education, other line ministries or national regulatory authorities.

The most common approach is for countries to provide media literacy within schools (see the example from Estonia in Box 3.3), either via a separate curriculum specifically devoted to media and information literacy or by including it within other subjects (for example, language, mathematics, history, citizenship). In Portugal, for example, the curriculum integrates media literacy via citizenship and information and communication technology sections. The country's Guidelines for Media Education (Referencial para a Educação para os Media), updated in December 2023, underline that media literacy is interdisciplinary and should be reinforced across learning areas, as well as through projects with the National Network of School Libraries and with external organisations.

OECD countries also produce manuals and guidebooks on understanding and counteracting the threat of mis- and disinformation. These are distributed on official websites and in print, to be shared with schools and public libraries. For example, in 2022, the Latvian State Chancellery published a digital book entitled "Handbook against disinformation: recognise and oppose" (Rokasgrāmata pret dezinformāciju: atpazīt un pretoties)1. The manual summarises practical recommendations for state administration and local government workers, as well as all Latvian residents, to address information manipulation, and is distributed to libraries throughout the country. The Ministry of the Interior in the Netherlands, for its part, finances the creation and operations of the website "Is that really so?",2 which teaches the population how to identify mis- and disinformation.

Media and information literacy activities are often developed and implemented in partnership with a wide range of civil society organisations. The tendency toward this whole-of-society approach is borne out by the number of CSOs, media and other organisations working in this field. For example, the United Kingdom identified at least 175 organisations focused on media literacy, and in Finland, KAVI has identified almost 100 organisations promoting media literacy. For its part, the Norwegian Media Authority has established a media literacy network to provide a forum for organisations representing researchers, businesses, civil society organisations and governmental bodies to share information and identify priority issues to address. In the Netherlands, furthermore, the Dutch Media Literacy Network connects close to 1 000 non-governmental organisations (see Box 3.4).

Governments also often partner with non-governmental organisations to provide media literacy initiatives, where CSOs and governments work together to prepare campaigns, informational and study materials, gamified solutions, and training videos. In Norway, the campaign "Stopp.Tenk.Sjekk" (Stop, Think, Check) was developed before the 2021 elections as a co-operation between the Norwegian Media Authority, the fact-checking service Faktisk.no, the National Association of Local Newspapers, and the Norwegian Directorate for Civil Protection (DSB), with support from Meta. The campaign recommends six questions for individuals to ask themselves when reading content online, with the aim of helping people think critically about whether an article, post, or piece of news is trustworthy. New versions of the campaign were created in 2022 concerning Ukraine and prior to the 2023 elections (Norwegian Media Authority, 2021[20]). Similarly, the Be Media Smart campaign in Ireland flags the importance of knowing how to verify information, provides tips and guidance on how to check the accuracy and reliability of information, and provides information on sources of support and training (see Box 3.5).

Another co-operation format is “media literacy weeks”, such as those organised by UNESCO, across the European Union, and in several countries. In Finland, for example, every year around 30 different materials or campaigns are created in co-operation with more than 50 partner organisations from all sectors of society, including public institutions, NGOs, and private companies (Media Literacy Week, 2023[23]).

Media and information literacy activities may also include efforts to better understand and reach groups susceptible to mis- and disinformation that are not reached by more traditional initiatives, such as older populations, diasporas and second-language communities, socioeconomically disadvantaged groups, people with disabilities, and migrants. Older populations, for instance, often have weaker digital skills and are more prone to sharing mis- and disinformation compared to younger cohorts of the population (Guess, Nagler and Tucker, 2019[24]). Efforts to reach these groups include projects devoted to the media literacy of retired people through seniors' centres, public libraries, and other community settings. For example, the Norwegian Media Authority worked with the non-governmental organisation Seniornet to develop educational resources for seniors, including printed booklets, presentations, and in-person meetings that build media and digital literacy within that population.

Other vulnerable groups that media and information literacy activities target include diasporas and second-language communities. To that end, the Baltic states have designed specific media literacy campaigns to reach Russian speakers, such as a project the Latvian government has carried out with the CSO Baltic Centre for Media Excellence. In addition to working through schools, therefore, governments should identify approaches to expand media and information literacy activities to particular groups of the population that traditional programmes might not otherwise reach (see Box 3.6 for examples from the United Kingdom).

Governments can also help prepare society to better understand disinformation flows and risks by "inoculating" the public against potential harms. These "pre-bunking" efforts seek to "warn people of the possibility of being exposed to manipulative disinformation, combined with training them on how to counter-argue if they do encounter it", with the idea that such activities will reduce their susceptibility to false and misleading content (Roozenbeek and van der Linden, 2021[26]; Van der Linden, 2023[27]). Pre-bunking and other pro-active communication efforts can focus on flagging disinformation actors and sources of inauthentic information, and on assessments and insight into tactics used to create and share misleading content (OECD, 2023[28]).

To this end, governments have created and distributed materials and organised internet campaigns that inform the public about the dangers of mis- and disinformation, name and shame malign actors, and share examples of how information attacks and false narratives can spread. Lithuania, Latvia, Estonia, Finland, Czechia, and others, notably through their intelligence agencies, have in recent years started to publicly disseminate analytical reports and threat assessments. These often devote considerable attention to the information environment, including malign actors, examples of relevant attacks and manipulations, and target audiences. Such reports provide the public with reliable information on the major threats (see Box 3.7).

Based on these assessments, governments have also organised special courses on national security and defence topics for representatives of civil society, media, academia, and business. The content of these courses includes information on threats, as well as opportunities to discuss the issues with government officials. Such efforts support societal resilience by raising participants' awareness of threats and preparing them for co-operation in the case of a crisis. Beyond raising awareness, such endeavours help participants serve as ambassadors, spreading understanding and skills to members of their respective organisations and the public.

Another practical example of a public and accessible pre-bunking tool is the GoViral! game, created through a collaboration between academic researchers, the UK Cabinet Office, the World Health Organisation and three private sector design agencies. The game exposes players to manipulation techniques and simulates real-world social media dynamics to share insights into how mis- and disinformation are spread (see Box 3.8). A strength of these pre-bunking efforts is that, while they inform the public of actual disinformation threats and techniques, they do not put governments in the position of discussing specific pieces of content or serving as an arbiter of truth.

Despite general agreement on the necessity and value of providing media and information literacy skills, several challenges exist. First, the effectiveness of media literacy activities is heavily dependent on the capacity of teachers and trainers, as well as the quality of available materials. One way to help ensure consistent implementation of MIL activities, therefore, is for countries to establish a system of teacher training. Notably, the French centre CLEMI trains around 17 000 teachers on media and information literacy each year (CLEMI, 2023[35]). The consistency of training through the school system may also be hindered in countries with less centralised education systems. Such systems may enable greater innovation and experimentation, though this can lead to variable quality across approaches.

Attention should also be given to the quality of partners conducting MIL activities that are funded in whole or in part by the state. Given the range of potential actors, quality control, monitoring, and cost / benefit assessments are essential, despite adding administrative costs. Particularly where partners are providing media literacy campaigns, governments should put in place effective mechanisms to ensure the content, methods, and quality of products fit general requirements and that the activities align with strategic goals.

Another challenge that all MIL efforts face relates to difficulties in assessing and measuring the impact of the activities. Formal measurement criteria usually involve obligations to report on outputs, such as a list of the events or other activities, the audience reached (for example, views on a website or social platform or the number of participants in events), hours spent in training, and mentions of the project in other media sources. Even where output measurements exist, such criteria often do not illustrate the actual impact of the project on its intended goals or broader changes over time in the capacity for critical and reflective information consumption. Without careful assessment, it is not clear how activities practically change participants' attitudes or whether the effect is long-lasting. This challenge is magnified in less formal settings, where participation is not mandatory and may be less consistent.

Such issues point to the need for a clear methodology for evaluating the effectiveness of MIL activities. The Council of Europe analysed 68 media literacy projects in 2021 and found that one-third of the projects did not include any measurement parameters (Council of Europe, 2016[36]). In the United Kingdom, the national online media literacy strategy explicitly stipulates the need for better measurement in this field. The analysis noted a "distinct lack of robust evaluation of media literacy provisions." Where evaluation measures exist, they are often very limited, using metrics such as reach, number of events, quotes from participants, or participant self-assessments, making it challenging to assess whether provisions are effective at improving media literacy capabilities on a long-term basis (UK Department for Digital, Culture, Media & Sport, 2021[37]).

Media literacy providers, furthermore, often do not have sufficient funding to be able to monitor and evaluate their initiatives. Relatedly, interventions also often operate on a short-term basis and do not facilitate working with the same beneficiaries over a long enough time frame to determine the effectiveness of the activities. Indeed, many aspects of media literacy that are rooted in behavioural change can be difficult or impossible to measure over the short term, for example assessing whether users are able to independently apply learnings in the 'real' online environment, rather than just under supervision (UK Department for Digital, Culture, Media & Sport, 2021[37]).

For its part, the Norwegian Media Authority conducts an assessment every two years on the state of media literacy in the country. The latest report was published in 2021 and is based on a representative opinion poll of 2 048 Norwegian residents. Among its findings are that the oldest (aged 60+) and youngest (aged 16–24) segments of the population find it most difficult to deal with disinformation, and that while 50% of the population reports that they check other sources they trust to verify information, 18% note that they do not verify information at all (Norwegian Media Authority, 2021[38]) (see Box 3.9 for additional examples of media literacy assessment tools).

The challenges related to the costs, processes, and independence of assessing media, information, and digital literacy initiatives point to the opportunity provided by working with external partners and experts who can offer independent perspectives. For example, the U.S. Department of State's Global Engagement Center (GEC) supported the development of two web browser-based media and information literacy games. The University of Cambridge Social Decision-Making Lab independently assessed the efficacy of both games, which has enabled the GEC to monitor the games' efficacy and continue to make changes (Box 3.10).

A focus moving forward will be on developing methods for measuring the impact of these initiatives as they relate to the public's ability to take part constructively in the information space. This will require monitoring changes in broad indicators over time, such as susceptibility to mis- and disinformation narratives and trust in governmental communications and institutions. While direct causality is difficult (or impossible) to identify, these could be seen as possible pieces of evidence of success. Such analysis would be particularly relevant for large-scale projects that reach a considerable part of a country's population. Indeed, greater emphasis on longitudinal impact evaluations would enable comparisons against baselines, highlighting changes over time in the capacity for individuals to critically and reflectively consume information.

Analysis could also be based on the monitoring of specific behaviour of the audiences targeted by a policy or project. For example, this could include analysis of online activity such as changes in patterns of sharing mis- and disinformation materials following MIL trainings. There are clear limitations for such activities, however, including the lack of transparency of social media platforms. Finally, measurement could include self-assessments of the target audience following interventions or activities, for example via questionnaires given to participants who took part in an MIL initiative.
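To illustrate the kind of evidence such questionnaire-based self-assessments can produce, the minimal sketch below uses hypothetical data (invented for illustration, not drawn from any programme cited in this chapter) to compare participants' scores before and after an MIL training, reporting the average change and the share of participants who improved.

```python
# Illustrative sketch only: hypothetical pre/post comparison of media literacy
# questionnaire scores for the same participants before and after an MIL training.
# All figures are invented for illustration.
from statistics import mean, stdev

pre_scores = [4, 5, 6, 5, 3, 7, 4, 6, 5, 4]    # scores (0-10) before the training
post_scores = [6, 5, 7, 7, 4, 8, 5, 7, 6, 6]   # scores for the same people afterwards

# Per-participant change, keeping the pairing between pre and post measurements
diffs = [post - pre for pre, post in zip(pre_scores, post_scores)]

print(f"Mean change: {mean(diffs):.2f} points")
print(f"Std. dev. of change: {stdev(diffs):.2f} points")
print(f"Share of participants who improved: {sum(d > 0 for d in diffs) / len(diffs):.0%}")
```

As the preceding paragraphs note, such short-term pre/post comparisons would need to be complemented by control groups and repeated follow-up measurements to capture longer-term behavioural change.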

A more immediate goal of whole-of-society efforts to strengthen societal resilience is to ensure individuals are informed and aware of false and misleading content. In democratic settings where government information is open to scrutiny by free and independent media, the public communication function can play a crucial role in fostering societal resilience to disinformation by serving as a source of timely and relevant information. This function should aim to be distinct from political communication, which is linked to elections or political parties, political debates or the promotion of the government of the day. A modern public communication function should be understood as the government function to deliver information, listen, and respond to citizens in the service of the common good (OECD, 2021[44]). To that end, government efforts to build awareness and help ensure the public has access to information include the following avenues:

  • In democratic environments where government information can be challenged by free and independent press, timely information provided by governments can build awareness of relevant challenges and threats

  • Engagement with external partners, with appropriate governance models and within free and democratic contexts, can help build societal resilience to the spread of disinformation.

Information does not spread in a vacuum – traditional media and fact-checkers, online platforms, civil society, and individuals themselves are essential actors in generating and amplifying content. At the same time, governments, often via the public communication function of the centre of government or of particular ministries, and alongside other actors that play a healthy checks-and-balances role, can help raise awareness of the spread of false and misleading content and serve as a source of accurate information. Even where facts are unclear or still being collected, as is often the case in crises, the public will demand updates; governments should consider how to anticipate and respond to individuals' needs honestly, transparently, and with the best information possible, while pre-empting the spread of rumours and falsehoods (OECD, 2023[28]). The public communication function therefore requires advanced and sophisticated governance to safeguard its focus on delivering for the public good, promote disclosure of sources, ensure a level of separation from political communications, and build its capacity and professionalism. The OECD has conducted a comparative analysis of good practices and drawn from these a set of Good Practice Principles for Public Communication Responses to Mis- and Disinformation (Box 3.11). In most OECD countries, this function remains undervalued and underutilised as a source of information, and is still transitioning away from a focus on political communication.

Similarly, the European Centre of Excellence for Countering Hybrid Threats has stressed a number of best practices in countering disinformation threats: rapidly refuting lies and debunking disinformation, working with civil society, ensuring that the relevant teams within governments are in place, undermining foreign malign actors through humour and accessible messages, and learning from and supporting partners. Government and civil society responses in Ukraine to Russian disinformation can also provide important lessons for effective strategic communication efforts moving forward (Kalenský and Osadchuk, 2024[45]).

Building capacity, establishing clear frameworks and institutional mechanisms, and formalising definitions, policies and approaches can help shift from ad-hoc and fragmented public communication approaches to counteracting mis- and disinformation, to more structured and strategic approaches (OECD, 2021[44]). Along those lines, for example, the UK Government Communication Service Propriety Guidance specifies that government communication should be: relevant to government responsibilities; objective and explanatory; not represented as party political; conducted in an economic and appropriate way; and able to justify the costs as an expenditure of public funds (Government of the UK, 2022[46]).

Public communication campaigns and government websites can debunk existing disinformation narratives. Delivering clear and tailored messages can help ensure communications reach all segments of society, including groups that are less likely to be exposed to or trust official sources. To that end, preparing and implementing strategic communication campaigns and ensuring accurate content reaches target audiences are essential in counteracting the spread of mis- and disinformation (OECD, 2023[28]). For instance, in New Zealand, the "Unstoppable Summer" campaign, which included television advertisements and a short musical video featuring the Director-General of Health shown before broad-audience events, is a good example of an effort to reach youth (Government of New Zealand, 2020[47]) (OECD, 2023[48]). Indeed, throughout the COVID-19 response, many countries developed processes that utilised credible messengers, such as members of a particular community, scientists and doctors, or influencers, to present relevant information in a timely, authoritative, and non-politicised way to help ensure it reached as wide a segment of the population as possible.

Given their sensitive role in creating and sharing content, as well as monitoring and responding to disinformation, governments should take extra precautions to ensure their communication activities do not lead to allegations or instances of politicisation and abuse of power. In the first instance, therefore, ensuring public communication strengthens information integrity depends on free information spaces and a strong and free media environment.

A lack of transparency around the activities of the public communication function can also undermine trust. Specifically, there is a risk that public communication initiatives designed to respond to disinformation can play into the arguments of actors who may accuse the government of playing "arbiter of truth" or even of adopting disinformation techniques themselves. As a reaction to changing information consumption patterns, for example, governments have collaborated with online influencers to conduct awareness-raising and other campaigns to reach segments of the population that official channels may not otherwise be well-suited to reach. While government engagement with influencers via both paid and earned support can help strengthen the inclusiveness and reach of messages, putting in place clear guidelines, transparent processes, and independent oversight of the public communication function will help provide the necessary governance mechanisms to build trust (OECD, forthcoming[49]). More broadly, promoting access to information and open government standards, including publicly accessible open data, can help lower barriers for journalists and citizens to access public information and officials.

Beyond the public communication function, how governments engage with online platforms, civil society, media, and academics needs to be carefully considered. On the one hand, facilitating open lines of communication between actors can be a fast and efficient way to identify threats and promote better functioning information spaces (see Box ‎3.12). It can also be important for government institutions to receive direct updates from online platforms about the spread of mis- and disinformation, such as concerted amplification operations by hostile actors or those that threaten elections and the safety of the public. Furthermore, much of the work to counteract disinformation threats remains sensitive due to national security considerations; providing too much insight into what is known about foreign information threats or efforts to counteract them also risks compromising their efficacy (OECD, forthcoming[49]).

On the other hand, government interactions with online platforms, media, and other non-governmental actors in fighting mis- and disinformation are particularly sensitive given the risk that engagement with these external partners may enable governments to encourage content moderation beyond the formal regulatory power they have and infringe on freedom of expression.

Similar considerations point to the challenges of working with external partners to identify and debunk specific pieces of content. Notably, fact-checkers can be accused of political bias, and there is a risk that if fact-checkers receive direct funding or other support from governments, they will be pressured or incentivised (or perceived as being pressured or incentivised) to protect the government or smear political opponents. Research has found correlations between fact-checkers' political affiliations and their priorities and findings (Louis-Sidois, 2022[50]). The risk of perceived (or actual) politicisation of fact-checkers can also be seen in findings from the United States demonstrating that Americans are split in their views of fact-checkers: half said fact-checking efforts by news outlets and other organisations tend to deal fairly with all sides, while about the same share (48%) said they tend to favour one side (Pew Research, 2019[51]).

In 2023, Faktograf, a Croatian fact-checking outlet, published the preliminary results of a survey of 41 leading European fact-checking organisations that illustrate the intensity of the polarised environment in which they work. The research found that 90% of the outlets reported having experienced some type of harassment. More than three-quarters – 36 out of 41 – of the fact-checking organisations surveyed have experienced harassment online, often facing verbal attacks. Furthermore, 70% of the respondents that experienced online harassment were subjected to campaigns that include prolonged or co-ordinated threatening behaviour, such as stalking, smear campaigns, "doxing", and technology-facilitated gender-based violence, including gendered disinformation. In addition, 78% of the organisations confirmed that elected officials had targeted them directly (Faktograf, 2023[52]). In politically polarised environments, government engagement with these actors may risk amplifying risks and fuelling accusations of censorship and partisanship, harming both government and non-government actors in the process.

Self-regulation mechanisms put in place by media, CSOs, and other non-governmental actors involved in fact-checking and other relevant activities can help mitigate these challenges. In this regard, the active participation of media professionals can help ensure that journalistic expertise and ethical standards inform other relevant actions to promote information integrity. For instance, the International Fact-Checking Network (IFCN) has developed a code of principles signed by more than 200 fact-checking organisations from around the world (IFCN, 2023[53]). Notably, IFCN signatory status may not be granted to organisations whose editorial work is controlled by the state, a political party or politician. It may, however, be granted to organisations that receive funding from state or political sources if the IFCN assessor determines there is clear and unambiguous separation of editorial control from state or political influence. Signatories also promise to be neutral and unbiased and commit to funding and organisational transparency. More detailed commitments are included in the "European Code of Standards for Independent Fact-Checking Organisations", approved by the European Fact-Checking Standards Network Project (supported by the European Commission) in August 2022. This Code places particular emphasis on political impartiality and the transparency of organisations' activities (EFCSN, 2022[54]).

Opportunities also exist for governments to be more transparent in their work with online platforms. For example, while decisions to take down content or add warning labels rest with the platforms themselves, governments may flag false or misleading content to platforms. In these cases, transparency around such discussions is critical and relevant disclosure mechanisms should be put in place (Full Fact, 2022[55]). Transparency around how and under what circumstances governments share information with online platforms can be an important way to strengthen public confidence that freedom of expression is upheld, while at the same time enable external scrutiny that such actions are necessary. In addition, governments could consider establishing independent oversight mechanisms to evaluate their actions in this space and ensure they do not limit freedom of expression (OECD, forthcoming[49]).

Building information integrity requires greater understanding of the specific problems that policy responses look to solve. As governments seek to strengthen their ability to counter threats posed by malign interference and disinformation, as well as reinforce the public’s ability to participate in well informed democratic debate more widely, they will need to build the understanding of what conditions within the information environment foster democracy and encourage active citizen participation (Wanless and Shapiro, 2022[4]). Working with the public and non-governmental partners to develop this understanding, build trust, and inform effective policymaking can ultimately serve as a catalyst for good governance and democracy.

Strengthening participation and engagement suggests the following entry points on which to build:

  • Participatory and deliberative democracy mechanisms can help establish policy priorities to strengthen information integrity.

  • Government-funded research on information integrity should be conducted with clear objectives and guardrails and inform the policymaking and implementation process.

Governments can also develop participation initiatives to facilitate engagement with the public, media professionals, platforms, academics and civil society organisations more widely on strengthening information integrity and countering mis- and disinformation. If structured well, such initiatives can help raise awareness and set a policy agenda that reflects public priorities while also building trust between individuals, media and decision makers. In a field such as information integrity, in which public scrutiny about government interference in the information space is, rightfully, important, and at a time of low trust in public institutions (OECD, 2022[56]), promoting civic education and involving citizens and various stakeholders in the design of these policies will be important.

Opportunities for citizens’ and stakeholders' participation and engagement are rooted in open and democratic governance and have multiplied significantly across OECD countries and beyond in the last decade. Indeed, the OECD Recommendation on Open Government notes that citizens should be provided “equal and fair opportunities to be informed and consulted and actively engaged in all phases of the policy-cycle,” and that “specific efforts should be dedicated to reaching out to the most relevant, vulnerable, underrepresented, or marginalised groups in society, while avoiding undue influence and policy capture (OECD, 2017[57]).” In this sense, the role of citizens refers to the public broadly, rather than the more restrictive sense of a legally recognised national of a state. Promoting the role of citizens and civil society means governments must create the conditions for the equitable, sustained, and substantive participation of civil society in policymaking (Forum on Information and Democracy, 2023[58]), and that countries should provide a level playing field by granting all stakeholders fair and equitable access to the development and implementation of public policies (OECD, 2010[59]).

Representative democracy, where citizen preferences are expressed through elected representatives, and direct democracy, where citizens vote on specific issues, are the most common avenues for participation. Beyond representation, promoting citizen participation should incorporate methods that provide the public with the time, information, and resources to discuss and deliberate, produce quality inputs, and develop individual or collective recommendations to support more open policy-making. For example, online calls for submissions, public consultations and roundtable discussions are all examples of participatory mechanisms. Furthermore, putting in place effective deliberative democracy mechanisms that bring together a representative group of people to discuss issues and feed a “representative” view into decision-making processes can lead to better policy outcomes, enable policy makers to make hard choices, and enhance trust between citizens and government (OECD, 2020[60]).3

To date, engagement initiatives on topics of information integrity have been relatively limited, likely reflecting the need to continue to build understanding around the trends, processes, and clarity of potential policy responses. Nevertheless, while often characterised as a technical matter, policy initiatives related to strengthening information integrity are largely understandable by, and of interest to, the public. Beyond academics and other stakeholders, such as media, CSOs, and the private sector, public consultations can help inform and support efforts to build information integrity.

In 2020, Ireland established the Future of Media Commission as an independent body to undertake a comprehensive and far-reaching examination of Ireland's broadcast, print and online media. Notably, one of the recommendations of the report prepared by the Commission was for the government to create a National Counter-Disinformation Strategy (see Box 3.13), illustrating how public engagement can direct government actions and interventions. A similar example can be found in France with the organisation of the General Assembly on Information (les États généraux de l'information), launched at the initiative of the President of the Republic in July 2023 with the aim of establishing a diagnosis of the key challenges related to the information space and proposing concrete actions that can be deployed at national, European, and international levels. The final output of this process, taking place between fall 2023 and summer 2024, will be a set of proposals to anticipate future developments in the information space. Five working groups will develop these proposals, integrating feedback from citizens' assemblies and debates organised in person in France as well as from an online consultation carried out by the French Economic, Social and Environmental Council (EESC).

In 2022, Spain created the "Forum against Disinformation Campaigns in the Field of National Security", a platform for public-private collaboration to promote debate and reflection on the risks posed by disinformation campaigns in the field of national security.

The complexity of policymaking around building information integrity and the need to respond to the challenges faced also point to the value of deliberative democracy initiatives as a promising tool. These refer to the “direct involvement of citizens in political decision making, beyond choosing representatives through elections”. Indeed, when conducted effectively, deliberative processes can lead to better policy outcomes, enable policymakers to make hard choices, and enhance trust between citizens and government (OECD, 2020[60]).

For example, the Canadian government worked with civil society organisations to organise three citizen assemblies on Democratic Expression, involving 90 Canadians who together contributed 6 000 volunteer hours to explore how the government should strengthen the information environment in which Canadians can freely express themselves. The Canadian Commission on Democratic Expression, in its report informed by the assemblies, recommended that the government establish an independent Digital Services Regulator to set standards for the safe operation of digital services and to require platforms to conduct regular risk assessments. The Commission also recommended that the government appoint a special envoy to liaise at an international level on issues related to disinformation and foster dialogue with social media platforms, foreign governments, and multilateral bodies; promote interdisciplinary research on how content spreads; and support media literacy efforts and invest in quality journalism at the national, regional and community levels (Citizens' Assembly on Democratic Expression, 2022[62]). In addition to their use in informing policymaking, deliberative processes also help counteract polarisation and disinformation, as research suggests that deliberation can be an effective way to overcome ethnic, religious, or ideological divisions between groups (OECD, 2020[60]).

The aim of research in this space should be to better understand the conditions within the information environment that can foster healthy democratic societies and encourage active citizen participation (Wanless and Shapiro, 2022[4]). OECD members have responded to information threats in part by funding research activities to analyse trends, including the susceptibility to mis- and disinformation by different sectors of the population, content consumption patterns, and the threats posed by foreign actors producing and intentionally spreading false and misleading information. Governments are also supporting research to develop methodologies to assess the efficiency of various policy measures such as awareness campaigns and regulatory interventions. For example, Luxembourg financially supports the University of Luxembourg in its activities regarding the conduct of surveys for the European Media Pluralism Monitor and the “Local Media Project for Democracy”, in full accordance with the principles of academic freedom and scientific independence.

Internal research conducted by or for the government can play an important role in supporting a better-informed policymaking process, particularly if it involves access to sensitive, private, or classified data. For example, the Government of Canada, in partnership with the OECD and the French Government, conducted an experiment to investigate Canadians’ intentions to share different types of content on social media to better understand vulnerable populations and to design innovative policy solutions to mitigate the spread of misinformation (see Box ‎3.14).

Though governments may not disseminate the results of such research publicly, it can serve an important role in building understanding of the information space. Co-operation with external researchers to provide public outputs, on the other hand, allows governments to receive diverse insights and advice. Continuing to develop partnerships that are transparent, well-resourced, and that serve clear objectives will be important moving forward.

For example, Canada's Digital Citizen Initiative focuses on helping Canadians understand online disinformation and its impact on Canadian society, and on building the evidence base to identify possible actions and future policymaking in this space (see Box 3.15 and (Government of Canada, 2023[64])). In the Netherlands, the Ministry of the Interior and Kingdom Relations is one of the partners collaborating in the AI, Media and Democracy Lab, an alliance between the University of Amsterdam, the Amsterdam University of Applied Sciences, and the Research Institute for Mathematics & Computer Science in the Netherlands that works with media companies and cultural institutions to increase knowledge related to the development and application of generative AI tools (in 2022, the project received EUR 2.1 million).

European Union institutions also illustrate whole-of-society models for long-term funding of research projects related to fighting disinformation, notably during the funding cycle of the Horizon 2020 programme (European Commission, 2023[65]). Indeed, the fight against mis- and disinformation is one of the main priorities of the current (2021-2027) funding round of the "Horizon Europe" programme. For example, the EUR 7 million vera.ai project (2022-2025) connects 14 partner organisations, including the European Broadcasting Union, Deutsche Welle, research institutes, universities, private companies, and the news agency AFP. Together, the consortium aims to develop AI solutions that can help to unmask and neutralise advanced disinformation techniques (VERA.AI, 2023[66]).

Another important, though less direct, approach to supporting research is illustrated by the EU's funding of the European Digital Media Observatory (EDMO), which connects civil sector organisations and academics for joint efforts to strengthen information integrity. The second phase of the project has funded the creation of national and multinational digital media research hubs across Europe with EUR 11 million through the Connecting Europe Facility. There are currently 14 regional EDMO hubs covering the 27 EU member states and Norway. One of the most important strands of EDMO's work is mapping, supporting, and co-ordinating research activities on disinformation at the European level, including the creation and regular updating of a global repository of peer-reviewed scientific articles on disinformation. Similarly, Canada has made a USD 4 million (CAD 5.5 million) investment to create the Canadian Digital Media Research Network (CDMRN), bringing together a range of Canadian research institutions to further strengthen Canadians' information resilience by researching how the quality of information, including disinformation narratives, affects Canadians' attitudes and behaviours and by supporting strategies for Canadians' digital literacy.

Moving forward, the role and impact of closed groups and messages shared on encrypted services such as WhatsApp will need to be better understood. These platforms provide users with valuable privacy and safety functions but can also be important channels for spreading mis- and disinformation, while their private and encrypted nature makes content spread on these channels difficult or impossible to analyse (OECD, 2022[67]). Another challenge faced in supporting research in this space is that research tools, such as specialised software or application programming interfaces (APIs) used to facilitate content and data sharing between applications, are often prohibitively expensive, particularly for smaller research groups with limited budgets. Access to data from social media platforms is also increasingly difficult to obtain.

In response to these challenges, the European Union Digital Services Act (DSA) partially addresses the issue of data availability for researchers (as discussed further in Chapter II). Specifically, Article 40 of the DSA stipulates that "providers of very large online platforms or of very large online search engines shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the (specified) requirements, for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union" (European Union, 2022[6]).

A fundamental issue regarding research in this space is that there is often a disconnect between the research being conducted and the ability of governments to use the evidence collected in policymaking and implementation. Researchers and governments have identified a shortage of efficient information exchange and co-operation formats between relevant actors at both the national and international level. To that end, the French government has supported the International Observatory on Information and Democracy, which is modelled on the Intergovernmental Panel on Climate Change (IPCC) and aims to aggregate and synthesise existing research to better understand the information and communication space (see Box 3.16).

Ultimately, for decision makers, it can be difficult to turn the results of academic studies into practical policies, suggesting that the feedback loop between researchers and governments can be improved to determine what conditions within the information environment are beneficial for democracy and help measure the success of policy interventions (Wanless and Shapiro, 2022[4]).

Strengthening participation by and engagement with the public, civil society, and media workers will be essential as countries look to strengthen information integrity, reinforce democracy, and build trust. A whole-of-society approach, grounded in the protection and promotion of civic space, democracy, and human rights, will be necessary given the fundamental role that individuals and non-governmental partners have in promoting healthy information and democratic spaces.

Notably, citizens and stakeholders often have relevant and needed experience, human capital, and qualifications that can provide complementary perspectives to governmental policymaking and help identify and respond to disinformation threats. Non-government actors may also have easier access to and greater experience working with groups that governments cannot reach as easily, for example migrants, diasporas, and other minority, marginalised, or socially excluded groups who may be particularly affected by targeted disinformation. To the extent that non-governmental actors are seen as more reliable sources of trustworthy information than governmental institutions, the public may also be more receptive to projects and other initiatives managed by civil society organisations.

Governments are advancing steadily in this area, increasingly putting in place frameworks for successful engagement and partnership with the public and non-government partners, recognising that groups have different needs. As governments develop multi-stakeholder approaches, they should be guided by the following questions:

  • How can participatory initiatives that engage citizens and non-government stakeholders be best designed and carried out to build understanding of the information space and develop effective policy responses?

  • What are the benefits and potential drawbacks of partnerships and collaboration with non-government partners, including the private sector? How can any drawbacks or risks – to government and non-government partners – be mitigated?

  • How can governments best decide which initiatives to strengthen information integrity should be carried out in partnership with CSOs, media, academia, and the private sector (not only online platforms), and where can – or should – governments act alone?

  • How can whole-of-society efforts designed to strengthen information integrity be measured to track their effectiveness and value?

To that end, governments should consider the following efforts to pursue a whole-of-society approach to strengthening societal resilience and citizen and stakeholder participation:

  • Enhance public understanding of – and skills to operate in – a free information space conducive to democratic engagement. Governments should ensure that civic, media, and digital information literacy education and initiatives form part of a broader effort to build societal resilience, and that the effectiveness of these initiatives is measured. Key pillars of governments’ toolbox should include promoting media and information literacy in school curricula from primary and secondary school through higher education, developing training programmes for teachers, conducting impact evaluations of media and information literacy programmes (including longitudinal studies), and supporting research to better understand which segments of the population are most vulnerable to disinformation so that media and information literacy programmes can be better targeted.

  • Implement information access laws and open government standards, including publicly accessible open data, to lower barriers for journalists and citizens to access public information and officials.

  • Build capacity and work with partners from across society (notably academics, CSOs, media, and online platforms) to monitor and evaluate changes to and policy impacts on the information space. Beyond output measurements, methods for understanding the impact of disinformation and counter-disinformation efforts should also include monitoring changes in broad indicators over time, such as behavioural indicators and susceptibility to mis- and disinformation narratives.

  • Provide clear and transparent guidelines and oversight mechanisms for government engagement with other actors, to ensure that when governments partner with, fund, or otherwise co-ordinate with or support the activities of non-government partners on issues related to information integrity, they cannot unduly influence the work of these actors or restrict freedom of expression. Unclear rules, exclusions, or decisions could create distrust in the process. Such guidelines and oversight mechanisms are particularly valuable in avoiding both actual and perceived politicisation of governments’ engagement with non-government actors.

  • Build the capacity of the still largely underdeveloped public communication function to play a constructive role in supplying timely information and raising awareness of threats, while developing more solid governance arrangements for its own functioning, insulated from politicised information. In the short term, the function can serve as an important source of information, including in times of crisis. Over the longer term, building the function’s capacity to equip citizens with the skills needed to better understand the information environment, for example through pre-bunking, can be an important tool for societal resilience.

  • Strengthen mechanisms to avoid real or perceived conflicts of interest with respect to the public communication function. Transparent, accountable, and professional management of the public communication function can help ensure it plays an important role in providing timely information that raises awareness of relevant challenges and threats, and in delivering proactive communication that builds societal resilience to the spread of disinformation.

  • Expand understanding of the information space by supporting research activities to better understand trends in information and content consumption patterns, the threats posed and tactics used by foreign actors spreading false and misleading information, and methodologies for assessing the impact of risk mitigation measures. Strengthen opportunities and mechanisms for research to inform the policy-making process.

  • Design and put in place effective participatory mechanisms with citizens, journalists, social media platforms, academics, and civil society organisations to help establish policy priorities and clarify needs and opportunities related to strengthening information integrity. Building more meaningful democratic engagement around policy design and implementation related to information integrity, including through deliberative citizens’ assemblies, will contribute to broader efforts to strengthen democratic resilience.

  • Make government collaboration on information integrity with non-government partners – including journalists, academia, the private sector, and other relevant non-governmental organisations – clearly identifiable. Engagement activities and outputs, including those related to funding, the goals of the co-operation, and impact on content decisions, should be clearly identifiable by the public. Similarly, the public should be able to identify whether a communication campaign, media literacy activity, or research product is financed or guided by government institutions.

  • Take steps to clarify funding sources to mitigate the risk of malign or interfering groups gaining access to data or being able to manipulate a country’s information space.

  • Mitigate the risks to government staff, academics, CSOs, private sector actors, and others engaged in information integrity initiatives when they become targets of disinformation campaigns, other threats, and harassment. When necessary, enable appropriate measures to protect the human rights of affected individuals.

References

[34] Basol, M. et al. (2021), “Towards psychological herd immunity: Cross-cultural evidence for two prebunking interventions against COVID-19 misinformation”, Big Data & Society, Vol. 8/1, https://doi.org/10.1177/20539517211013868.

[22] Be Media Smart (2023), “Be Media Smart website”, https://www.bemediasmart.ie/ (accessed on 15 February 2024).

[62] Citizens’ Assembly on Democratic Expression (2022), https://static1.squarespace.com/static/5f8ee1ed6216f64197dc541b/t/632c7bdbe8994a793e6256d8/1663859740695/CitizensAssemblyOnDemocraticExpression-PPF-SEP2022-FINAL-REPORT-EN-1.pdf.

[35] CLEMI (2023), Bilan de formation 2021-2022, https://www.clemi.fr/fr/bilans-de-formation.html.

[18] CLEMI (n.d.), CLEMI website, Centre pour l’éducation aux médias et à l’information, https://www.clemi.fr/fr/qui-sommes-nous.html (accessed on 15 February 2024).

[36] Council of Europe (2016), Mapping of media literacy practices and actions in EU-28, https://rm.coe.int/media-literacy-mapping-report-en-final-pdf/1680783500.

[54] EFCSN (2022), “The European Fact-Checking Standards Network Project”, European Fact-Checking Standards Network, https://eufactcheckingproject.com/.

[65] European Commission (2023), “Funded projects in the fight against disinformation”, https://commission.europa.eu/strategy-and-policy/coronavirus-response/fighting-disinformation/funded-projects-fight-against-disinformation_en.

[12] European Commission (2023), Guidelines pursuant to Article 33a(3) of the Audiovisual Media Services Directive on the scope of Member States’ reports concerning measures for the promotion and development of media literacy skills, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52023XC0223%2801%29.

[41] European Commission (n.d.), “DigComp”, https://joint-research-centre.ec.europa.eu/digcomp_en.

[6] European Union (2022), Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act), Publications Office of the European Union, https://eur-lex.europa.eu/legal-content/EN/TXT/?ur.

[7] European Union (2018), Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities, Publications Office of the European Union, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32018L1808.

[52] Faktograf (2023), Harassment of Fact-checking Media Outlets in Europe, https://faktograf.hr/site/wp-content/uploads/2023/03/preliminary-survey-report-final.pdf.

[58] Forum on Information and Democracy (2023), OECD Tackling disinformation: Strengthening democracy through information integrity conference.

[14] Forum on Information and Democracy (2023), Pluralism of news and information in curation and indexing algorithms, https://informationdemocracy.org/wp-content/uploads/2023/08/Report-on-Pluralism-Forum-on-ID.pdf.

[55] Full Fact (2022), Full Fact Report 2022: Tackling online misinformation in an open society - what law and regulation should do, https://fullfact.org/media/uploads/full-fact-report-2022.pdf.

[64] Government of Canada (2023), Digital Citizen Initiative, https://www.canada.ca/en/canadian-heritage/services/online-disinformation.html.

[16] Government of Finland (2019), Media Literacy in Finland: National Media Education Policy, Ministry of Education and Culture, https://medialukutaitosuomessa.fi/mediaeducationpolicy.pdf.

[61] Government of Ireland (2022), Report of the Future of Media Commission, https://www.gov.ie/pdf/?file=https://assets.gov.ie/229731/2f2be30d-d987-40cd-9cfe-aaa885104bc1.pdf#page=null.

[5] Government of Netherlands (2022), Government-wide strategy for effectively tackling disinformation, https://www.government.nl/documents/parliamentary-documents/2022/12/23/government-wide-strategy-for-effectively-tackling-disinformation.

[47] Government of New Zealand (2020), Make summer unstoppable by hitting COVID-19 for six, https://www.beehive.govt.nz/release/make-summer-unstoppable-hitting-covid-19-six.

[15] Government of Portugal (2017), Resolução do Conselho de Ministros n.º 142/2023.

[46] Government of the UK (2022), Government Communication Service Propriety Guidance, https://gcs.civilservice.gov.uk/publications/propriety-guidance/.

[25] Government of the UK (2022), “Help for vulnerable people to spot disinformation and boost online safety”, https://www.gov.uk/government/news/help-for-vulnerable-people-to-spot-disinformation-and-boost-online-safety.

[24] Guess, A., J. Nagler and J. Tucker (2019), “Less than you think: Prevalence and predictors of fake news dissemination on Facebook”, Science Advances, Vol. 5/1, https://doi.org/10.1126/sciadv.aau4586.

[11] Hill, J. (2022), “Policy responses to false and misleading digital content: A snapshot of children’s media literacy”, OECD Education Working Papers, No. 275, OECD Publishing, Paris, https://doi.org/10.1787/1104143e-en.

[53] IFCN (2023), “Commit to transparency — sign up for the International Fact-Checking Network’s code of principles”, International Fact-Checking Network, https://ifcncodeofprinciples.poynter.org/.

[45] Kalenský, J. and R. Osadchuk (2024), How Ukraine fights Russian disinformation: Beehive vs mammoth, https://www.hybridcoe.fi/wp-content/uploads/2024/01/20240124-Hybrid-CoE-Research-Report-11-How-UKR-fights-RUS-disinfo-WEB.pdf.

[30] Latvian State Security Service (n.d.), “Annual reports”, https://vdd.gov.lv/en/useful/annual-reports (accessed on 15 February 2024).

[50] Louis-Sidois, C. (2022), “Checking the French Fact-checkers”, SSRN Electronic Journal, https://doi.org/10.2139/ssrn.4030887.

[33] Maertens, R. et al. (2021), “Long-term effectiveness of inoculation against misinformation: Three longitudinal experiments”, Journal of Experimental Psychology: Applied, Vol. 27/1, pp. 1-16, https://doi.org/10.1037/xap0000315.

[21] Media Literacy Ireland (n.d.), “What is Media Literacy Ireland?”, https://www.medialiteracyireland.ie/ (accessed on 15 February 2024).

[13] Media Literacy Now (2022), Media Literacy Policy Report 2022, https://medialiteracynow.org/policyreport/.

[23] Media Literacy Week (2023), “Media Literacy Week celebrates diversity in creating and developing a better media environment for all”, https://mediataitoviikko.fi/in-english/.

[40] Morris, K. (2023), Ofcom’s Toolkit for Evaluating Media Literacy Interventions, Media & Learning Association, https://media-and-learning.eu/type/featured-articles/ofcoms-toolkit-for-evaluating-media-literacy-interventions/.

[43] Neylan, J. et al. (2023), “How to ‘inoculate’ against multimodal misinformation: A conceptual replication of Roozenbeek and van der Linden (2020)”, Scientific Reports, Vol. 13/1, https://doi.org/10.1038/s41598-023-43885-2.

[38] Norwegian Media Authority (2021), Critical Media Understanding in the Norwegian Population, https://www.medietilsynet.no/globalassets/publikasjoner/kritisk-medieforstaelse/211214-kmf_hovudrapport_med_engelsk_2021.pdf.

[20] Norwegian Media Authority (2021), Stop, think, check: How to expose fake news and misinformation, https://www.medietilsynet.no/english/stop-think-check-en/.

[48] OECD (2023), Drivers of Trust in Public Institutions in New Zealand, Building Trust in Public Institutions, OECD Publishing, Paris, https://doi.org/10.1787/948accf8-en.

[28] OECD (2023), “Good practice principles for public communication responses to mis- and disinformation”, OECD Public Governance Policy Papers, No. 30, OECD Publishing, Paris, https://doi.org/10.1787/6d141b44-en.

[1] OECD (2023), “What is resilience and how to operationalise it?”, OECD, Paris, https://www.oecd.org/dac/conflict-fragility-resilience/risk-resilience.

[67] OECD (2022), Building Trust and Reinforcing Democracy: Preparing the Ground for Government Action, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/76972a4a-en.

[56] OECD (2022), Building Trust to Reinforce Democracy: Main Findings from the 2021 OECD Survey on Drivers of Trust in Public Institutions, Building Trust in Public Institutions, OECD Publishing, Paris, https://doi.org/10.1787/b407f99c-en.

[63] OECD (2022), “Misinformation and disinformation: An international effort using behavioural science to tackle the spread of misinformation”, OECD Public Governance Policy Papers, No. 21, OECD Publishing, Paris, https://doi.org/10.1787/b7709d4f-en.

[68] OECD (2022), OECD Guidelines for Citizen Participation Processes, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/f765caf6-en.

[3] OECD (2022), The Protection and Promotion of Civic Space: Strengthening Alignment with International Standards and Guidance, OECD Publishing, Paris, https://doi.org/10.1787/d234e975-en.

[10] OECD (2022), Trends Shaping Education 2022, OECD Publishing, Paris, https://doi.org/10.1787/6ae8771a-en.

[2] OECD (2021), 21st-Century Readers: Developing Literacy Skills in a Digital World, OECD Publishing, Paris, https://doi.org/10.1787/a83d84cb-en.

[44] OECD (2021), OECD Report on Public Communication: The Global Context and the Way Forward, OECD Publishing, Paris, https://doi.org/10.1787/22f8031c-en.

[60] OECD (2020), Innovative Citizen Participation and New Democratic Institutions: Catching the Deliberative Wave, OECD Publishing, Paris, https://doi.org/10.1787/339306da-en.

[57] OECD (2017), “Recommendation of the Council on Open Government”, OECD Legal Instruments, OECD/LEGAL/0438, OECD, Paris, https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0438.

[59] OECD (2010), Recommendation of the Council on Principles for Transparency and Integrity in Lobbying, https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0379.

[49] OECD (forthcoming), Unlocking public communication’s potential for stronger democracy and increased trust.

[39] Ofcom (2023), A toolkit for evaluating media literacy interventions, https://www.ofcom.org.uk/research-and-data/media-literacy-research/approach/evaluate/toolkit.

[8] Ofcom (2023), Making Sense of Media, https://www.ofcom.org.uk/research-and-data/media-literacy-research.

[51] Pew Research Center (2019), Republicans far more likely than Democrats to say fact-checkers tend to favor one side, https://www.pewresearch.org/short-reads/2019/06/27/republicans-far-more-likely-than-democrats-to-say-fact-checkers-tend-to-favor-one-side/.

[17] Portuguese Regulatory Authority for the Media (2023), Media Literacy in Portugal: 1st Report under No. 2 of Article 33.A of the Audiovisual Media Services Directive, https://www.erc.pt/en/reports/media-literacy/1st-report-under-n-2-of-article-33-a-of-the-audiovisual-media-services-directive-eu-/.

[31] Republic of Lithuania (2022), National Threat Assessment 2022, State Security Department (VSD)/Defence Intelligence and Security Service under the Ministry of National Defence (AOTD), https://www.vsd.lt/wp-content/uploads/2022/04/ANGL-el-_.pdf.

[26] Roozenbeek, J. and S. van der Linden (2021), Don’t Just Debunk, Prebunk: Inoculate Yourself Against Digital Misinformation, https://www.spsp.org/news-center/blog/roozenbeek-van-der-linden-resisting-digital-misinformation.

[42] Roozenbeek, J. and S. van der Linden (2020), “Breaking Harmony Square: A game that ‘inoculates’ against political misinformation”, Harvard Kennedy School Misinformation Review, https://doi.org/10.37016/mr-2020-47.

[29] Supo (2022), “Supo Yearbook 2021: Finns must be prepared for influencing efforts from Russia during NATO debate”, SUPO Finnish Security and Intelligence Service, https://supo.fi/en/-/supo-yearbook-2021-finns-must-be-prepared-for-influencing-efforts-from-russia-during-nato-debate.

[32] Swedish Security Service (n.d.), “Sweden Security Police Yearbooks”, https://www.sakerhetspolisen.se/om-sakerhetspolisen/publikationer/sakerhetspolisens-arsberattelse.htm (accessed on 15 February 2024).

[19] The Dutch Media Literacy Network (n.d.), “About Dutch Media Literacy Network”, https://netwerkmediawijsheid.nl/over-ons/about-dutch-media-literacy-network/ (accessed on 15 February 2024).

[37] UK Department for Digital, Culture, Media & Sport (2021), Online media literacy strategy, https://www.gov.uk/government/publications/online-media-literacy-strategy.

[9] UNESCO (2023), Media and information literacy, United Nations Educational, Scientific and Cultural Organization, https://www.unesco.org/en/media-information-literacy.

[27] Van der Linden, S. (2023), Foolproof: Why we fall for Misinformation and How to Build Immunity, 4th Estate.

[66] VERA.AI (2023), Project Summary: Facts & Figures, https://www.veraai.eu/project-summary (accessed on 19 October 2023).

[4] Wanless, A. and J. Shapiro (2022), A CERN Model for Studying the Information Environment, https://carnegieendowment.org/2022/11/17/cern-model-for-studying-information-environment-pub-88408.

Notes

1. For more information, see: https://www.mk.gov.lv/lv/media/14255/download

2. For additional information, see: https://www.isdatechtzo.nl/

3. For additional information, see OECD (2022[68]), OECD Guidelines for Citizen Participation Processes.

Legal and rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

© OECD 2024

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at https://www.oecd.org/termsandconditions.