The Wider Context of Conspiracy Belief, Misinformation and Disinformation in Schools

The Commission into Countering Online Conspiracy in Schools was created by Pears Foundation and Public First, with fourteen expert commissioners spanning the fields of education, civil society, academia and the media. The Commission’s work was supported by an advisory board made up of civil society leaders drawn from Pears Foundation’s grantees, with expertise from a range of perspectives including education, youth services and mental health.

The Commission’s purpose is to offer crucial and timely insight into the increasingly complex, contested and critical issues of conspiracy theories, misinformation and disinformation and their impact on children and in schools today. Online conspiracy taps into broad and evolving challenges that are changing the nature of society and of adolescence, but we wanted to produce something of practical use to schools and the staff within them who are struggling with the impact these phenomena are having in the classroom right now.

This report covers the Commission’s findings and recommendations from our original research, but our hope is that it will form the basis of an ongoing project as our understanding of the issue, and of effective interventions, develops.

The Commission’s work is underpinned by four foundational ideas:

  1. There is a need to act now to better understand how conspiracy theories, misinformation and disinformation have been manifesting online.
  2. There is particular need to focus on the impact on young people in schools.
  3. We should speak to the three groups of people with a real stake in the debate, namely young people, school staff and parents.
  4. The research should be grounded in the existing evidence to ensure that the subsequent findings contribute to moving forward the debate and possible solutions.

Why is this a focus for research now?

Young people are growing up in a world where information is increasingly disputed, as are the overarching narratives that information sits within. The current geopolitical situation provides ongoing fuel for highly contested, often binary debates which people want to make sense of through compelling accounts, explanations that speak to the reality of their daily lives, and suggestions for future action that allow them to believe they have control over their own lives. We are living through significant and frequently polarised political upheaval in the UK, the US and Europe, which has only compounded these desires. This is occurring against a backdrop of low and falling trust in democratic institutions and politics more broadly.

Misinformation and disinformation have been a concern in many nations, including the UK, for some time. Since 2016, there has been a growing acknowledgement of the challenges that online content poses to a democratic populace. Politics is, by its very nature, an area of contestation, but few people want the perpetuation of falsehoods, or wish to see elections settled by visceral reactions to mistruths.

Sometimes the issues surrounding conspiracy belief are presented as relevant only to a minority of people and associated with extreme behaviour. While this may be one of the results of engaging with conspiracy theories, what is evident is how widely conspiracy content, as well as misinformation and disinformation, is spread online in many different and constantly evolving formats. This is not a marginal issue, but one that has an impact on ordinary people as they go about their daily lives. It is therefore a highly relevant research area, the ripple effects of which are experienced in our society every day.

Why focus on conspiracy theories, misinformation and disinformation?

This research considers belief in conspiracy theories to be the ‘canary in the coalmine’ regarding a wide range of worrying and harmful behaviours. While some of our findings suggest that young people do not consider conspiracy theories to be a significant concern to them, that is not to say that conspiracy theories do not have wide-ranging and perverse effects which must be tackled.

We see conspiracy belief as a visible and measurable way of understanding a plethora of other issues and their interdependencies, whether those are behaviour, mental health, a sense of belonging in schools and communities, or an individual child’s vulnerability.

Moreover, this is an issue that could not have higher stakes for wider society. A rising tide of misinformation and disinformation is leading us towards an epistemic crisis that puts at risk the values that allow open, democratic societies to function and flourish. 

Why focus on schools?

Belief in conspiracy theories, misinformation and disinformation is in no way specific to young people, and there is ongoing research into the ways it manifests in many different demographics across the adult population. We draw upon this wider research in order to contextualise our own efforts and to understand how young people may be interacting with adults – parents, relatives, school staff – who may hold some of these beliefs.

Although the scope for research, understanding and action on the topic of conspiracy theories, misinformation and disinformation is almost boundless, this Commission has been firmly rooted in schools, the young people who attend them, and the parents and school staff who support them. This is for five key reasons:

  1. Growing and changing access to online material is having a profound impact on young people. According to Ofcom, some 91% of children have a smartphone by age 11, and social media platforms and online information sources are evolving more rapidly than ever.1
  2. There has been very little research focused on young people in this space. The existing evidence is set out in our literature review, and this Commission is underpinned by this knowledge.
  3. Schools are at the centre of this issue. School staff are the professionals with the greatest insight as to how conspiracy belief manifests in young people’s day-to-day lives. Teachers in numerous schools were already highlighting the challenges they face in addressing conspiracy belief with young people.
  4. Linked to this, as an almost universal service for young people, schools are the obvious (although not only) site of intervention when we consider how best to support, educate and protect young people from online harm.
  5. Policymaking has always struggled to adequately address the challenges young people are experiencing online and, therefore, safeguard them at a systematic level. More understanding of how policymaking can support schools and young people in this space is needed.

Definitions

Throughout the primary research and this report, we have used the following working definitions of each of the three key technical terms that we are investigating as part of this research.

  • A conspiracy theory is a proposed explanation of historical, current, or speculative events in terms of the significant causal agency of a relatively small group of persons – the conspirators – acting in secret.
  • Misinformation is incorrect, misleading, or false information that stems from error or misunderstanding.
  • Disinformation is misinformation that has been spread deliberately.

The changing political context

The fieldwork for this research spanned a period of significant political change across the UK. Labour’s victory in the 2024 General Election and the subsequent transition of power from Rishi Sunak’s Conservative government to Keir Starmer’s Labour government have brought significant change in many policy areas, including those relevant to this Commission.

The 2023 Online Safety Act is intended to offer increased protections to children and adults online and has the ambition to make the UK ‘the safest place in the world to be online.’2 As it stands, there are critiques of the Act from various stakeholders: of particular relevance are concerns about its reliance on secondary legislation and non-statutory guidance, and its inability to address disinformation and misinformation online.3

Labour has made a commitment to strengthening online safety in a digital age and to building on the 2023 legislation. There will likely continue to be significant debate around the Online Safety Act in the near future, especially given the impact of rapidly spreading misinformation on recent public disorder.

The existing policy landscape

The Department for Education currently provides non-statutory guidance on teaching online safety in schools, which includes reference to ‘disinformation, misinformation, and hoaxes’, although not directly to conspiracy theories.4 The government also signposts and hosts Educate Against Hate resources, which aim to provide a platform to support teachers and leaders in addressing radicalisation and extremism, both of which intersect with issues including conspiracy belief, misinformation and disinformation.5 Throughout this research, school staff described this guidance as an inadequate resource given the current and changing needs of schools in this space.

Access to the online world and social media is also a focus of current policymaking. The impact of smartphones on classroom learning, school behaviour, and general wellbeing of young people has become a policy concern, with newly elected Labour MP Josh MacAlister proposing regulation of smartphones in a Private Member’s Bill and the writings of psychologist Jonathan Haidt influencing policymakers on both sides of the Atlantic.6

However, it is important to note that this is not a report about smartphones, nor about social media regulations or the actions of social media platforms, although those are important policy debates that have relevance to this work. These debates are already contested by highly engaged and informed actors, and a great deal of research is being conducted in these spaces. This is outside the scope of the Commission, which instead explores the impact downstream, throwing new light on what is really happening with misinformation, disinformation and conspiracy theories in England’s schools on a daily basis.

Trust in politics and democracy

This research took place in a period of low trust in both politics and institutions. The 2024 British Social Attitudes survey found record levels of disaffection across the population: 45% of respondents now say they ‘almost never’ trust governments of any party to place the needs of the nation above the interests of their own political party, while 58% say they ‘almost never’ trust ‘politicians of any party in Britain to tell the truth when they are in a tight corner’.7 Both of these are record highs. Current democratic engagement is also poor, as demonstrated by the 59.7% turnout for the July 2024 General Election, the lowest since 2001.8 This low and decreasing trust in government and wider institutions creates a breeding ground for conspiracy theorists.

Further research by Public First, conducted among the population at large, shows that this high level of ‘anti-politics’ cannot simply be thought of as a single trait.9 Those who distrust politicians are not necessarily the same people who see no value in voting, or the same people who feel that the current political parties do not represent them. This core trait of distrust in politicians and political institutions, which we term ‘cynicism’, is widespread in the UK. For example, 72% of the public agreed with the statement “most political discussions are just talk with no real action”.

This is crucial context for our subsequent findings. Although this research has focused on the experiences of young people, this is not to suggest that young people have a unique problem with belief in conspiracy theories, information sourcing and levels of institutional trust, but rather that they are growing up as part of a society that is clearly struggling with this issue.

The existing literature on conspiracy theories

Throughout its work, the Commission has sought to remain grounded in the existing literature. A full literature review is included in Appendix 2, but a short overview of the most relevant material is set out below.

Defining Conspiracy Theories

Conspiracy theories are generally defined as explanations that attribute significant events or circumstances to the covert actions of a small group of people, often with malicious intent. This definition includes historical, current, and speculative events, allowing for a wide scope. Conspiracy theories differ from misinformation (inaccurate information spread unintentionally) and disinformation (deliberate falsehoods spread to mislead) by organising information to create narratives that emphasise coordinated, secretive, and malicious efforts by powerful groups. This explains why conspiracy theories are resistant to being disproven in the mainstream, as adherents view traditional knowledge sources as further evidence of elite collusion. These narratives are usually monocausal, providing a way of seeing and understanding the world in which the agency of a hidden agent is stressed.

How Conspiracy Theories Spread

Conspiracy theories spread through a combination of susceptibility, transmission mechanisms, and echo chambers:

  • Susceptibility: Individuals who feel disempowered or marginalised are more inclined to believe in conspiracy theories, especially during crises like the Covid-19 pandemic. Some literature suggests that lower education levels (which correlate with lower socioeconomic status) and holding beliefs opposed to those of the current government are predictors of belief.
  • Mechanisms: Conspiracy theories predate the internet but have gained prominence in the popular imagination partly due to social media, which facilitates rapid dissemination and reinforces beliefs through algorithmic recommendation. Such platforms can amplify harmful conspiracy content by exposing susceptible users to polarising material.
  • Echo chambers: A small number of highly active users and influencers drive conspiracy content on social media. ‘Filter bubbles’ and ‘echo chambers’ foster engagement with conspiracies by repeatedly exposing users to similar content.

Prominent Conspiracy Theories in the UK

A key challenge in addressing conspiracy beliefs is their constantly evolving nature. Some prevalent theories in the UK currently include beliefs that Covid-19 was a government plot, the “great replacement theory” relating to immigration patterns and birth rates, and digital currency as a means of population control. Gendered disinformation has also risen in prominence, with figures like Andrew Tate gaining particular attention among disillusioned young men. Surveys suggest a societal perception that conspiracy theories are growing in prominence, though the actual increase in belief is harder to quantify. Vulnerable groups, such as less well-educated demographics (closely linked to lower-income groups), are more likely to find such theories plausible. The Antisemitism Policy Trust, Tell Mama, Full Fact, the Institute for Strategic Dialogue, and others have highlighted a number of these conspiracies in their guide for parliamentarians.10

Impact on Adolescents and Schools

Adolescents may be particularly susceptible to conspiracy theories due to high social media usage, developmental factors, and social isolation. Adolescence is a period during which critical thinking skills are still developing, and many young people turn to social media as their worldviews begin to take shape. However, research specifically exploring adolescent belief in conspiracy theories is limited. Teachers report that conspiracy theories are disruptive in school environments and that they feel uniquely unprepared to handle this form of disruption.

Addressing Conspiracy Theories in Educational Contexts

Schools face challenges in managing conspiracy theories and disinformation. Teachers, pastoral staff, and support staff encounter students who engage with such narratives, yet lack the necessary resources or training to counteract them effectively. Suggested approaches include ‘cognitive inoculation’ (warning of impending misinformation followed by anticipatory rebuttal), fact-checking, and developing critical media literacy in students, but these strategies can prove difficult to scale and implement reliably in a classroom setting. Additionally, strategies that extend beyond the classroom to address the social and psychological factors influencing adolescent belief in conspiracy theories are considered vital for a more comprehensive response.

In summary, the rise of conspiracy theories, especially among vulnerable populations, highlights the need for targeted interventions across educational, social, and policy contexts to mitigate harmful effects and promote critical engagement with information, and the broader narratives within which this information sits.

Appendix 1

Methodology

The Commission used both qualitative and quantitative research with students, school staff and parents to find out how each group understands the problem of conspiracy theories, misinformation and disinformation, and how they think it should be tackled. Underpinning this research was a detailed literature review and a series of expert Evidence Sessions to ensure that the Commission built on the existing knowledge and expertise across the sector.

The work of the Commission was delivered across four phases from March to November 2024:

Figure 1 – Four phases of the Commission

Phase 1 – Existing Literature and Expert Evidence:

The aim of Phase 1 was to ensure the Commission built on and harnessed the existing evidence on conspiracy theories, misinformation, disinformation and how the challenges of each of these manifest in schools.

This literature review (available in full in Appendix 2) has underpinned the subsequent research of the Commission and this report. In addition, the Commission heard directly from experts in three oral Evidence Sessions and a series of one-to-one interviews. These experts included:

Table 1 – experts who provided evidence to the Commission.

Phases 2, 3 and 4 took the same methodological approach, with three different groups: young people, school staff and parents.

Phase 2 – Young People

The young people who participated in this research were between the ages of 11 and 18 and in secondary or post-16 education. Although we know that conspiracy belief, misinformation and disinformation do not manifest solely in young people aged over 11, we chose to focus on this age group for two key reasons. First, the transition to secondary school is often associated with additional freedom and autonomy for young people, which may make them more vulnerable to online harm. Second, unlike very young children, young people over the age of 11 can consent (with parental approval) to being involved in polling and focus groups. Given the contentious nature of the topic, ensuring that young people could provide informed consent was important for the ethical running of this research.

We ran an anonymous, online survey targeting 2,349 young people in full-time education, aged 11 to 18, from 5-13 May 2024.

We ran a series of ten focus groups with young people in different schools across England between May and November 2024. Each group was held in person at the students’ school during the school day, facilitated by two trained Public First researchers.

All focus groups were undertaken in a semi-structured format, with key discussion questions and flexibility for the discussion to be led by young people. Student participants were identified through their schools, with a lead teacher supporting Public First facilitators to arrange the groups.

The groups were designed to target two different age groups of pupils: Year 9 pupils (KS3) and Year 12 pupils (KS5). Two of the groups were single-sex groups.

Table 2 – Pupil focus group demographics.

Phase 3 – School staff

Throughout the research carried out for this Commission, we have referred to ‘school staff’ rather than ‘teachers’. This was a deliberate choice to cast the net more widely across schools to include all staff who work in schools, rather than just teaching staff. Although some of the challenges identified in this research doubtless manifest specifically within a classroom setting and are dealt with by teachers, conspiracy belief, misinformation and disinformation do not respect the walls of the classroom, and we felt it important to include all staff working in schools. This included (but was not limited to) teaching assistants, catering staff, school site and maintenance teams, IT support staff, cleaning staff and cover supervisors. Anyone who worked regularly in a school was identified as a potential research participant. Where ‘teachers’ were identified as a particular subgroup, they have been described as such.

We ran an anonymous, online survey targeting 448 school staff at UK secondary schools and colleges in the week commencing 24 June 2024.

We ran six focus groups with school staff. School staff focus groups were independently recruited for a mixture of roles, both teaching and non-teaching. They were also recruited to include mixed gender, ethnicity and geographic demographics. Participants were from state schools only.

All the school staff focus groups were run online, facilitated by a Public First researcher. They were undertaken in a semi-structured format, with key discussion questions and flexibility in how the discussion developed. These focus groups ran between 20 June and 1 August 2024.

Table 3 – school staff focus group demographics.

In addition to our independently recruited groups, we undertook an additional focus group with teachers and support staff of pupils with Special Educational Needs and Disabilities (henceforth SEND) which was recruited through Public First’s existing networks. This was a group of seven participants, which included the Safeguarding Lead, Head of Sixth Form and Headteacher, as well as classroom teachers and support staff from across the secondary section of the school. The school takes pupils from 2 to 19 years with severe, profound and multiple learning disabilities, including autism. The focus group was undertaken online and used the same semi-structured format as the rest of the groups.

Phase 4 – Parents

The final phase of this research focused on parents of young people in full-time education aged 11-18.

We ran an anonymous, online survey which encompassed 2,009 parents of young people in full-time education aged 11 to 18 from 10-19 July 2024.

We ran six focus groups with parents of young people. These groups were independently recruited to include a mix of ages and genders, within specific geographic bounds. All parents had to have at least one secondary-aged child. We excluded parents who worked in the following fields to avoid bias: technology, digital media, social media, education. We recruited specifically for social class in these groups, in order to achieve a broad sample, reflective of wider society.

All the parent focus groups were run online and facilitated by a Public First researcher. They were undertaken in a semi-structured format, with key discussion questions and flexibility in how the discussion developed. These focus groups ran in the week commencing 29 July 2024.

Table 4 – parent focus group demographics.

Throughout this research, we refer to different socioeconomic groups, as defined by the National Readership Survey’s Social Grade system.11 Social Grade is a classification system based on the occupation of the main income earner in the household.

The classifications are:

  • A – Higher managerial, administrative or professional occupations
  • B – Intermediate managerial, administrative or professional occupations
  • C1 – Supervisory, clerical and junior managerial, administrative or professional occupations
  • C2 – Skilled manual workers
  • D – Semi-skilled and unskilled manual workers
  • E – State pensioners, casual or lowest-grade workers, and those dependent on state benefits only

The percentage of the population in each grade is as defined in the National Readership Survey, January – December 2016.

Polling:

Public First is a member of the British Polling Council, and company partners of the Market Research Society. Public First adheres to the professional standards set out by these bodies, including our duty of transparency. Full polling tables for all three polls undertaken in this research are available on Public First’s website.12

As with all opinion polls, there is a margin of error in the answers, and the margin of error is greater when sample sizes are smaller (when there are cross-breaks of specific groups of people). For pupils and parents, the margin of error is +/-3%. For school staff, it is +/-4%. All polling numbers in this report should be read on this basis.
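For readers interested in the arithmetic behind these figures, the margin of error for a simple random sample is conventionally approximated, at the 95% confidence level, by the formula below. This is an illustrative sketch only: the exact figures quoted above will also reflect each poll’s weighting and design.

\[ \text{Margin of error} \approx 1.96\sqrt{\frac{p\,(1-p)}{n}} \]

Here n is the sample size and p is the observed proportion; the margin is widest when p = 0.5 and narrows as the sample grows, which is why the smaller school staff sample (448 respondents) carries a wider margin than the pupil and parent samples (2,349 and 2,009 respondents respectively).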

Safeguarding and Ethical Research:

Inherent within the Commission were significant ethical considerations that had to be navigated carefully in order to ensure that the research did not cause harm to any individual or group.

Public First has broad experience of undertaking research with vulnerable groups and on politically sensitive and challenging issues, including with young people, individuals with special educational needs and minority groups.

The Commission, and its research partner Public First, are committed to the very highest standards of ethical conduct in our research, and we adhere to the professional standards set out by the Market Research Society, of which we are members.13 Specific attention was paid to ensuring that informed consent was given by all participants prior to taking part in the research, particularly the young people who took part, and their parents. Additional safeguarding measures were put in place with the schools who supported this project, and a named contact was provided for young people after each focus group so that any concerns could be addressed.

The identifying features of all participants have been anonymised to protect their privacy. This report uses so-called ‘thick description’ – detailed observations of characteristics and context – to narrate and interpret what has been observed and discussed within a broader context, and provides analysis based upon the voices of participants. Their words remain unchanged.

Public First’s goal with this research was to explore how conspiracy theories are understood. We did not conduct this research in order to find out how widely held beliefs are, or to test how believable different theories would be. Instead, our questions and research focused on perceptions of conspiracy theories, their downstream effects, and how educators can best respond. At all stages of the research, when specific examples of conspiracy theories were discussed, the research team made it explicitly clear that these were widely discredited or had no factual basis.

Appendix 2

Literature Review

This research focuses on conspiracy theories, disinformation and misinformation in schools. In order to ground the research in the existing evidence, this literature review sought to examine four key questions:

  1. How do we define conspiracy theories, and what are the distinctions between misinformation, disinformation, conspiracy thinking and conspiracy theories?
  2. How do conspiracy theories get transmitted?
  3. What are the biggest conspiracy theories in the UK today and is conspiracy belief increasing?
  4. What do conspiracy theories and disinformation look like in schools?

In doing so, this review aimed to ensure that the subsequent research undertaken by the Commission maximised its contribution to current knowledge and provided useful outputs for sector stakeholders, particularly schools and policymakers.

1. How do we define conspiracy theories, and what are the distinctions between misinformation, disinformation, conspiracy thinking and conspiracy theories?

The distinctions between the terms outlined below are often ambiguous and subject to ongoing academic debate. Some of the tensions between the definitions are highlighted and interrogated in subsequent sections of this literature review.

A conspiracy theory is ‘a proposed explanation of some historical event (or events) in terms of the significant causal agency of a relatively small group of persons – the conspirators – acting in secret’ (Keeley 1999, 135). The specification that conspiracy theories address ‘historical events’ does not imply that they address past events alone: indeed, many of the most pernicious and detailed conspiracy theories, such as QAnon, attract large followings based on the outlandish predictions that they make about the future (Barkun 2018), as well as commentating on current affairs.

Under this definition, conspiracy theories have a number of qualities that make them extremely challenging for policymakers to address:

  • First, they are predicated on the existence of power imbalances within society and exist as a form of dissent (J. E. Uscinski 2017, 2). Attempts to quash them thus often amplify adherents’ beliefs, particularly given that (as we will see below) believers in conspiracy theories often belong to marginalised groups.
  • Second, and springing from this, they resist rejection by traditional sources and hierarchies of knowledge (Kreko 2020, 245-246). Believers are inclined to see widely-accepted sources of authority as just another carefully-coordinated establishment effort to conceal the truth.
  • Finally, there is an implication that conspiracy theories see elites as self-interested agents coordinating to maintain their dominance. This is significant because it adds a layer of rationality to believers’ convictions: it is not unreasonable for members of disadvantaged groups to believe that elite groups are exploiting them to further their own ends (Douglas, et al. 2019, 9).

Conspiracy theories are also closely related to misinformation and disinformation. Misinformation is incorrect information that stems from error or misunderstanding (Linkov, Roslycky and Trump 2019, 189), such as a recent story that proliferated online of an earthquake being detected in the UAE that turned out to be vibrations from a construction site (Arab News 2020). Disinformation is the deliberate spreading of inaccurate, frequently malicious information designed to mislead those who encounter it (United Nations 2021). The term ‘fake news’ often covers both misinformation and disinformation, but in a US-UK context, politicians’ typical use of the term implies a definition closer to disinformation than misinformation (Dickson 2023). In sum:

  • A conspiracy theory is ‘a proposed explanation of some historical event (or events) in terms of the significant causal agency of a relatively small group of persons – the conspirators – acting in secret’.
  • Misinformation is incorrect information that stems from error or misunderstanding.
  • Misinformation can become disinformation when spread deliberately.
  • Not all disinformation is misinformation, as some disinformation is deliberately created or planted.
  • Both misinformation and disinformation can become part of conspiracy theories, but only if they are combined with a wider narrative involving deliberate, causative and coordinated intervention by a secretive powerful group.

The related terms ‘conspiracy mindset’ and ‘conspiracy thinking’ are also highly relevant. These refer to a loosely-defined tendency for someone to believe in one or more conspiracy theories (Douglas, et al. 2019). As we will see below, a conspiracy mindset is often evident in demographics that feel disempowered or disenfranchised, including minoritised groups and those with lower socioeconomic status and education levels (Imhoff, et al. 2022).

Whilst clarity of definition is important, the aim of this research is to provide advice and guidance to policymakers, school staff and campaigners in their efforts to combat the harmful effects of deliberately misleading narratives, whether those narratives are closer in nature to conspiracy theories or to disinformation.

This is particularly important because these terms are often subject to ‘definitional slippage’, particularly in lay usage, and we believe that when stakeholders refer to the dangers of conspiracy theories, they refer to a set of dangers that also accrue to disinformation. Although the delineation between misinformation and conspiracy theories has greater clarity, this research will also consider misinformation where relevant, either contextually or due to definitional slippage amongst stakeholders.

2. How do conspiracy theories get transmitted?

We can divide conspiracy theory transmission into three factors: susceptibility, mechanism and agency.

Susceptibility

Susceptibility to conspiracy theories is generally a product of disempowerment. The focus of conspiracy theories on the alleged coordination of secretive and powerful groups means that feelings of powerlessness or vulnerability predict conspiracy belief (van Prooijen and Douglas 2018). This can be seen in relation to high-leverage crisis events: experiences such as the Covid-19 pandemic, the Russia-Ukraine war and 9/11 have all prompted conspiracy theorising (Farinelli 2021). The conspiracy theories associated with such crises can reflect disempowered groups’ desire to see ordered, clear causality within chaotic and disruptive moments.

There is some evidence that belief in conspiracy theories in the UK is correlated with lower income and education levels, a finding partly corroborated by survey findings overseas relating to socioeconomic class (Strong 2021), and widespread agreement that conspiracy beliefs are associated with feelings of ‘existential anxiety’ (Mao, Yang and Guo 2020) and being in political opposition (Imhoff, et al. 2022). In this sense, dismissing conspiracy belief as ‘irrational’ overlooks the psychological benefits that individuals receive from holding these positions (van Prooijen and Douglas, Belief in conspiracy theories: Basic principles of an emerging research domain 2018). This is particularly important given that teenagers experience a high degree of existential anxiety online (Robson 2023). These risk factors can help us understand the extent of conspiracy recognition and belief among teenagers in school settings.

Mechanism

Conspiracy theories have a long history which predates the digital age. Perhaps the oldest widespread example in England is the medieval blood libel slur, which claimed that Jews murder Christian children and use their blood in Passover baking (United States Holocaust Memorial Museum n.d.). Indeed, research has suggested that conspiracy theories have emerged at many points throughout history at times of crisis as people try to make sense of turbulent periods, such as how conspiracy theories around JFK’s assassination spread within society amid the Vietnam War and widespread social upheaval (van Prooijen and Douglas, Conspiracy theories as part of history: the role of societal crisis situations 2017). In recent years, however, research has highlighted the role of social media and the internet more generally in the transmission of conspiracy theories and disinformation. A UN study highlighted the presence of Holocaust-related conspiracy theories and disinformation on different online platforms (UNESCO 2022). A study by KCL found that a large majority of UK adults believed that social media has contributed to a rise in the spread of conspiracy theories, and that those who got their news primarily from social media were more likely to believe that the cost-of-living crisis was a government plot and to have encountered and/or believe a conspiracy theory about ‘fifteen-minute cities’ (Duffy and Dacombe 2023).

It is possible to track the extent of and connections between different conspiracy theories and aspects of disinformation across different social media platforms: ISD’s ‘Tangled Web’ project found that different social media platforms had different characteristics when it came to the proliferation of conspiracy theories, but that ‘Covid conspiracy theorist’ accounts formed the second-highest proportion of accounts that met their threshold and definition of activities adjacent to harmful conspiracies (Comerford, et al. 2023). Furthermore, this work also established that there are linguistic and network links between some prominent disinformation and conspiracy communities on social media, such as the ties between anti-LGBTQ activists, far-right influencers and anti-immigration nationalists, with Covid conspiracy theorists less closely tied to this set of agents.

Agency

It is important to note that small numbers of highly active social media users can exert a strong influence on a wider network of otherwise passive users. The ‘Tangled Web’ project found that the number of accounts meeting the threshold and definition of harmful conspiracy activity was below 100 across each of their categories for dangerous online content, and a separate study by the Centre for Countering Digital Hate found that the majority of harmful anti-vax content emanated from a ‘Disinformation Dozen’ of harmful accounts. Platform design features also shape how this content reaches wider audiences. Chief among these are ‘filter bubbles’, which are algorithms that push users towards content based on personal characteristics, and ‘echo chambers’, groups of peers all expressing similar opinions that tend towards the partisan or polarised (Cinelli, et al. 2022).

In all, then, the features and procedures of social media have the power to accelerate vulnerable individuals down paths that expose them to disinformation and/or conspiracy theories, but quantifying the extent to which social media actually causes an increase in belief in disinformation or conspiracy theories is challenging. One alternative perspective is that social media creates rapid cycles of peaking and troughing engagement with conspiracy theories, as some research has found to be the case with Covid-19 conspiracies (Erokhin, Yosipof and Komendantova 2022).

Finally, it is worth highlighting that tech is often framed solely as a problem in this field, but there are suggestions it could act as a solution too: LLM-based AI chatbots may be able to reduce people’s beliefs in conspiracy theories by providing brief, reasoned responses to conspiratorial assertions (Stokel-Walker 2024).

3. What are the biggest conspiracy theories in the UK today?

It is difficult to be certain of the extent of conspiracy theory belief in the UK, partly for methodological reasons. Two problems stand out. First, surveys establishing conspiracy theory beliefs frequently beg the question of the ‘biggest’ conspiracy theories: by asking, for example, ‘do you believe Covid-19 was a government conspiracy’, they give named conspiracy theories more authority than they might otherwise have. Second, quantifying belief based on social media patterns is very difficult. Research often focuses on who is followed and what content they are producing, but as we have seen, many content producers peddle disinformation and conspiracy theories across a range of topics. It is not easy to gauge the extent to which those who follow conspiracy and disinformation content producers believe or engage with the full range of these producers’ output. Despite this, some patterns emerge regarding the most prominent conspiracy theories at work in the UK today.

Work by KCL and Savanta in 2023 found widespread agreement with the truth of several conspiracy theories. These included: central banks using digital currencies to control populations’ spending habits; Covid-19 being a global conspiracy to force worldwide vaccination; the ‘great replacement theory’, namely that White Americans and Europeans are being replaced by non-White immigrants; the cost-of-living crisis as a plot to control the public; ‘fifteen-minute cities’, where all services are within a 15-minute walk of where people live, being an attempt by governments to restrict people’s personal freedom and keep them under surveillance; and the ‘Great Reset’ announced by the World Economic Forum during the pandemic being used to impose totalitarian world government (Duffy and Dacombe 2023). Although the survey design risks overstating the level of belief that exists across the population, the study does speak to several conspiracy theories having a notable place in the public imagination. It also echoes earlier findings from Ipsos MORI, who in 2021 found a high level of awareness of several conspiracy theories, albeit with a wide spread of plausibility levels (Strong 2021). This study also found that lower-income households and those with fewer qualifications were slightly more likely to consider conspiracy theories plausible, reinforcing the idea that conspiracy theories and disinformation prey on the vulnerable.

While these two studies suggest that conspiracy theories occupy an increasingly large part of modern discourse, evidence implying a large-scale rise in conspiracy theory belief in the UK is scant at best. One study of belief in conspiracy theories in the US found no systematic increase in conspiracism (Uscinski, et al. 2022); a second study, of Twitter usage during the pandemic, found a short-lived peak in conspiracy-related tweeting as Covid cases peaked, in line with the idea that high-leverage events can prompt conspiracy thinking (Batzdorfer, et al. 2022); and a third cautions against interpreting the rise in social media usage as a causal factor in burgeoning conspiracy theory belief (Enders, et al. 2023).

To some extent, the actual level of belief in conspiracy theories and disinformation is less pertinent than perceptions of rising belief. The Savanta study found that a majority of people in the UK believe that conspiracy theories are on the rise (Duffy and Dacombe 2023). It may well be that this rise in perceived belief is sufficient for such theories to gain visibility and prominence, and for their harmful effects to spread. This is in part due to the relationship, described above, between well-articulated conspiracy theories and less coherent disinformation. Less far-reaching disinformation may seem more ‘reasonable’ in comparison to conspiracy theories, thereby gaining legitimacy.

In a UK context, particular attention has been paid to gendered disinformation, such as the narratives propounded by Andrew Tate (Weale 2023). A UN arm’s-length body has identified that communities of often self-defined involuntarily celibate men (incels) are particularly susceptible to disinformation and conspiracy theories because of their feelings of powerlessness and their disenfranchisement from mainstream politics (Van de Weert, et al. 2021). Initiatives such as Everyone’s Invited have drawn attention to the challenges of addressing sexual violence and so-called ‘rape culture’ in UK schools more generally; the spread of gendered disinformation through online channels has been identified as a compounding factor in this wider problem (Weale 2023). A recent small-scale study concluded that relationships and sex education (RSE) in the UK does not adequately equip young people with the media and digital literacy necessary to evaluate the disinformation that figures such as Tate propound (Haslop, et al. 2024).

It is clear, then, that the range of conspiracy theories and disinformation at work in the UK today is broad, interconnected and flexible, and that we have only a loose grasp on changing rates of belief in the general population. There is a particular dearth of literature on the extent of conspiracy belief among adolescents in the UK. What does seem clear, however, is that exposure to disinformation and conspiracy theories is widespread, and this poses a particular risk to some of the most vulnerable groups in British society.

4. What do conspiracy theories and disinformation look like in schools?

There is a particular dearth of literature on the extent of conspiracy belief among adolescents in the UK. However, teenagers have several characteristics that may make them more vulnerable to conspiracy theories and disinformation. First, young people use the internet (especially social media) to a greater extent than the population at large, and this usage may contribute heavily to their emerging worldviews (Jolley, et al. 2021). Second, adolescence is a key period of neurodevelopment, and young people who are still developing a full range of cognitive function for analysis and critical thinking may be more susceptible to conspiracy theories and disinformation (Byrne, et al. 2024). Finally, there may be a social factor behind adolescent belief in and attraction to conspiracy theories. It is possible that socially isolated teenagers may give voice to conspiracy theories in order to seek attention or status that otherwise eludes them (Dyrendal and Jolley 2020). One study suggests that conspiracy thinking is heightened in response to these factors, which create a ‘perfect storm’ as students join Year 10 (age 14) (Jolley, et al. 2021).

That being said, in contrast to studies of conspiracy theories in the adult population at large, the evidence base on conspiracy and disinformation belief among adolescents is very thin. A systematic literature review found that only six studies provided strong findings regarding conspiracy beliefs in adolescents, with a total of 1,897 participants across these studies spread across a large age range (Byrne, et al. 2024). This lack of evidence is in part a reflection of limited appetite for, and means of measuring, the extent of conspiracy belief among adolescents. Although one study devised an Adolescent Conspiracy Beliefs Questionnaire (ACBQ) to provide a means of standardised and systematic measurement (Jolley, et al. 2021), the definitional problems of conspiracy theories and the ethical risks associated with such a scale mean it has not had widespread uptake. It is, therefore, potentially more productive to think of conspiracy theories as a population-wide challenge in which young people are inevitably implicated and particularly at risk of exposure (Duffy and Dacombe 2023), rather than as a challenge specific to young people.

In school settings, teachers find conspiracy theories and disinformation to be disruptive and disturbing challenges that they are not well equipped to deal with. A UCL survey found that conspiracy theories were the extremist idea that teachers encounter most commonly, but that teachers do not feel confident tackling conspiracy theories compared to other forms of extremism (Taylor, et al. 2021), a feeling other studies echoed even while acknowledging the small-scale and often superficial level of interest shown by school pupils in conspiracy theories (Dyrendal and Jolley 2020).  

In order to address the threat that disinformation and conspiracy theories pose in school settings, some experts have advocated ‘prebunking’, a form of cognitive ‘inoculation’ that involves warning of impending misinformation, anticipatory rebuttal, and only then the sharing of the misinformation (Dyrendal and Jolley 2020). Others have endorsed the principle of allowing conspiracy theories and disinformation as little space and time as possible in school settings: although some teachers express a preference for allowing extreme views to be exposed by airing them out during classroom debate (Taylor, et al. 2021), this strategy risks legitimising and spreading conspiracy theories and disinformation that put young people at risk (Taylor, et al. 2021). This corroborates some of the difficulties that research has uncovered with media literacy training and fact-checking as strategies for countering disinformation. While both have shown promise in their ability to correct false beliefs, variations in approach and inconsistencies in implementation make it hard to scale media literacy programmes and fact-checking initiatives successfully (Bateman and Jackson 2024). Finally, it is important to note that this is a problem that extends beyond the classroom to the full range of school settings. Pastoral staff as well as curriculum-facing school staff come into contact with young people sharing conspiracy theories and disinformation (Taylor, et al. 2021), so it is important to consider strategies beyond teaching and learning approaches that can support dealing with conspiracy theories and disinformation in schools.

  1. See Children and parents: media use and attitudes report 2022, by Ofcom, 30th March 2022.
  2. Online Harms White Paper: Full Government Response to the consultation - CP 354
  3. The Online Safety Act and Misinformation: What you need to know – Full Fact; Is the Online Safety Act "fit for purpose"? | Media@LSE
  4. Department for Education (2023) ‘Teaching online safety in schools’. Available: https://www.gov.uk/government/publications/teaching-online-safety-in-schools/teaching-online-safety-in-schools
  5. Educate Against Hate - Prevent Radicalisation & Extremism
  6. Jonathan Haidt Wants You to Take Away Your Kid’s Phone, by David Remnick for the New Yorker, April 20th 2024 [accessed 21st November 2024]. For details of Josh MacAlister MP’s bill, see Protection of Children (Digital Safety and Data Protection) Bill - Parliamentary Bills - UK Parliament [accessed 18th December 2024]
  7. National Centre for Social Research, Trust and confidence in Britain's system of government at record low, 12th June 2024. Available: Trust and confidence in Britain’s system of government at record low | National Centre for Social Research (natcen.ac.uk)
  8. Georgina Sturge for House of Commons Library, 2024 General Election: Turnout, 5th September 2024. Available: 2024 general election: Turnout (parliament.uk)
  9. M. Arena. (2024). ‘Understanding anti-politics in the UK’ Public First. Available: https://pfdatablog.com/blog/understanding-anti-politics-in-the-uk
  10. See ‘Conspiracy Theories: A Guide for MPs and Candidates’ by CST, Full Fact, ISD, Tell Mama, Global Network on Extremism and Technology, and Arieh Kovler, available at: https://antisemitism.org.uk/wp-content/uploads/2024/05/Conspiracy-Theory-Guide.pdf
  11. National Readership Survey (2024). Accessed: https://nrs.co.uk/nrs-print/lifestyle-and-classification-data/social-grade/
  12. Public First (2024). Available: www.publicfirst.co.uk
  13. Market Research Society (2024). Available: https://www.mrs.org.uk/standards/ethics/