Navigating disinformation: pressures and opportunities in PR
Public relations faces new ethical challenges as AI accelerates disinformation. How can practitioners counter falsehoods while safeguarding trust?
Disinformation has long been a weapon in war, in strained international relations, in political debate and in the interpretation of expert advice.
It has become a matter of special concern in recent years as fears of interference in domestic political arrangements have grown and international tensions have increased. It has also become easier to generate and disseminate disinformation using the new capabilities of artificial intelligence.
The idea of living in a disinformation space – interpreting the world and making decisions based on disinformation – has come to prominence in recent weeks in difficult diplomatic contacts.
This problem, though, has been recognised for several years.
An example emerged in a report prepared by the UK House of Commons Digital, Culture, Media and Sport Committee in 2019 (Disinformation and Fake News), which quoted Vladislav Surkov, a senior advisor to President Putin, writing in the Russian daily Nezavisimaya Gazeta on 11 February 2019.
He said that “foreign politicians blame Russia for meddling in elections and referenda all over the planet.
In fact, it’s even more serious than that: Russia is meddling in their brains, and they don’t know what to do with their changed consciousness.” The suggestion is that disinformation efforts are directed to changing the way people consciously see the world.
Public relations has, since its emergence as a practice, taken a strong interest in the way information is used and communicated. The use of information in practice very quickly raises ethical questions: How should information be used? In whose interests is information given or withheld? What obligations bear on practitioners in their use of information?
Use and misuse
Early discussion of these questions led to the development of codes of conduct for practice. These contained guidance – aspirations for practitioners to follow – on the way information should be used and passed through channels of communication.
As one example, the International Public Relations Association – through long internal debate – developed the Code of Venice in 1961. This, later revised in 2009 to allow for developments in digital communication, set out that members should “not engage in practice which tends to corrupt the integrity of any channel of communication.” Nor should they “intentionally disseminate false or misleading information.”
Public relations as a practice claims special expertise in content creation, storytelling and the development of narratives which might engage or persuade groups of people. In the past, the practice has been described as the engineering of consent or as perception management, where if perceptions change, behavioural change will follow.
It is quite clear that as a practice, public relations has important insights into the ways in which information can be used – or misused.
Public relations is also – and this is less often explored – a practice involved in the ways in which people come to see reality. A continuing theme in arguments about the aims of practice is that public relations must deal in the truth; practice veteran Tim Traverse-Healy, one of the founders of the then Institute of Public Relations in the UK, made it a central element of his credo for public relations.
Countering and protecting
But social truth is socially arrived at – through interaction between people as they are presented with, consider, talk about and weigh information to arrive at their understanding of reality. This is a process well discussed in a book by Peter Berger and Thomas Luckmann in 1966, The Social Construction of Reality.
Contributors to this process include providers of information – among them educators, community leaders, politicians, public relations practitioners and now influencers through social media.
The process will be complicated by difficulties with information, as more recent experience has shown. The UK Government’s Communication Service (GCS) has published and updated counter-disinformation toolkits which make useful distinctions between:
• Misinformation, which is verifiably false information shared without an intent to mislead;
• Disinformation, which is verifiably false information shared with an intent to deceive and mislead; and
• Malinformation, which deliberately misleads by twisting the meaning of truthful information.
The toolkit emphasises providing accurate and credible information to counter the effects of all three, and protecting against harm.
GCS’s counter-disinformation toolkits are invaluable guides to the action to be taken in dealing with disinformation. Combined with work done by the service to understand the psychology underlying belief systems and how they might be influenced, they provide clear guidance on mitigating the effects of disinformation.
Alex Aiken, former executive director of UK Government Communication and now communication advisor to the Ministry of Foreign Affairs with the Abu Dhabi Emirate, United Arab Emirates, writing in a post on LinkedIn in early March, described the toolkit as “a powerful tool to design a defence mechanism to disinformation, partly by identifying what is harmful disinformation that can damage reputation as opposed to harmless misinformed musings”.
Data pollution
The post cites an example of how a small investment in disinformation can be shown, through simulation of its effects, to cause huge financial losses. It describes how Fenimore Harper Communications simulated an AI-powered disinformation campaign with the goal of causing a bank run.
Marcus Beard, the consultancy's founder and a disinformation and AI expert, showed how a $150 spend on social media, the creation of doppelgänger websites, hostile memes, untruths infecting AI and the consequent sharing of this information could cause a $150m run on a financial institution by undermining confidence, causing panic and prompting people to switch their funds online. The consultancy saw this exercise as showing that influence operations exploit cognitive biases. It highlighted the 2023 collapse of Silicon Valley Bank, which it describes as the first “social media fuelled bank run”, losing $16bn in 10 days.
The post goes on to quote from a presentation at the World Governments Summit in February 2025 by Bob Pearson, chair of the Next Practices Group in Austin, Texas, who argued that “we have become too accepting of mis- and disinformation as a problem that exists with which we must live. That type of growing acceptance of a bad status quo is a signal that we must reframe the issue. Governments have done a good job of setting up environmental departments to reduce pollution of our air and waterways. How about our neurons? The term data pollution brings this to life. And what do we do about it? Well, we have a responsibility to improve the cognitive security of our citizens. We have a lot of work ahead of us.” Some of this work, he believes, will fall to communications leaders.
Vigilance and education
Alex Aiken draws out the conclusion that the antidote to disinformation is a mixture of government action, media vigilance, and public education.
The GCS toolkit sets out how practitioners can identify and act to counter disinformation. Vigilance and public education are part of this work, and government practitioners are well placed to attend to, and act to counter, disinformation.
Practitioners are positioned to anticipate and prepare to deal with disinformation. They sit at the boundaries between the organisations – the employers and clients they serve – and the groups with which those employers and clients interact as they pursue their interests.
The vantage point taken by practitioners provides oversight of organisations, clients, and the groups of importance to them. Part of the analysis they need to perform in their work is to understand how these groups see the world and their interests, and the information sources they draw on in doing this.
This analysis should bring to the surface the extent to which perceptions are, or are likely to be, shaped by misinformation, disinformation and malinformation. It also adds to the value of the advice practitioners provide in improving the quality of decision-making and mitigating risk.
It will also be important, soon and because of political developments at national and international levels, to point out as clearly as possible where disinformation can be anticipated, and to highlight its effects in the public domain. With its expertise in the use and presentation of information, public relations should be at the forefront of efforts to call out the misuse and misrepresentation of information.
- Learn more at the CIPR's upcoming Disinformation, misinformation and malinformation course online on 11 June.
Jon White is an independent consultant who specialises in management and organisation development, public relations, communications management and public affairs. Accredited by the CIPR, he is also a visiting professor at the University of Reading (Henley Business School) and honorary professor of journalism at Cardiff University.