Using Wikipedia to Combat Misinformation


Identifying issues, preventing crises, and managing crisis communication have become core components of most public relations practitioners’ work. Even new PR professionals are being asked to play both defense and offense in the game of crisis comms.

Organizations are brought to life by people, and organizations are sustained or eliminated by people. 

Purposefully or not, people cause errors and issues. That’s a reality both inside and outside of organizations.

Stakeholders, publics, and unintended audiences (human or AI) can negatively impact even the most beloved organization’s reputation by spreading misinformation.

With everyone (human or bot) able to create and broadcast “news” at any time from anywhere, the internet has made crisis comms work seem like a never-ending game of Whac-A-Mole.

We’re not sure what problem will pop up next, but we’re continuously scanning the environment, knowing it’s just a matter of time before one does. 

Culturally, society has been dealing with all kinds of unprecedented problems since the start of this decade. Against that backdrop, and with people intentionally spreading malinformation, the PRSA Code of Ethics obligation to advocate for and share factual information seems a mission-driven imperative.

I interviewed Josh Greene, CEO of The Mather Group, LLC, an agency that helps large organizations solve their reputation issues with a specific focus on Wikipedia and its role in search and AI presence. We met at the Counselors Academy spring conference, and I asked Josh what we need to know to reduce reputational risk and promote factual information online.

Wikipedia’s popularity appeared to grow, then its usage declined, and it now seems to have settled into a stable position with users. How should we currently view its role in our PR ecosystem?

Wikipedia articles get millions of views and, along with Wikidata, consistently feed answers to search engines and AI platforms. In the PR ecosystem, it’s a foundational piece for improving visibility, discoverability, and accuracy. Journalists often use Wikipedia as a starting point for research.

What kind of work do you focus on with Wikipedia?  

We focus on improving and maintaining article accuracy — whether that’s keeping pages up to date, fixing errors, protecting neutrality, or a mix. These pages have an incredible influence across the digital world. By focusing on accuracy, we can provide the best service to our clients and improve the encyclopedia as a whole. 

How can PR pros “manage” Wikipedia for clients?  

To help your clients manage Wikipedia, the best thing to do is work on getting notable, significant coverage that can be used to source the brand stories on Wikipedia. We understand that this is easier said than done, but Wikipedia is a live document that relies entirely on third-party sourcing. So your role is critical to your client’s success there. 

Artificial intelligence is the pebble in our PR shoes at times. What should we know about AI in regard to research and sources?  

Most people are using AI to ask questions in order to receive information. AI will echo everything said about a brand or person across the web. This means that small, obscure sites can have a meaningful impact on a reputation in AI. Consistency across every platform, release, and media mention is going to make all the difference.

How can we disrupt any misinformation with more trusted sources or content? 

Address the issue in as many ways as possible. Add an AI-friendly FAQ to your page that provides accurate information, give Google feedback on inaccurate search results, and contact the source behind the misinformation, asking them to update whatever copy is spreading the problem. 
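One widely used way to publish an “AI-friendly FAQ” is schema.org’s FAQPage structured data, embedded in a page as JSON-LD so search engines and crawlers can parse the questions and answers directly. The sketch below is illustrative only: the helper name `build_faq_jsonld` and the sample question are placeholders, not part of any client’s real markup.

```python
import json

def build_faq_jsonld(qa_pairs):
    """Build a schema.org FAQPage JSON-LD object from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }

# Placeholder content; swap in the accurate facts you want AI and search to echo.
faq = build_faq_jsonld([
    ("When was the company founded?", "The company was founded in 2005."),
])
print(json.dumps(faq, indent=2))
```

The resulting JSON would go inside a `<script type="application/ld+json">` tag on the FAQ page, alongside the human-readable copy it mirrors.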

AI and search engines really like crowd-sourced domains (think Wikipedia and Reddit), so clarity in these sources is important. It’s also more straightforward to update Wikipedia than to get a Redditor to change their post.
