‘No amount of money can change our mission’: Jimmy Wales, co-founder of Wikipedia

Jimmy Wales discusses Wikipedia's core values, the vital role of its volunteer community, and why the platform is not for sale

Jimmy Wales | Getty Images

Interview/ Jimmy Wales, co-founder of Wikipedia

Jimmy Wales could have been a billionaire. Few people in history have had so certain a shot at that status and decided against taking it. But Wales had another target in sight, and his aim has proved true. Wikipedia, which he co-founded 25 years ago, is one of the most visited websites in the world, with millions of people relying on it as a primary source of information. And he remains steadfast in its mission: to provide free, accurate knowledge to anyone, anywhere.

It has not been easy. And it has become tougher of late, thanks to misinformation campaigns, biased reporting and the rise of social media. At stake are not just facts but the very idea of trust. That might be why Wales explores the theme in his new book, The Seven Rules of Trust.

In an interview with THE WEEK, he discusses how Wikipedia ensures the health of its community, the tools needed to manage misinformation, and the vital role volunteers play in keeping Wikipedia neutral and reliable. He also reflects on the role of artificial intelligence, and how emerging technologies could be harnessed to support the community while maintaining the platform’s core values. Excerpts:

Q/ Many people trust Wikipedia, despite knowing that it can be edited by anyone. How do you explain this paradox?

I think the trust people have in Wikipedia comes from long-term experience of using it. Even though people may not fully understand how it works, they can see the results. You go to almost any topic in the world, no matter how obscure, and there is this great information available, with sources you can click through and check. One of the ways we have built trust, despite the obvious imperfections, is through Wikipedia’s transparency. You can see the sources and the material, read the debates, and review all past versions of the pages. But, of course, in order to continue earning that trust, we need to maintain our efforts and ensure that the quality of the content remains as high as possible.

Q/ One important theme in your new book is the idea of trust repair. What has been the most significant trust repair moment in Wikipedia’s history, and what did you learn from it?

I would say that early on, we had a serious error in a biography of a well-known journalist. He contacted me, and I had it fixed within 10 minutes. However, he then went on to write a scathing editorial in USA Today, which led to a lot of bad press. It was not the kind of attention we wanted. What we did in response was to implement our BLP (Biographies of Living Persons) policy, which states that any negative statement in the biography of a living person must be backed by a source. If there is no source, it should be removed. That was a significant change that came about because we knew we needed to be trustworthy.

Q/ The volunteer-driven model of Wikipedia is still somewhat unique. It is a great strength, but it is also a big challenge. What reforms and innovations do you think are needed to keep this community healthy and robust?

Thinking about community health, as we call it, is at the heart of everything we do. We hold local meetups and events worldwide to get to know people. We have our annual conferences where top volunteers discuss best practices and the core values of Wikipedia. We also have local chapters around the world, working on partnerships with galleries, libraries, archives, and museums. All of this is really important because Wikipedia thrives when we have a community committed to its core values: quality, neutrality, and integrity.

Of course, we must also ensure that volunteers have the necessary tools to block trolls and keep the environment under control so that it is not overrun by misinformation. So, it is not really about reform but about continuing to be vigilant and doing what needs to be done.

Miles to go: Jimmy Wales at Wikimania, the Wikimedia movement’s annual conference, held at Harvard Law School in 2006 | Getty Images

Q/ Major artificial intelligence models train on Wikipedia. How would you describe your relationship with them?

It is an interesting situation. All major AI models are trained on Wikipedia, and one of the issues we face is that they are constantly crawling our servers. Ideally, we would like them to use our enterprise product and pay a fee, which would ease the load on our servers and also benefit them. That is something we are working on. Google is a great customer, but we would like others to become customers, too.

One of the interesting things about large language models is that when people first encounter them, they find them amazing. But the more you use them, the more you realise the hallucination problem is quite significant. This is especially true for obscure topics, and Wikipedia is full of such topics. So, the technology is not yet capable of writing Wikipedia articles to our standards.

That said, we do see potential for AI to assist the community, perhaps in tasks like vetting, reviewing, or making suggestions. In those contexts, it does not need to be perfect—it just needs to be useful. So, there are definitely some exciting opportunities.

Q/ Misinformation is a big challenge now. In such a polarised information environment, how challenging is it to maintain neutrality and reliability?

It has always been a challenge. You have to be thoughtful and mindful of it. But, as it turns out, there is still a lot of really great journalism out there, and scientific research is moving forward as strongly as ever. I think a lot of the noise we hear comes from social media. The rest of the world is not as polluted as some might think. Sure, we have seen a rise in partisan media, which is problematic, but even then, you can work with that. You can always make sure you have seen both sides of the story and approach things with a balanced view. It remains a challenge, and you have to take it seriously.

Man on a mission: Jimmy Wales (right) receiving the Quadriga Award 2008 in the “Mission of Enlightenment” category from author David Weinberger at the Komische Oper opera house in Berlin | Getty Images

Q/ Does Wikipedia have systemic biases? Your co-founder has said it has a left-liberal bias.

It is hard to say. Certainly, as human beings living in a particular era, I am sure there are areas where we are getting things wrong. Whenever we hear criticisms of bias, we need to take them seriously. We should ask ourselves: What are we missing? What should we add? What is going on? Because if we just dismiss concerns, that is not the right attitude. We need to be open and encourage people to help us improve.

In many areas of Wikipedia, there is a lot of work to be done—some of it around bias, and some around simply completing the work and adding more information. But the idea that Wikipedia has been taken over by “woke radicals” is just not true. That does not describe what is going on at all.

That said, it is true that Wikipedia often reflects mainstream media, and mainstream media does have biases. We all need to grapple with that and ask: What voices in society are we not hearing? What ideas have been neglected or unfairly treated? Wikipedia is part of that larger discourse, and I think that discourse is healthy for society.

Q/ How sustainable is the current funding model for Wikipedia in the long term?

We think it is sustainable. We have been doing this for almost 25 years, and we are coming up to the 25th anniversary of Wikipedia. The vast majority of our funding comes from small donors, which is something we are very proud of. This means we are not dependent on government funding or a few billionaires who might introduce bias or their own agendas. As long as people continue to love Wikipedia and are willing to support us, we will be fine.

We run the Wikimedia Foundation in a financially conservative manner, always trying to be careful with money and build up reserves each year. One of the chapters in the book discusses the virtue of independence, which is crucial to us. Having a stable financial footing means we can resist external pressures. For example, when a journalist from The New York Post tweeted at Elon Musk, suggesting that he buy Wikipedia, I just tweeted back: “Not for sale.” No amount of money can change our mission.

Q/ In a developing country like India, Wikipedia plays an important role in knowledge access. How do you see that role?

This is really important to us. We are excited about growing the community in India and having more editors there. It is vital for us to ensure that people have access to free knowledge, especially in developing countries where information can be more difficult to access.

Q/ If you were to start Wikipedia from scratch now, in this era of social media and AI, what would you do differently?

I don’t think I would do anything differently. I am actually quite happy with how Wikipedia has turned out. The model works. While social media is interesting, it does not really affect Wikipedia directly. AI might provide us with some interesting tools, but it does not change the values or goals of Wikipedia. So I would not change much.

Q/ You emphasise transparency as a building block of trust. What does transparency look like for a large online platform? Where do you think today’s tech giants fall short?

One of the problems with existing online platforms, especially social media, is that moderation decisions are often not transparent. It is difficult to understand why certain content is removed or why specific accounts are banned. This lack of transparency fuels concerns about bias and discrimination against certain groups or ideas.

I would like to see more transparency in how these decisions are made. A radical approach could be to give more control to the community rather than the company. One thing I like about Twitter these days is the community notes feature, where people can fact-check and link sources. However, I would like to see more transparency about how it works behind the scenes and how they reach their conclusions. Allowing users to rate and fact-check content is a good thing.

Q/ People are increasingly relying on algorithms they don’t understand at all, including chatbots. How do you see these rules applying to AI systems?

I think they do apply very well. AI companies need to take trust seriously if they want long-term success.

THE SEVEN RULES OF TRUST: WHY IT IS TODAY’S MOST ESSENTIAL SUPERPOWER

By Jimmy Wales

Published by Bloomsbury

Price: Rs 699

Pages: 352
