Two decades ago, Wikipedia came onto the scene as a novel online project that aimed to collect and document all human knowledge and history in real time. Skeptics worried that much of the site contained unreliable information, and they frequently pointed out errors.
But now the online encyclopedia is often cited as a place that overall helps fight the spread of false and misleading information elsewhere.
Last week, the Wikimedia Foundation, the group that oversees Wikipedia, announced that Maryana Iskander, a social entrepreneur in South Africa who has worked for years in nonprofits fighting youth unemployment and for women’s rights, will become its chief executive in January.
We spoke with her about her vision for the group and how the organization is working to prevent false and misleading information on its sites and around the web.
Give us a sense of your direction and vision for Wikimedia, especially in such a busy information landscape and polarized world.
There are a few basics of Wikimedia projects, including Wikipedia, that I think are important starting points. It is an online encyclopedia. It is not trying to be anything else. It certainly isn’t trying to be a traditional social media platform in any way. It has a structure run by volunteer editors. And as you may know, the foundation has no editorial control. It is very much a user-driven community, which we support and enable.
The lessons to be learned, not only from what we do but from how we continue to iterate and improve, begin with this idea of radical transparency. Everything on Wikipedia is cited. Changes are debated on our talk pages. So even when people hold different points of view, those debates are public and transparent, and in some cases they really allow for the right kind of back-and-forth. I think that is what such a polarized society needs: you have to make room for the back-and-forth. But how do you do it transparently, in a way that ultimately leads to a better product and better information?
And the last thing I will say is, you know, this is a community of extremely humble and honest people. As we look to the future, how do we build on those attributes in terms of what this platform can continue to offer society and how it provides free access to knowledge? How do we make sure we reach the full diversity of humanity in terms of who is invited to participate and who is written about? How do we really ensure that our collective efforts are more reflective of the southern hemisphere, more reflective of women and more reflective of the diversity of human knowledge, so that we better reflect reality?
What do you think of Wikipedia’s place in the widespread problem of online disinformation?
Many basic attributes of this platform are quite different from those of traditional social media platforms. Take the disinformation around Covid: the Wikimedia Foundation partnered with the World Health Organization. A group of volunteers organized around what is known as WikiProject Medicine, which focuses on medical content and on creating articles that are then very carefully monitored, because these are the kinds of topics where you want to stay ahead of disinformation.
Another example is the task force the foundation set up ahead of the U.S. election, again trying to be very proactive. [The task force supported 56,000 volunteer editors watching and monitoring key election pages.] The fact that there were only 33 reverts on the main U.S. election page shows what it means to stay very focused on key topics where disinformation poses real risks.
Another example that I find really compelling is a podcast called “The World According to Wikipedia.” In one of the episodes, a volunteer is interviewed, and she has made it her business to be one of the main watchers of the climate change pages.
We have technology that alerts these editors when changes are made to any of the pages they follow, so they can see what the changes are. If there is a risk of false information creeping in, a page can be temporarily locked. No one wants to do that unless it is absolutely necessary. The climate change example is useful because the talk pages behind those articles host massive debate. Our editor’s view is: “Let’s have the debate. But this is a page that I watch, and watch carefully.”
A big debate going on right now around social media platforms is the issue of censorship. Some people claim that biased opinions prevail on these platforms and that more conservative voices are suppressed. As you think about how to handle these debates once you’re at the helm of Wikipedia, how will you make judgment calls with that backdrop in mind?
For me, what is inspiring about this organization and these communities is that there are fundamental pillars that were established on the first day of Wikipedia’s creation. One of them is this idea of presenting information with a neutral point of view, and this neutrality requires understanding all sides and all perspectives.
It is what I said earlier: have the debates on the talk pages, off to the side, but then come to an informed, documented and verifiable conclusion in the articles. I think this is a basic principle that, again, could potentially offer something for others to learn from.
Coming from a progressive organization that fights for women’s rights, have you given much thought to disinformers weaponizing your past, claiming it could influence the calls you make about what is allowed on Wikipedia?
I would say two things. The really relevant aspects of the work I have done in the past are the volunteer-led movements, which is probably much harder than others might think, and the fact that I have played a very operational role in understanding how to build systems, cultures and processes that I believe will be relevant to an organization and a set of communities trying to increase their scale and reach.
The second thing I would say is, again, that I have been on my own learning journey, and I invite you to take a learning journey with me. The way I choose to be in the world is to interact with others with a presumption of good faith and to engage in respectful and civil ways. That doesn’t mean other people will do the same. But I think we have to hold on to that as an aspiration and as a way, you know, to be the change we want to see in the world.
When I was in college, I did a lot of my research on Wikipedia, and some of my professors would say, “You know, that’s not a legitimate source.” But I still used it all the time. I was wondering if you had any thoughts on that!
I think now most professors admit that they also sneak onto Wikipedia to look for things!
You know, this year we are celebrating Wikipedia’s 20th anniversary. On the one hand, there was this thing that I think people laughed at and said wasn’t going anywhere. And it has now, rightfully, become the most referenced source in all of human history. I can tell you just from my own conversations with academics that the narrative around Wikipedia’s sourcing and usage has changed.