Tracking Viral Misinformation
Two decades ago, Wikipedia came on the scene as a whimsical online project aimed at collecting and documenting all human knowledge and history in real time. Skeptics feared that much of the site would contain unreliable information and often pointed out errors.
But the online encyclopedia is now often cited as a place that, on the whole, helps combat the spread of false and misleading information elsewhere on the internet.
Last week the Wikimedia Foundation, the group that oversees Wikipedia, announced that Maryana Iskander, a social entrepreneur in South Africa who has spent years working in nonprofits focused on youth unemployment and women’s rights, will become its executive director in January.
We spoke to her about her vision for the group and how the organization works to prevent false and misleading information on its websites and on the internet.
Give us a sense of your direction and vision for Wikimedia, especially in such a fraught information landscape and in this polarized world.
There are some core principles of Wikimedia projects, including Wikipedia, that I think are important starting points. It is an online encyclopedia. It doesn’t try to be anything else. It certainly doesn’t try to be a traditional social media platform in any way. It has a structure run by volunteer editors. And as you may know, the Foundation has no editorial control. This is a user-led community that we support and enable.
The lessons we can learn, not just in what we do but also in how we continue to develop and improve, begin with this idea of radical transparency. Everything on Wikipedia is cited. It is debated on our talk pages. So even when people have different points of view, those debates are public and transparent, and in some cases really allow for the right kind of back and forth. I think that is what a polarized society needs: you have to make room for the back and forth. But how do you do that in a transparent way that ultimately leads to a better product and better information?
And the last thing I will say is that this is a community of extremely humble and honest people. Looking ahead, how do we build on those attributes in terms of what this platform can continue to offer society and how it provides free access to knowledge? How do we make sure we reach the full diversity of humanity in terms of who is invited to participate and who is written about? How do we ensure that our collective efforts reflect more of the global South, more women, and more of the diversity of human knowledge, to better reflect reality?
How does Wikipedia fit into the widespread problem of disinformation on the Internet?
Many of the core attributes of this platform are very different from those of the traditional social media platforms. Take misinformation about Covid: the Wikimedia Foundation entered into a partnership with the World Health Organization. A group of volunteers came together around what is known as WikiProject Medicine, which focuses on medical content and creates articles that are then very carefully monitored, because these are the kinds of topics where you want to be mindful of misinformation.
Another example is that, in the run-up to the U.S. elections, the foundation put together a task force so it could be very proactive. [The task force supported 56,000 volunteer editors watching and monitoring key election pages.] The fact that there were only 33 reversions on the main U.S. election page was an example of how to stay very focused on key topics where misinformation poses real risks.
Another example that I think is really cool is a podcast called “The World According to Wikipedia.” In one of the episodes, a volunteer is interviewed who has made it her job to be one of the main watchers of the climate change pages.
We have technology that alerts these editors when changes are made to any of the pages, so they can go see what the changes are. If there is a risk of misinformation creeping in, there is an option to temporarily lock a page. Nobody wants to do that unless it is absolutely necessary. The climate change example is useful because the talk pages behind those articles are heavily debated. Our editor says, “Let’s have the debate. But I am watching this page carefully.”
A major debate playing out on the social media platforms right now is about censorship of information. Some people claim that biased views take precedence on these platforms and that more conservative views are taken down. As you think about how to handle such debates once you are at the helm of Wikipedia, how do you make judgment calls with all of that happening in the background?
For me, what is inspiring about this organization and these communities is that there are pillars that were established from the very first day Wikipedia was built. One of them is the idea of presenting information from a neutral point of view, and that neutrality requires understanding all sides and all perspectives.
It is what I said earlier: have the debates on the talk pages, but then come to an informed, documented, verifiable, citable conclusion in the articles. I think that is a core principle that could potentially offer others something to learn from.
Since you come from a progressive organization advocating for women’s rights, have you thought about how your background might be weaponized, with people claiming it could affect your views about what is allowed on Wikipedia?
I would say two things. The really relevant aspects of my past work have been leading volunteer-driven movements, which is probably a lot harder than many people might think, and playing a very operational role in figuring out how to build systems, culture, and processes that I think will be relevant to an organization and a set of communities trying to grow their scale and reach.
The second thing I would say is that I have been on my own learning journey, and I invite others to join me on that journey. The way I choose to be in the world is to treat others in good faith and to engage in respectful and civil ways. That does not mean others will do the same. But I think we have to hold on to that as an aspiration and as a way of being the change we want to see in the world.
I did a lot of research on Wikipedia when I was in college, and some of my professors would say, “You know, that’s not a legitimate source.” But I used it all the time anyway. I was wondering if you had any thoughts about that!
I think most professors now admit that they sneak onto Wikipedia to look things up, too!
You know, we’re celebrating Wikipedia’s 20th anniversary this year. Here was this thing that people made fun of and said would never go anywhere, and it is now rightly the most referenced source in all of human history. I can only tell you from my own conversations with scholars that the narrative around citing Wikipedia and how Wikipedia is used has changed.