Lulu Garcia-Navarro
Yeah, it's sort of like the publication that you cite gets cited by other reputable sources, and it issues corrections when it gets things wrong.
And Wikipedia is also famously open source.
It's decentralized and essentially it's run by thousands of volunteer editors.
You don't run Wikipedia, we should say.
How do those editors resolve disputes when they don't agree on which facts should be included or on how something is written?
How do you negotiate those differences?
Yeah, and basically every page has what's called a talk tab where you can see the history of the discussions and the disputes, which relates to another principle of the site: transparency.
You can look at everything and see who did what and what their reasoning was.
What you're saying is actually supported by a study about Wikipedia that came out in the journal Nature Human Behaviour in 2019.
It's called The Wisdom of Polarized Crowds.
Perhaps counterintuitively, it says that politically contentious Wikipedia pages end up being of higher quality, meaning that they're more evidence-based and have more consensus around them.
But I do want to ask about the times when consensus building isn't necessarily easy as it relates to specific topics on Wikipedia.
Some pages actually have restricted editing privileges.
So the Arab-Israeli conflict, climate change, abortion, unsurprising topics there.
Why are those restricted and why doesn't the wisdom of polarized crowds work for those subjects?
This brings me to some of the challenges.
Wikipedia, while it has created this very trustworthy system, is under attack from a lot of different places.
And one of Wikipedia's sort of superpowers can also be seen as a vulnerability, right?
The fact that it is created by human editors.
And human editors can be threatened, even though they're supposed to be anonymous.