USC Annenberg Online Journalism Review

Collaborative Conundrum: Do Wikis Have a Place in the Newsroom?


Wikipedia has more than 340,000 articles, written by a sprawling online community. Researchers are testing its veracity, while plans proceed for fact-checking it formally. Can journalists trust Wikipedia, and can collaboration software such as wikis improve newsgathering?

Journalism relies on collaboration to build trust. If these words were simply typed on my computer and posted to this page, how much would you trust them? Most readers trust the journalism they read in newspapers, see on TV or check out online because they assume there are editors, fact-checkers and some unseen apparatus making sure everything is correct. And the sources journalists rely on also play a key role in determining authority.

Weblogs turned that notion on its head by gaining trust through a community of readers, commenters and links to source material. And now come wide-open wikis, Web pages that are easily edited, changed or erased by a public or private group of people. At first blush, most journalists would consider this idea heretical to journalistic integrity. If anyone can change the page at any time, how can you trust it?

But consider the Wikipedia, a free online encyclopedia with hundreds of thousands of entries created by thousands of people since just January 2001. Originally, it was supposed to be a trusted encyclopedia called Nupedia written only by people with PhDs. Wikipedia was an adjunct project that eventually became the main event, a sprawling public site that covers everything from Bayesian probability to cultural imperialism -- with versions in dozens of languages.

For journalists enthralled by Wikipedia, there's still one drawback: lack of accountability. The Boston Globe's Hiawatha Bray recently spelled out the problem in a story on Wikipedia. "Old-school reference books hire expert scholars to write their articles, and employ skilled editors to check and double-check their work," Bray wrote. "Wikipedia's articles are written by anyone who fancies himself an expert."

And the Syracuse (NY) Post-Standard's Al Fasoldt went further, reporting that a librarian said not to use Wikipedia as a trusted source. "I don't care whether a source of information is up to date if it's not accountable," Fasoldt told me via e-mail. "I don't think wikis can be used for any kind of source. If a source is not accountable, it must not be used. The responsibility of a journalist is to be accountable, and that means his or her sources must be reliable."

Proponents of Wikipedia take less of a black-and-white view, noting that most of its content is reliable but your mileage may vary. Jimmy Wales, founder of the Wikipedia and director of the Wikimedia Foundation (the non-profit that runs the site), told me that the "average level of content is quite high, but our open editing process means that people need to be judicious and sensible."

In other words, a journalist might be able to trust some or even most of the content on Wikipedia, but double-checking information is a must.

Fact-checking projects

Not long after Fasoldt's column was passed around online with derision from most bloggers, Techdirt's Mike Masnick called for a test case: Insert a few errors into Wikipedia and find out how long they would last without being edited. Alex Halavais, assistant professor at SUNY Buffalo's School of Informatics, inserted 13 deliberate errors into Wikipedia articles, and within a few hours, all of them were caught and fixed.

Halavais admitted that his methodology wasn't the best: once someone caught one of his errors, they could check his account's other contributions and quickly revert all of his changes at once. Unlike a community site such as eBay, Wikipedia has no way for people to rate each other on trustworthiness, and there is no apparent sign of authorship in the text, giving equal weight to the brightest and the dimmest bulbs on the Net. Some consider that the ultimate in democratic creation, while others assume the worst.
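The mass-revert that undid Halavais's experiment is easy to picture in miniature. Below is a hypothetical sketch (the class and method names are invented for illustration, not Wikipedia's actual software): a page keeps its full revision history, and once one bad edit from an account is spotted, a reverter rolls the page back to the latest revision not touched by that account.

```python
from dataclasses import dataclass, field

@dataclass
class Revision:
    author: str
    text: str

@dataclass
class WikiPage:
    history: list = field(default_factory=list)

    def edit(self, author: str, text: str) -> None:
        """Append a new revision; every change is kept, nothing is lost."""
        self.history.append(Revision(author, text))

    def current(self) -> str:
        return self.history[-1].text if self.history else ""

    def revert_author(self, author: str) -> None:
        """Roll back to the most recent revision NOT made by `author`.
        (If every revision is by that author, there is nothing to restore.)"""
        kept = [r for r in self.history if r.author != author]
        if kept:
            self.history.append(Revision("reverter", kept[-1].text))

# A vandal's subtle change survives only until one of their edits is noticed.
page = WikiPage()
page.edit("alice", "Syracuse was founded in 1848.")
page.edit("vandal", "Syracuse was founded in 1748.")
page.revert_author("vandal")
```

Because every edit is tied to an account (or IP address) in the history, catching one error cheaply exposes all of that account's other changes -- which is exactly why Halavais's 13 errors fell together.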

Mark Pellegrini, one of Wikipedia's volunteer administrators, told me a feature that allows people to rate each other "has already been written, and is currently undergoing testing on the test Wikipedia, a sandbox Wiki that we maintain for new software features."

Another blogger, who writes Dispatches from the Frozen North, decided to add five erroneous changes to Wikipedia over the course of five days. Importantly, he only added details to less traveled entries, rather than major changes or new pages. None of his changes were caught by the sixth day, when he reverted all the changes back.

"I was disappointed that all my changes in Wikipedia went unchallenged," the pseudonymous blogger wrote. "Surely a week was plenty of time, especially since fresh changes tend to get more scrutiny than old ones. I have to conclude that it would be very easy for subtle mistakes to sneak into Wikipedia, and go a very long time without being corrected."

Indeed, as more sites such as Slashdot and BoingBoing have written about Wikipedia's reliability, more people have decided to try their hand at disinformation. Eventually, Halavais was exhorting people not to try these experiments, and instead to monitor the "Recent Changes" feed so they could help the Wikipedians fact-check their open resource. While Halavais is rooting for Wikipedia to improve, he is realistic about its veracity.
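Watching "Recent Changes" is more useful if a volunteer knows which edits to look at first. A minimal triage heuristic -- my own illustration, not anything Wikipedia actually runs -- would flag anonymous edits to low-traffic pages, since (as the Frozen North experiment suggested) those are exactly the changes fewest readers will catch:

```python
def triage(changes, view_counts, traffic_threshold=100):
    """Return titles of recent changes that deserve priority review.

    changes:        list of dicts with "title" and "anonymous" keys
    view_counts:    dict mapping title -> rough daily page views
    An anonymous edit to a page below the traffic threshold is flagged,
    on the theory that popular pages are self-policing and obscure ones
    are not.
    """
    flagged = []
    for change in changes:
        low_traffic = view_counts.get(change["title"], 0) < traffic_threshold
        if change["anonymous"] and low_traffic:
            flagged.append(change["title"])
    return flagged

recent = [
    {"title": "Bayesian probability", "anonymous": True},
    {"title": "1913 county fair",     "anonymous": True},
    {"title": "Cultural imperialism", "anonymous": False},
]
views = {"Bayesian probability": 5000, "Cultural imperialism": 3000}
priority = triage(recent, views)  # only the obscure anonymous edit
```

The real point of the sketch is Halavais's: the same openness that lets errors in also lets anyone build tools, or habits, for catching them.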

"It's a little like a Roomba, the robotic vacuum cleaner that treads randomly around a room for some time, and covers each part several times," Halavais said via e-mail. "Since there is no way to know whether, or how many times, any given article has been verified, you always have a niggling worry that a given article might be the one that somehow slipped by unchecked by an expert."

This whole controversy could one day be a moot point. A group of researchers, at the behest of SocialText CEO Ross Mayfield, is planning a formal fact-checking project to test how reliable Wikipedia really is. Mayfield, whose company provides corporate wikis for collaboration, is trying to run the project outside of Wikipedia's control. When I contacted him for an interview, he quickly set up a wiki so we could have a private, collaborative interview.

"I think of a wiki page as more of a resource than a source, but they can serve as sources provided the form is understood by those who use it," Mayfield said. "A wiki page reflects the common understanding of a group and its attention at a point in time. The one absolute is that the quality of a wiki page is better than it was before and will improve in the future. By removing barriers to participation, wikis gain more diverse domain expertise and even minor contributions accrete value to the resource. Quality is subjective, but with enough attention and participation it's destined to improve over time at a very low cost."

Last year, IBM researchers created visual flow charts of Wikipedia entries to show how the work of numerous authors changed over time. And University of Hong Kong assistant professor Andrew Lih created baseline metrics to determine whether an article in Wikipedia has been vetted by enough people. He found that more traffic to an entry gave it more reliability. "This tight feedback loop between reading and editing provides for very quick evolution of encyclopedic knowledge, providing a function that has been missing in the traditional media ecology," Lih concluded.

Perhaps most importantly, Wales told me the Wikimedia Foundation was going to start fact-checking content on its own, in the hopes of creating a verified edition that will make all the disclaimers and responses to criticism obsolete. Wales' goal is to create a high-quality print encyclopedia in every language that could be given away to everyone on the planet -- especially to poor people in developing countries.

"We are currently moving forward with the creation of a 'stable' version in which every article has been reviewed by a trusted reviewer," Wales said, "and so within a year, I will be able to remove the limited disclaimer, because Wikipedia will be the best encyclopedia in the world, without exception."

Wikipedia cost the Foundation $100,000 to run this year, mainly for the computer hardware. The money comes entirely from donations. But Wales did not estimate how much a fact-checking operation might cost, or whether he would continue to rely on donated time from experts.

Collaborative journalism

If Wikipedia has some value as a journalist's resource, what about wikis themselves? Within news organizations, a password-protected wiki might be valuable for editors and reporters as a workspace for stories in progress. At the least, an internal wiki might help as a small repository of shared knowledge. Your story budget could be a wiki. Your sports staff might start a wiki cataloging their sources and how reliable each one is.

The BBC is already using wikis internally to help streamline its work, according to Wales. And SocialText's wiki software has been in use at a Ziff Davis news site for gamers. Mayfield says the site even tried a group wiki to gather the news of Nintendo's new handheld gaming device.

Mayfield believes the rise of participatory journalism might lead to wikis that allow journalists to collaborate with sources and readers. "Consider if reporters shared their coveted notebooks openly," he said. "If they shared sources and copy didn't end up on the newsroom floor. If bylines ebbed and flowed, and even contained those outside the newsroom. If the newsroom developed a group memory even after people moved on."

But again, reliability of information is the prime barrier to wiki acceptance. For instance, what if an online newspaper set up a wiki for people to add their accounts of a breaking news story in their vicinity, such as a hurricane's damage? You might have the power of thousands of perspectives, but no way of verifying the sources without a built-in community like the one that polices Wikipedia changes.

"Most user-generated content isn't content, but conversation," Mayfield said. "Cultivating community is a decided practice. It boils down to the social contract you make with your readers-turned-writers. If they trust that their effort and words will be appropriated appropriately, while providing social incentives for participation, it can very well work. Of course, fostering this kind of trust means giving up control -- something I don't think most traditional media is ready for."

Elizabeth Lawley, assistant professor at the Rochester Institute of Technology's Department of Information Technology, believes in the collective power of wikis, but is not sure that the removal of distinct personalities will work in a journalistic setting.

"I believe it matters who said what, and when," Lawley wrote on Corante's Many 2 Many blog. "That context provides enormous 'metadata' for me personally. And the wiki explicitly strips that. I understand why, and I do recognize its benefits. But I'm still uncomfortable with it."

More likely, wikis will be accepted into the newsroom if they are for private collaboration among staffers. But even in this case, journalists might well be frustrated if they have to search through a revision history to find out just who changed their words of perceived wisdom.

(Note: USC Annenberg School for Communication is a customer of SocialText, and USC students might be employed in the Wikipedia fact-checking project.)

Related Links
Alex Halavais: The Isuzu Experiment
BoingBoing: Wikipedia proves its amazing self-healing powers
Boston Globe: One Great Source -- If You Can Trust It
Corante: History, Personality and Wikis
Corante: Wikipedia Reputation and the Wemedia Project
Dispatches from the Frozen North: How Authoritative is Wikipedia
IBM: history flow
Slashdot: Wikipedia != Authoritative?
SocialText: Ziff Davis Media Accelerates Project Cycles
Syracuse Post-Standard: Librarian: Don't use Wikipedia as source
Techdirt: Who Do You Trust, The Wiki Or The Reporter?
Wikimedia Foundation
Wikipedia as Participatory Journalism: Reliable Sources?
Wikipedia in other languages
Wikipedia:Replies to common objections

Al Fasoldt, technology columnist, Syracuse Post-Standard


Alex Halavais, assistant professor, SUNY Buffalo School of Informatics


Ross Mayfield, CEO, SocialText