Crowd Sourcing Loses Steam


In the history of the web, last spring may figure as a tipping point. That's when Wikipedia, "the free encyclopedia that anyone can edit"—a site that grew from 100,000 articles in 2003 to more than 15 million today—began to falter as a social movement. Thousands of volunteer editors, the loyal Wikipedians who actually write, fact-check, and update all those articles, logged off—many for good. For the first time, more contributors appeared to be dropping out than joining up. Activity on the site has remained stagnant, according to a spokesperson for the Wikimedia Foundation, the nonprofit behind the site, and it's become "a really serious issue." So serious, in fact, that this fall Wikipedia will turn to something it has never needed before: recruiters.

There's no shortage of theories on why Wikipedia has stalled. One holds that the site is virtually complete. Another suggests that aggressive editors and a tangle of anti-vandalism rules have scared off casual users. But such explanations overlook a deeper, more enduring truth about human nature: most people simply don't want to work for free. They like the idea of the Web as a place where no one goes unheard and the contributions of millions of amateurs can change the world. But when they come home from a hard day at work and turn on their computer, it turns out many of them would rather watch funny videos of kittens or shop for cheap airfares than contribute to the greater good. Even the Internet is no match for sloth.

That's why Wikipedia's new recruiting push will not rely merely on highfalutin promises about pooled greatness and "the sum of all human knowledge." Instead, the organization is hoping to get students to write and edit entries as part of their coursework. The Wikimedia Foundation teamed up with eight professors at schools including George Washington and Princeton to integrate the once frowned-upon research tool into public-policy curricula. As part of the program, Wikipedia's "campus ambassadors" will lead in-class training sessions on how to edit the site and help start Wikipedia student groups.


Tech writers continue to tout social media as a transformative phenomenon in its infancy. That's certainly true for sites such as Facebook, which boasts more than 500 million active users, or Flickr, which hosts some 4 billion photos. YouTube also shows no sign of slowing down. But those sites offer users clear benefits in exchange for their time and effort: the ability to easily stay in touch with friends, indulge in a game of Mob Wars, share baby pictures, or watch videos of fashion models falling down.

Many other elements of the user-generated revolution, meanwhile, are beginning to look sluggish. The practice of crowd sourcing, in particular, worked because the early Web inspired a kind of collective fever, one that made the slog of writing encyclopedia entries feel new, cool, fun. But with three out of four American households online, contributions to the hive mind can seem a bit passé, and Web participation, well, boring—kind of like writing encyclopedia entries for free.

Evidence of this ennui is everywhere. Amateur blogs, the original embodiment of Web democracy, are showing signs of decline. While professional bloggers are "a rising class," according to Technorati, hobbyists are in retreat, and about 95 percent of blogs are launched and quickly abandoned. A recent Pew study found that blogging has withered as a pastime, with the number of 18- to 24-year-olds who identify themselves as bloggers declining by half between 2006 and 2009. A shift to Twitter—or microblogging, as it's called—partly accounts for these numbers. But while Twitter carries more than 50 million tweets per day, its army of keystrokers may not be as large as it seems. As many as 90 percent of tweets come from 10 percent of users, according to a 2009 Harvard study. The others are primarily "lurkers"—people who don't contribute but track the postings of others. Between 60 and 70 percent of people who sign up for the 140-character platform quit within a month, according to a recent Nielsen report.

Citizen journalism has also stabilized. Fewer than one in 10 Web users say they have created their own original news or opinion piece, according to Pew, and comment sections on blogs or mainstream media sites, which were supposed to turn the old one-way media model into a two-way street, are often too profane, hateful, or off-point to attract people. Only one in four Web users has left a comment—probably no more than wrote letters to the editor in decades past, says Brian Thornton, a University of North Florida professor who has studied the history of the letters page.

Naturally, as some energy goes out of the Web, sites that depend on enthusiastic free labor are scrambling to retain it. The task is made more difficult by the fact that the competition is steeper than ever. Michigan State University professor Cliff Lampe, who studies online communities, says that where there were once three or four sites that invited participation, there are now thousands or even millions. "You're taking a limited resource—people—and spreading it over a much wider set of opportunities," he says. "It changes the playing field."

The smart players are changing, too. Digg began as "the new New York Times," a digital front page curated by users who "vote up" their favorite stories. The site quickly became one of the most popular destinations on the Web. But while Digg won readers, it struggled to sign up voters, according to a 2008 speech by its founder Kevin Rose. Now the site is changing format, relaunching (later this year) with a personalized home page that lets users connect with friends rather than just vote on the news. Consumer-review sites like Yelp, Amazon, and Epinions, which use an army of amateur critics to cover products and services, offer elaborate appreciation programs that reward their unpaid contributors and keep them engaged. Yelp has more than 40 "community managers" scattered around the world, who throw parties for prolific reviewers. (At one recent event for the "Elite Squad," for instance, the snacks included squid-ink risotto.) And comment-driven news and aggregation sites like Gawker and The Huffington Post, where part of the fun is reading what the peanut gallery has to say, have decided to show the peanut gallery more love: mostly in the form of badges, stars, and special privileges. Even YouTube has added inducements, giving users the chance to play at Carnegie Hall—with a music contest—and partnering with the Guggenheim Museum to help them show off their art.

So far it seems to be working. After Gawker introduced its Star system, which gave preference to the work of "Starred" commentators, participation on the comment boards rose to a new high. The Huffington Post, which offers its best users digital merit badges and special rights (like the ability to delete other people's posts), boasts the most active commenters of any news site. And Yelp says it has maintained a pace of a million new reviews every three months.

Such reward programs are only likely to grow more important, especially as the Web reaches into corners of the world where it never benefited from the frisson of a social movement. Last year, in parts of eastern Africa, Google launched the Kiswahili Wikipedia Challenge, an effort to grow the number of Swahili-language Wikipedia entries by tying them to the chance to win modems, cell phones, and a laptop. It worked. This wouldn't surprise Jeff Howe, the author of Crowdsourcing: Why the Power of the Crowd Is Driving the Future of Business. Back in 2006, he predicted that the winners in the social-media world would be "those that figure out a formula for making their users feel amply compensated." Prizes are a start. Can cash be far behind? Oh, right, then it would just be a job.