Wikipedia celebrates its 25th anniversary this month as the internet’s most reliable knowledge source. Yet behind the celebrations, a troubling pattern has emerged: the volunteer community that built the encyclopedia has lately rejected a key innovation designed to serve readers. An institution founded on the principle of easy, open collaboration may now be proving immovable, trapped between the need to adapt and an ingrained resistance to change.
Wikipedia’s Digital Sclerosis
Political economist Elinor Ostrom won the 2009 Nobel Prize in economics for studying the ways communities successfully manage shared resources—the “commons.” Wikipedia’s two founders (Jimmy Wales and Larry Sanger) established the internet’s open-source encyclopedia 25 years ago on principles of the commons: its volunteer editors create and enforce policies, resolve disputes, and shape the encyclopedia’s direction.
But building around the commons involves a trade-off, Ostrom’s work found. Communities that make collective decisions tend to develop strong institutional identities. And those identities sometimes spawn reflexively conservative impulses.
Giving users agency over Wikipedia’s rules, as I’ve discovered in my own studies of Wikipedia, can ultimately lead an institution away from the needs of those it serves.
Wikipedia’s editors have built the largest collaborative knowledge project in human history. But the governance these editors exercise increasingly resists new generations of innovation.
Paradoxically, Wikipedia’s revolutionarily collaborative structure once put it at the vanguard of innovation on the open internet. But now that same structure may be failing newer generations of readers.
Does Wikipedia’s Format Belong to Readers or Editors?
There’s a generational disconnect at the heart of Wikipedia’s current struggles. The encyclopedia’s format remains wedded to the information-dense, text-heavy style of Encyclopaedia Britannica—the very model Wikipedia was designed to replace.
A Britannica replacement made sense in 2001. A quarter of a century ago, the average internet user was older and more accustomed to reading long-form content.
Today’s teens and twentysomethings, however, are a very different demographic, with markedly different media consumption habits than Wikipedia’s early readers. Gen Z and Gen Alpha readers are accustomed to TikTok, YouTube, and mobile-first visual media. Their impatience with Wikipedia’s impenetrable walls of text, as any parent of kids this age knows, arguably threatens the future of the internet’s collaborative knowledge clearinghouse.
The Wikimedia Foundation knows this, too. Research has shown that many readers value a quick overview of an article before deciding whether to dive into its full text.
So last June, the Foundation launched a modest experiment it called “Simple Article Summaries”: AI-generated, simplified text placed at the top of complex articles. The summaries were clearly labeled as machine-generated and unverified, and they were available only to mobile users who opted in.
Despite all these precautions, however, the volunteer editor community barely gave the experiment time to begin. Editors shut down Simple Article Summaries within a day of its launch.
The response was fierce. Editors called the experiment a “ghastly idea” and warned of “immediate and irreversible harm” to Wikipedia’s credibility.
Comments in the village pump (a community discussion page) ranged from blunt (“Yuck”) to alarmed, with contributors raising legitimate concerns about AI hallucinations and the erosion of editorial oversight.
Revisiting Wikipedia’s Past Helps Reveal Its Future
Last year’s Simple Summaries storm, and its abrupt silencing, should be considered in historical context. Consider three other flashpoints from Wikipedia’s past:
In 2013, the Foundation launched VisualEditor—a “what you see is what you get” interface meant to make editing easier—as the default for all newcomers. However, the interface often crashed, broke articles, and was so slow that experienced editors fled. After protests erupted, a Wikipedia administrator overrode the Foundation’s rollout, returning VisualEditor to an opt-in feature.
The following year brought Media Viewer, which changed how images displayed. The community voted to disable it. Then, when an administrator implemented that consensus, a Foundation executive reversed the change and threatened to revoke the admin’s privileges. On the German Wikipedia, the Foundation deployed a new “superprotect” user right to prevent the community from turning Media Viewer off.
Even proposals that technically won majority support met resistance. In 2011, the Foundation held a referendum on an image filter that would let readers voluntarily hide graphic content. Despite 56 percent support, the feature was shelved after the German Wikipedia community voted 86 percent against it.
These three controversies from Wikipedia’s past reveal how genuine conversation can, after disagreement and controversy, achieve compromise and evolve Wikipedia’s features and formats. Reflexive vetoes of new experiments, as the Simple Summaries spat highlighted last summer, are not genuine conversation.
Supplementing Wikipedia’s Encyclopaedia Britannica-style format with a small component of AI-generated summaries is not a simple problem with a cut-and-dried answer. Then again, neither were VisualEditor and Media Viewer.
Why did 2025’s Wikipedia crisis end in an immediate clampdown, when the internal crises of 2011 to 2014 played out through community debate, discussion, and plebiscites? Is Wikipedia’s global readership witnessing the first signs of a dangerous generation gap?
Wikipedia Needs to Air Its Sustainability Crisis
A still deeper crisis haunts the online encyclopedia: the sustainability of unpaid labor. Wikipedia was built by volunteers who found meaning in collective knowledge creation. That model worked brilliantly when a generation of internet enthusiasts had time, energy, and idealism to spare. But the volunteer base is aging. A 2010 study found the average Wikipedia contributor was in their mid-20s; today, many of those same editors are in their forties or fifties.
Meanwhile, the tech industry has discovered how to extract billions in value from that volunteer work. AI companies train their large language models on Wikipedia’s corpus. The Wikimedia Foundation recently noted that Wikipedia remains one of the highest-quality datasets in the world for AI development. Research confirms that when developers try to omit Wikipedia from training data, their models produce answers that are less accurate, less diverse, and less verifiable.
The irony is stark. AI systems deliver answers derived from Wikipedia without sending users back to the source. Google’s AI Overviews, ChatGPT, and countless other tools have learned from Wikipedia’s volunteer-created content—then present that knowledge in ways that break the virtuous cycle Wikipedia depends on. Fewer readers visit the encyclopedia directly. Fewer visitors become editors. Fewer users donate. The pipeline that sustained Wikipedia for a quarter century is breaking down.
What Do Wikipedia’s Next 25 Years Look Like?
The abrupt shutdown of Simple Summaries arguably risks making the encyclopedia increasingly irrelevant to younger generations of readers. And those readers will rely on Wikipedia’s information commons longer than any cohort now editing or reading it.
On the other hand, Wikipedia’s larger mandate of course remains: to serve as a steward of the information commons. Implementing Simple Summaries badly could undermine that ambitious objective, which would be terrible, too.
Weighing those competing risks, frankly, is what open discussions and sometimes-messy referenda are for, not sudden shutdowns.
Meanwhile, AI systems should credit Wikipedia when drawing on its content, maintaining the transparency that builds public trust. Companies profiting from Wikipedia’s corpus should pay for access through legitimate channels like Wikimedia Enterprise, rather than scraping servers or relying on data dumps that strain infrastructure without contributing to maintenance.
Perhaps as the AI marketplace matures, there could be room for new large language models trained exclusively on trustworthy Wikimedia data—transparent, verifiable, and free from the pollution of synthetic AI-generated content. Perhaps, too, Creative Commons licenses need updating to account for AI-era realities.
Perhaps Wikipedia itself needs new modalities for creating and sharing knowledge—ones that preserve editorial rigor while meeting audiences where they are.
Wikipedia has survived edit wars, vandalism campaigns, and countless predictions of its demise. It has patiently outlived the skeptics who dismissed it as unreliable. It has proven that strangers can collaborate to build something remarkable.
But Wikipedia cannot survive by refusing to change. Ostrom’s Nobel Prize-winning research reminds us that the communities that govern shared resources often grow conservative over time.
For anyone who cares about the future of reliable information online, Wikipedia’s 25th anniversary is not just a celebration. It is an urgent warning about what happens when the institutions we depend on cannot adapt to the people they are meant to serve.
Dariusz Jemielniak is Vice President of the Polish Academy of Sciences, a Full Professor at Kozminski University in Warsaw, and a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University. He served for a decade on the Wikimedia Foundation Board of Trustees and is the author of Common Knowledge? An Ethnography of Wikipedia (Stanford University Press).