Wednesday, October 29, 2025

Elon Musk’s ‘Grokipedia’ Is Certainly No Wikipedia



Wikipedia is a treasured online resource that, despite massive changes across the web, has managed to remain truly great to this day. I, alongside millions of other users, visit the site daily to learn something new or double-check existing knowledge. In an age of non-stop AI slop, Wikipedia is something of an antidote.

If you look at Wikipedia and think “this is alright, but an AI version would be a lot better,” you might just be Elon Musk. Musk’s AI company, xAI, just launched Grokipedia—yes, that really is its name—an online encyclopedia that closely resembles Wikipedia in name and surface-level appearance. But under the hood, the two could hardly be any more different. Though it’s early days for the new “encyclopedia,” I’d say it’s not worth using, at least not for anything real.

The Grokipedia experience

When you load up the Grokipedia website, it looks fairly standard: the Grokipedia name, the version number (v0.1, at the time of writing), a search bar, and an “Articles Available” counter (885,279). Searching is just as basic: you type in a query, and a list of available articles appears for you to select from. Once you pull up an article, it looks like Wikipedia, only extremely stripped down: there are no images, only text, though you can use the sidebar to jump between sections of the article. You’ll also find numbered sources, which correspond to the References section at the bottom of each article.

The key difference between Grokipedia and a simple version of Wikipedia, however, is that these articles are not written and edited by real people. Instead, each article is generated and “fact-checked” by Grok, xAI’s large language model (LLM). LLMs can generate large amounts of text in short periods of time, and include sources for where they pull their information, which might make the pitch for Grokipedia sound great to some. However, LLMs also have a tendency to hallucinate, or, in other words, make things up. Sometimes, the sources the AI is pulling from are unreliable or spurious; other times, the AI takes it upon itself to “lie,” generating text that simply isn’t true. In both cases, the information cannot be trusted, especially not at face value, which is why it’s troubling to see that much of the experience is powered entirely by Grok, without human intervention.

Grokipedia vs. Wikipedia

Musk is pitching Grokipedia as a “massive improvement” over Wikipedia, which he has criticized for pushing propaganda, particularly towards left-leaning ideas and politics. It’s ironic, then, that some of these Grokipedia entries are themselves pulling from Wikipedia. As The Verge’s Jay Peters highlights, articles like MacBook Air note the following at the bottom: “The content is adapted from Wikipedia, licensed under Creative Commons Attribution-ShareAlike 4.0 License.” What’s more, Peters found that some Grokipedia articles, such as PlayStation 5 and the Lincoln Mark VIII, are almost one-to-one copies of the corresponding articles on Wikipedia.

If you’ve followed Musk’s politics and political activities in recent years, it won’t surprise you to learn he falls on the right-wing side of the political spectrum. That might give pause to anyone who considers using Grokipedia as an unbiased source of information, especially as Musk has continuously retooled Grok to generate responses more favorable to right-wing opinions. Critics like Musk claim Wikipedia is biased towards the left, but Grokipedia is entirely produced by an AI model with an overt bias of its own.

You’ll have very different experiences reading certain topics across Wikipedia and Grokipedia. Wikipedia’s Tylenol article, for example, reads as follows:

In 2025, Donald Trump made several statements about a controversial and unproven connection between autism and Tylenol. These statements, about the connection between Tylenol during pregnancy and autism, are based on unreliable sources without scientific evidence.

Compare that to Grokipedia, which devotes three paragraphs to the subject, the first of which begins:

Multiple observational studies and meta-analyses have identified associations between prenatal exposure to acetaminophen (the active ingredient in Tylenol) and increased risks of neurodevelopmental disorders (NDDs) in offspring, including attention-deficit/hyperactivity disorder (ADHD) and autism spectrum disorder (ASD).

That said, the second paragraph highlights some of the issues with those studies, while the third highlights that certain agencies suggest the “benefits outweigh unproven risks.”

Similarly, as spotted by WIRED, Grokipedia’s article, Transgender, highlights the belief that social media may have acted as a “contagion” to the rise in transgender identification. Not only is that a common right-wing assertion, but that particular word could have been plucked from a post from a right-wing X account. Wikipedia’s article, as you might expect, does not entertain the claim at all.

Grokipedia is also favorable to unproven, controversial, or flat-out absurd claims. As Rolling Stone highlights, it refers to “Pizzagate,” a conspiracy theory that led to a real-life shooting, as “allegations,” a “hypothesis,” and a “narrative.” Grokipedia also gives credence to the “Great Replacement,” a racist theory floated by white supremacists.

Should you use Grokipedia?

Here’s the short answer: no. The issue I have with Grokipedia is twofold. First, no encyclopedia is going to be reliable when it is almost entirely created by AI models. Sure, some of the information may be accurate, and it’s great that you can see the sources the bot is using, but when the risk of hallucination is baked into the technology with no way around it, choosing to forgo human intervention en masse all but ensures inaccuracies will plague much of Grokipedia’s knowledge base.

Second, as if that wasn’t enough, Grokipedia is built on an LLM that Musk is openly tinkering with to generate results that more closely align with his worldview, and the worldview of one particular political ideology. Hallucination and bias: just the ingredients you need for an encyclopedia.

The thing about Wikipedia is that it’s written and edited by humans. Those humans can hold other human writers accountable, adding new information when it becomes available and correcting mistakes when they encounter them. Perhaps it’s frustrating to read that your favorite health and human services secretary “promoted vaccine misinformation and public-health conspiracy theories,” but that’s the objective, scientific reality. Removing these objective descriptions and reframing the discussion in a way that fits a warped worldview doesn’t make Grokipedia better than Wikipedia; it makes it useless.
