arthwollipot
Observer of Phenomena, Pronouns: he/him
For those who don't know, Grokipedia (no, I'm not going to bother to link to it) is Elon Musk's "replacement" for Wikipedia. It is supposed to be an AI-generated encyclopedia of the internet.
Unfortunately, as we all know, AI cannot distinguish between fact and fiction. Grokipedia contains many factual errors and right-wing talking points, which is fitting, because so does Grok, the LLM it's built on. Here's what academics have found when closely examining the information on Grokipedia:
www.theguardian.com
In Grok we don’t trust: academics assess Elon Musk’s AI-powered encyclopedia
From publishing falsehoods to pushing far-right ideology, Grokipedia gives chatroom comments equal status to research
(Sir Richard) Evans, however, was discovering that Musk’s use of AI to weigh and check facts was suffering a more earth-bound problem. “Chatroom contributions are given equal status with serious academic work,” Evans, an expert on the Third Reich, told the Guardian, after being invited to test out Grokipedia. “AI just hoovers up everything.”
Andrew Dudfield, the head of AI at Full Fact, a UK-based factchecking organisation, said: "We really have to consider whether an AI-generated encyclopedia – a facsimile of reality, run through a filter – is a better proposition than any of the previous things that we have. It doesn't display the same transparency but it is asking for the same trust. It is not clear how far the human hand is involved, how far it is AI-generated and what content the AI was trained on. It is hard to place trust in something when you can't see how those choices are made."
...many of the 885,279 articles available on Grokipedia in its first week were lifted almost word for word from Wikipedia, including its entries on the PlayStation 5, the Ford Focus and Led Zeppelin. Others, however, differed significantly:
- Grokipedia’s entry on the Russian invasion of Ukraine cited the Kremlin as a prominent source and quoted the official Russian terminology about “denazifying” Ukraine, protecting ethnic Russians and neutralising threats to Russian security. By contrast, Wikipedia said Putin espoused imperialist views and “baselessly claimed that the Ukrainian government were neo-Nazis”.
- Grokipedia called the far-right organisation Britain First a “patriotic political party”, which pleased its leader, Paul Golding, who in 2018 was jailed for anti-Muslim hate crimes. Wikipedia, on the other hand, called it “neo-fascist” and a “hate group”.
- Grokipedia called the 6 January 2021 turmoil at the US Capitol in Washington DC a “riot”, not an attempted coup, and said there were “empirical underpinnings” to the idea that a deliberate demographic erasure of white people in western nations is being orchestrated through mass immigration. This is a notion that critics consider to be a conspiracy theory.
- Grokipedia said Donald Trump’s conviction for falsifying business records in the Stormy Daniels hush-money case was handed down “after a trial in a heavily Democratic jurisdiction”, and there was no mention of his conflicts of interest – for example receiving a jet from Qatar or the Trump family cryptocurrency businesses.
Wikipedia cites its sources, and its edits are discussed and considered transparently among its human editors. A change to a Wikipedia page is quickly reverted if it is not supported by evidence. Grokipedia should not be considered a reliable source for factual or objective information, any more than any LLM chatbot should.