"Hey @Grok, is this true?" – the question has become a go-to for X users trying to fact-check viral posts. Since its launch in November 2023, Elon Musk's AI chatbot Grok has been used as a rapid-response fact-checker for users on the platform.
In July, xAI, Musk's artificial intelligence company, came under fire after Grok generated controversial responses, including some flagged as anti-Semitic. The backlash came just weeks after Musk announced he would rebuild the bot, saying it had become too politically correct. Soon after, xAI promised major upgrades to Grok's performance "within a few days."
But now, as images and videos of malnourished children in Gaza circulate online, Grok has repeatedly made misleading or false claims about their origins.
DW Fact check reviewed Grok's claims and found several inaccuracies in its responses.
Grok says viral image of Gaza girl is actually from Iraq
Claim: A viral image of a young girl begging for food in Gaza, shared widely online, was wrongly identified by Grok as a photo taken in Iraq in 2014.
The image was shared by an X account with nearly 2 million followers, claiming it showed the humanitarian crisis in Gaza. A user replied to the post, tagging Grok and asking for verification.
Grok responded:
"This photo was taken on August 10, 2014, near Sinjar mountain in Iraq, showing a Yazidi girl fleeing ISIS violence on a truck. It's not from Gaza in 2025, despite recent misattributions."
Following this, Grok's...