Grok Is Glitching Out and Getting the Bondi Beach Shooting Badly Wrong

Elon Musk’s AI chatbot Grok is at it again.
This time, among other problems, the chatbot is spouting misinformation about the Bondi Beach shooting, in which at least eleven people were killed at a Hanukkah gathering.
One of the attackers was reportedly confronted and disarmed by a bystander, identified as 43-year-old Ahmed al Ahmed. Video of the confrontation was widely shared on social media, with users praising the man’s bravery, except for those who jumped at the opportunity to exploit the tragedy and spread Islamophobia, in particular by denying the authenticity of the reports about the bystander.
Grok isn’t helping the situation. Since at least Sunday morning, the chatbot has been responding to users’ questions with unrelated or, in some cases, outright incorrect answers.
When one user asked Grok for the story behind the video showing al Ahmed confronting the shooter, the chatbot instead described unrelated footage of a man climbing a structure, cautioning that doing so “may damage the setting.”
In another example, Grok claimed that a photo showing al Ahmed injured was of an Israeli hostage taken by Hamas on October 7.
In response to another user’s question, Grok cast doubt on the authenticity of al Ahmed’s confrontation with the shooter altogether.
In yet another example, Grok described a video clearly tagged in a tweet as showing a shootout between the attackers and police in Sydney as instead depicting an unrelated event that devastated Australia earlier this year. This time, though, the user pushed back and asked Grok to look again, prompting the chatbot to acknowledge its mistake.
Beyond getting details wrong, Grok just seems confused. One user was served a summary of the Bondi shooting and its aftermath in response to a question about the tech company Oracle. The chatbot also appears to be mixing up information about the Bondi shooting with the Brown University shooting that happened just before the attack in Australia.
The glitchiness also extends beyond the Bondi shooting. On Sunday morning, Grok misidentified famous footballers, offered information about acetaminophen use in pregnancy when asked about Project 2025, and brought up Kamala Harris’s presidential run in response to a question about British law.
It’s not clear what is causing the meltdown. Gizmodo reached out to Grok developer xAI for comment but received only the company’s standard automated reply: “Legacy Media Lies.”
And it’s not the first time Grok has lost its grip on the truth. The chatbot has given plenty of questionable answers this year, from the “unauthorized modification” that had it bringing up claims of “white genocide” in South Africa in response to unrelated questions, to insisting that Elon Musk is obviously the most intelligent person in the world.



