Asked to identify the race of an alleged killer, the “anti-woke” chatbot Grok gets its facts wrong.
In the latest example of odd behavior by the Elon Musk-owned AI service Grok, the “anti-woke” chatbot falsely told an X user that the suspect in a fatal San Francisco stabbing was a Black man. In reality, the suspect in the July 26 killing of Colden Kimber, Sean Wei Collins, is Asian and white.
The now-deleted Grok answer appeared in response to a user’s question under a post from The Standard’s official X account. The Grok response accused The Standard of bias by omitting racial information about the suspect.
Collins is of mixed-race background, his lawyer Bill Fazio told The Standard. His mother is Chinese American and his father is white. He is seen teaching how to draw a dinosaur in a video posted to YouTube in 2020 by his mother, Lin Wei, an artist and teacher.
Paramedics found Kimber slumped at a Muni stop on Ocean Avenue, bleeding from a stab wound to the neck. Officers arrested Collins and recovered a bloody sweater and a knife with a 6-inch blade several blocks away.
According to prosecutors, Kimber stepped between Collins and a group of women and children who had become frightened by the suspect’s behavior, shortly before the unprovoked attack.
The San Francisco district attorney’s office on Wednesday charged Collins with murder and child endangerment, citing two children who were witnesses to the attack.
Kimber is mourned as a beloved...