News
People have been hitting back at Grok after Elon Musk's AI was used to make sexual images of women without consent.
AI hallucinations are yet another source of misinformation related to mental health and should not be relied upon for medical advice.
Grok questioned Holocaust death tolls, promoting denial rhetoric before xAI traced it to a rogue prompt tweak. ... From racial slurs to violent advice, Grok 3’s outputs got so dangerous that xAI had ...
Elon Musk's Grok AI falsely claimed that Musk made comments about Stephen Miller's wife, Katie. The misinformation scandal exposes AI ...