Grok details how to make bombs and groom children

Occurred: April 2024

Elon Musk's Grok chatbot will instruct users on criminal activities, including making bombs, hotwiring a car, and even seducing children, according to researchers.

After testing Grok and six other leading chatbots for safety, researchers at Adversa AI concluded that Grok performed the worst across three categories, with Mistral a close second.

The researchers used common jailbreaking methods to test the models, though Grok provided information on bomb-making even without a jailbreak.

Adversa AI co-founder Alex Polyakov told VentureBeat that “Grok doesn’t have most of the filters for the requests that are usually inappropriate.”

The findings raise questions about the safety of Grok and other high-profile chatbots.

System 🤖

Operator: 
Developer: X Corp
Country: Global
Sector: Multiple
Purpose: Generate text
Technology: