
Can You Sue ChatGPT for Defamation? Man Sues OpenAI for Saying He Killed Children

Could AI be held legally responsible for defamation? A US man is putting that question to the test. In a world increasingly dominated by artificial intelligence, one case out of the United States is making waves and raising alarm bells. A man named Mark Walters has filed a defamation lawsuit against OpenAI, the creators of ChatGPT, after the AI chatbot generated an entirely false narrative claiming he was a convicted criminal who murdered his own children.

The allegations made by the chatbot were not only deeply disturbing, but also utterly untrue, which raises the question: can you sue ChatGPT for defamation?

We are expert defamation lawyers; read more about our work here.

What Did ChatGPT Say?

According to court documents filed in Georgia, journalist Mark Walters was researching a federal court case when he asked ChatGPT for a summary of a complaint involving another individual, Jonathan Bailey.

But instead of summarising Bailey’s case, ChatGPT generated a fictional court document. It claimed that Walters was a defendant and that he had been charged with embezzlement, fraud, and the murder of his own children using a gun. There is no truth to any of these claims. Walters has no criminal record, and the story was a complete fabrication.

The AI chatbot cited a non-existent complaint, even falsely attributing it to the Second Amendment Foundation, a real organisation that had no involvement in the matter.

The Legal Fallout: Can You Sue ChatGPT for Defamation?

Walters is now suing OpenAI for defamation. His legal team argues that the company is responsible for the harm caused by the AI’s false statements, despite the fact that ChatGPT is known to sometimes generate “hallucinations,” or inaccurate responses that sound plausible.

Under US law, defamation involves the publication of false statements that damage a person’s reputation. While traditional defamation law typically involves human publishers, such as media outlets or individuals, this case brings up a cutting-edge legal question: Can an AI chatbot be held liable for defamation?

OpenAI has yet to formally respond to the suit. But the case could set an important precedent for how AI-generated content is treated under defamation and tort law, not just in the US, but globally.

What Does This Mean for Australia?

While this particular case is unfolding in the United States, it raises important questions for Australian law, especially as AI tools like ChatGPT become more widely used by professionals, journalists, and the general public.

Australia has some of the strictest defamation laws in the world. Under the Defamation Act 2005 (NSW) and corresponding legislation in other states, a person can bring a defamation claim if:

  • A statement is published to a third party;
  • The statement identifies the person; and
  • The statement is defamatory — that is, it harms the person’s reputation.

If similar false allegations were made about an Australian citizen via ChatGPT, and a user then shared the AI’s response with others, the person could potentially bring a defamation claim, depending on the context.

However, there remains a grey area around who is legally responsible: the AI tool, its developers, the user who prompted it, or a third party who shared the false information?


Defamation in the Age of AI

This case highlights a growing risk as more people rely on AI tools for research, content creation, and decision-making. When ChatGPT or any AI gets it wrong, the consequences can be severe. For individuals who are falsely accused of crimes or otherwise defamed, the damage to reputation and mental health can be long-lasting.

From a legal perspective, the Walters v OpenAI case could help clarify:

  • Whether AI developers can be held liable for defamatory “hallucinations”;
  • What duty of care, if any, these companies owe to people mentioned in generated content;
  • How traditional defamation law applies to emerging technologies.

Need to sue ChatGPT for defamation?

At O’Brien Criminal and Civil Law, we have extensive experience helping clients defend their reputation through defamation proceedings. Whether you’ve been defamed online, in the media, or, as in this emerging area, through AI-generated content, our expert defamation lawyers can advise you on your rights and options.

If you or someone you know needs legal assistance, contact us today. Our experienced team is ready to help.

📞 Phone: (02) 9261 4281
📧 Email: 

Read our successful defamation case studies. 

Nicole Byrne

O’Brien Criminal & Civil Solicitors
e: 
p: 02 9261 4281
a: Level 4, 219-223 Castlereagh St,
Sydney NSW 2000
