Anthropic Agrees To Pay Record $1.5 Billion To Settle Authors’ AI Lawsuit
By Ted Johnson
Political Editor
AI firm Anthropic has agreed to pay at least $1.5 billion into a class action fund as part of a settlement of litigation brought by a group of book authors.
The sum, disclosed in a court filing on Friday, “will be the largest publicly reported copyright recovery in history, larger than any other copyright class action settlement or any individual copyright case litigated to final judgment,” the attorneys for the authors wrote.
The settlement also includes a provision that releases Anthropic only for its conduct up to August 25, meaning that new claims could be filed over future conduct, according to the filing. Anthropic also has agreed to destroy the datasets used in its models.
The settlement figure amounts to about $3,000 per class work, according to the filing.
Read the terms of Anthropic’s copyright settlement.
A hearing in the case is scheduled for Sept. 8.
Last month, Anthropic and the authors’ group said that they had reached a “settlement in principle” of the creators’ lawsuit.
In June, U.S. District Judge William Alsup ruled that Anthropic’s use of the books in training models was “exceedingly transformative,” one of the factors courts use in determining whether the use of protected works without authorization qualifies as a legal “fair use.” His ruling was the first major decision to weigh the fair use question for generative AI systems.
Yet Alsup also ruled that Anthropic had to face a trial on the question of whether it is liable for downloading millions of pirated books in digital form off the internet, which it did in order to train the models behind its AI service Claude. The books were obtained from the datasets Library Genesis and Pirate Library Mirror.
“That Anthropic later bought a copy of a book it earlier stole off the internet will not absolve it of liability for the theft but it may affect the extent of statutory damages,” the judge wrote.
Alsup’s summary judgment ruling came in a case brought by a group of authors, including Andrea Bartz, author of The Lost Night: A Novel, The Herd, We Were Never Here, and The Spare Room. They argued that the use of their works to train Claude violated copyright law.
A trial in the case was scheduled to start in December.
Anthropic’s deputy general counsel, Aparna Sridhar, said in a statement, “In June, the District Court issued a landmark ruling on AI development and copyright law, finding that Anthropic’s approach to training AI models constitutes fair use. Today’s settlement, if approved, will resolve the plaintiffs’ remaining legacy claims. We remain committed to developing safe AI systems that help people and organizations extend their capabilities, advance scientific discovery, and solve complex problems.”
Despite the huge payout, Anthropic still sees the judge’s earlier ruling as a victory, as he determined that its use of copyrighted material in training models was a “fair use.” With Congress unlikely to pass legislation addressing such use at this point, and President Donald Trump siding with tech firms, courts have been left to resolve such disputes.
Nevertheless, the judge found potential liability in the way that Anthropic obtained the material for its models, by downloading materials from online libraries. Anthropic faced the risk of prolonged litigation and losing on that point.
The settlement comes as Hollywood studios have been stepping up challenges to the use of intellectual property in AI systems. On Thursday, Warner Bros. Discovery sued Midjourney, claiming the AI image-generation service produces outputs depicting characters from Superman to Tweety Bird. WBD joined two other legacy studios, NBCUniversal and The Walt Disney Co., in seeking court judgments against the company.
The authors’ legal team, led by Justin Nelson of Susman Godfrey, wrote in the filing that on “a per-work basis, the settlement amount is 4 times larger than $750 statutory damages amount that a jury could award and 15 times larger than the $200 amount if Anthropic were to prevail on its defense of innocent infringement.”
Maria Pallante, president and CEO of the Association of American Publishers, said in a statement that the settlement “will drive home the important message to all artificial intelligence companies that copying books from shadow libraries or other pirate sources to use as the building blocks for their businesses has serious consequences.”
She said, “AAP and publishers across the country have been monitoring the Bartz case carefully since its inception. Following the Court’s class certification on July 17, 2025, we actively engaged through counsel to assist and support this historically large settlement. We know that our counterparts at the Authors Guild have done the same, and we believe that the proposed settlement advances the common goal of publishers and authors to combat piracy.”
By Oct. 10, attorneys will file a list of books included in the settlement, expected to be in the range of 500,000 titles, according to AAP.
Cecilia Ziniti, founder and CEO of AI legal company GC AI, wrote in an analysis that the case “left Anthropic exposed to a trial focused solely on the pirated library, where statutory damages could have been high and discovery would have gotten even more deeply into the mechanics of model training and content acquisition.”
She wrote that the argument that it is too difficult to track and pay for training data was a “red herring,” noting agreements that OpenAI has made with publishers like Axel Springer, News Corp., Vox and the AP.
“This settlement will push other AI companies to the negotiating table and accelerate the creation of a true marketplace for data, likely involving API authentications and revenue-sharing models,” she wrote.
This article was autogenerated from a news feed of news and research sources selected by CDO TIMES; no editorial review was conducted beyond that by CDO TIMES staff.