BBC warns Perplexity to stop using its content or face legal action

BBC demands Perplexity stop scraping its content, delete training data, and propose financial compensation

The BBC has threatened legal action against U.S.-based artificial intelligence firm Perplexity, accusing the company of reproducing its content without permission.

The broadcaster said Perplexity’s chatbot has reproduced BBC content “verbatim” and demanded that the company stop scraping its sites, delete any BBC material it holds, and propose financial compensation for content already used.

This marks the first time the BBC has pursued such legal measures against an AI company. A legal letter has been sent to Perplexity CEO Aravind Srinivas, alleging copyright infringement under UK law and violations of the BBC’s terms of use.

In response, Perplexity issued a brief statement accusing the BBC of attempting to protect Google’s illegal monopoly.

The BBC cited earlier research showing that Perplexity and other chatbots have misrepresented or inaccurately summarized BBC content, falling short of the broadcaster’s editorial standards for impartiality and accuracy. The BBC argued that such misrepresentation harms its reputation and erodes public trust, especially among UK licence fee payers.

The dispute comes amid growing industry concerns over AI platforms using copyrighted material without consent. Generative AI tools often rely on data gathered through web scraping—automated bots that extract content from websites, sometimes ignoring directives intended to block such activity.

The BBC says it has used its “robots.txt” file to block two of Perplexity’s web crawlers, but claims the company has continued to scrape its material regardless. Perplexity has denied that its bots ignore “robots.txt” directives, saying it does not use web content to pre-train foundation AI models.
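To illustrate the mechanism at issue: “robots.txt” is a plain-text file that tells crawlers which paths they may fetch, and compliance is voluntary on the crawler’s side. The sketch below uses Python’s standard `urllib.robotparser` with a hypothetical rule set (the example domain, paths, and bot names are illustrative, not the BBC’s actual file) to show how a well-behaved crawler would check the rules before fetching a page.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules modeled on how a publisher might block a
# specific AI crawler while allowing everything else.
ROBOTS_TXT = """\
User-agent: PerplexityBot
Disallow: /

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A compliant crawler calls can_fetch() before requesting a URL.
url = "https://example.com/news/article"
print(rp.can_fetch("PerplexityBot", url))  # False: this bot is disallowed
print(rp.can_fetch("SomeOtherBot", url))   # True: all other bots are allowed
```

The key point for the dispute is that nothing in the protocol enforces the `False` result; it only works if the crawler chooses to honor it.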

Perplexity describes its chatbot as an “answer engine” that pulls information from the web and presents it in summarized form. It encourages users to verify responses independently.

In January, Apple suspended an AI-generated news feature that created false BBC headlines, following a complaint from the broadcaster.

The Professional Publishers Association, which represents more than 300 UK media brands, expressed support for the BBC’s action, warning that illegal scraping threatens the £4.4 billion publishing industry and the 55,000 jobs it supports.

The dispute highlights growing legal and ethical questions around the use of publisher content by AI firms and has intensified calls for the UK government to strengthen copyright protections.

Monitoring Desk
Our monitoring team diligently searches the vast expanse of the web to carefully handpick and distill top-tier business and economic news stories and articles, presenting them to you in a concise and informative manner.
