A recent investigation by the BBC revealed substantial inaccuracies in AI-generated responses to news-related queries, raising alarms about the technology’s reliability. The study found that 51% of all AI answers about current events contained “significant issues,” including failures to distinguish between factual reporting and opinion.
Key Findings:
- 19% of AI answers citing BBC content introduced factual errors, such as incorrect dates, inaccurate statistics, or misrepresented events.
- 13% of quotes attributed to BBC articles were either altered or entirely absent from the original sources.
- Specific AI platforms, including Perplexity AI and Microsoft’s Copilot, were flagged for critical errors. Perplexity AI altered quotes from sources, while Copilot relied on outdated articles (e.g., a 2022 piece) to summarize breaking news.
Apple, meanwhile, temporarily halted its AI-driven news summarization feature after the BBC alerted the company to inaccuracies.

BBC’s Call to Action:
The broadcaster outlined three measures to address AI reliability:
- Regular evaluations of AI outputs for accuracy.
- Constructive dialogue between news organizations and AI developers.
- Enhanced regulations for large language models (LLMs) to ensure transparency and accountability.
Debate Over Regulation:
While BBC CEO Deborah Turness emphasized the need for collaboration among governments, tech firms, and media to combat “distortion” and misinformation, some political figures, including U.S. Vice President J.D. Vance, cautioned against stifling innovation. At the Artificial Intelligence Action Summit in Paris, Vance warned that “excessive regulation could kill a transformative industry” and advocated for a deregulatory approach.
Turness, however, stressed urgency: “Society functions on a shared understanding of facts. Inaccuracies from AI tools can cause real harm, especially when amplified on social platforms. We must ensure this technology helps people find trusted information, not add to chaos.”
Industry Response:
The BBC urged AI companies to prioritize accuracy and transparency, citing trust as a cornerstone of credible journalism. Several leading AI developers, including OpenAI, Microsoft, Google, and Perplexity AI, were contacted for comment but have not yet publicly addressed the findings.
Conclusion:
As AI becomes increasingly integrated into news consumption, the BBC’s report underscores the pressing need for safeguards to protect factual integrity. The broadcaster has pledged to lead collaborative efforts to ensure AI tools enhance, rather than erode, public trust in media.