David Greene Sues Google Over Alleged AI Voice Theft in NotebookLM Podcast Tool

Veteran public radio journalist David Greene has filed a lawsuit against Google, claiming the tech giant unlawfully replicated his voice for its AI product, NotebookLM.

The case, filed in California, adds to a growing wave of litigation challenging how artificial intelligence models are trained — particularly when they appear to mimic the voices, likenesses, or creative outputs of real individuals without consent.

At the center of the dispute is NotebookLM’s AI-generated podcast feature, which Greene says sounds so much like him that it left him “completely freaked out.”

[Image: Google NotebookLM AI podcast interface screen]


The Core Allegation: “It Sounds Like Me”

According to the complaint, the male AI podcast voice featured in Google’s NotebookLM product closely replicates Greene’s voice.

Greene says the resemblance was first pointed out by friends, family members, and professional colleagues. After listening himself, he says he was stunned.

“It’s this eerie moment where you feel like you’re listening to yourself,” Greene reportedly told the Washington Post.

For someone whose career has revolved around broadcasting, storytelling, and spoken journalism, Greene argues that his voice is not just a tool — it is his professional identity.


Why This Lawsuit Matters in 2026

This lawsuit is not merely about one journalist and one tech product. It lands at a pivotal moment when AI-generated media is expanding rapidly across industries.

Voice-cloning technology has become increasingly realistic, blurring the line between synthetic and human speech. And with AI podcasting tools growing in popularity, the legal boundaries remain unsettled.

Legal experts say this case could influence how courts handle future disputes over AI-generated voices.


Who Is David Greene?


Before examining the legal dimensions, it’s important to understand Greene’s broadcasting legacy.

Greene is best known for serving as a host of NPR’s flagship morning program, Morning Edition.

During his tenure, he became one of the most recognizable voices in American public radio.

Today, Greene hosts multiple programs.

For more than a decade, his voice has been synonymous with political analysis, global reporting, and sports storytelling.

That recognition, Greene argues, makes the alleged AI replication especially concerning.


What Exactly Is NotebookLM?

NotebookLM is Google’s AI-powered research and content tool. It can summarize uploaded documents, answer questions grounded in a user’s source material, and generate podcast-style audio discussions of that material.

The product includes a feature that generates a male virtual podcast co-host voice. That voice is what Greene claims mirrors his own.

Google, however, strongly denies the accusation.


Google’s Response: “Baseless Allegations”

In a statement, Google called the lawsuit unfounded.

A company spokesperson stated that the male voice used in NotebookLM’s Audio Overviews is based on a paid professional actor hired by Google. The company denies training its AI model on Greene’s voice recordings and rejects any claim of unauthorized usage.

Google maintains that the allegations are baseless and that no unauthorized voice replication took place.


The Legal Filing: What the Complaint Says

The lawsuit was filed in January in Santa Clara County Superior Court — the heart of Silicon Valley.

The complaint restates Greene’s central claim that NotebookLM’s male podcast voice unlawfully replicates his own.

Notably, the lawsuit does not provide direct proof that Greene’s voice recordings were used in AI training. However, it references analysis from an AI forensic firm.


AI Forensic Analysis: The 53–60% Match

The complaint cites a third-party AI forensic firm that used proprietary software to compare recordings of Greene’s broadcasts with NotebookLM’s AI-generated male podcast voice.

The firm reportedly concluded there was a 53% to 60% confidence level that Greene’s voice was used in training the model.

While the firm considers that range “relatively high” for comparisons between a human and artificial voice, critics argue that such percentages do not constitute definitive proof.

Legal observers note that courts may require far more conclusive evidence to establish liability.
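
The complaint does not describe the firm’s methodology, but voice comparisons of this kind are commonly performed by converting each recording into a speaker embedding and measuring how close the two vectors are. The Python sketch below illustrates that general approach only; the embed_speaker function is a hypothetical placeholder rather than the forensic firm’s actual tool, and the similarity score it produces is not the same thing as the 53% to 60% training-confidence figure cited in the filing.

```python
# Hypothetical sketch of a speaker-similarity check.
# This is NOT the forensic firm's method, which the complaint does not describe.
import numpy as np

def embed_speaker(waveform: np.ndarray, sample_rate: int = 16000) -> np.ndarray:
    """Placeholder for a real speaker-embedding model (e.g., a neural
    encoder) that maps an audio clip to a fixed-size vector."""
    raise NotImplementedError("plug in an actual speaker-embedding model here")

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def compare_voices(human_clip: np.ndarray, ai_clip: np.ndarray) -> float:
    """Embed both clips and return their similarity score. A higher score
    means the two voices sound more alike; it says nothing about whether
    one voice was used to train the other."""
    return cosine_similarity(embed_speaker(human_clip), embed_speaker(ai_clip))
```

Even a high similarity score under an approach like this would only show that two voices sound alike, which is part of why critics say the cited percentages fall short of proving that Greene’s recordings were used in training.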


“My Voice Is the Most Important Part of Who I Am”

Greene’s concern extends beyond financial damages.

He has publicly expressed fear that AI-generated audio resembling him could be used to spread misinformation or promote claims he does not endorse.

He reportedly cited a Guardian article about AI podcast tools being used to amplify questionable claims.

His central argument: If something sounds like him, audiences may assume he endorses it.

In an era of deepfakes and misinformation, that fear is not unfounded.


The Broader Pattern: AI Voice Lawsuits Are Growing

Greene’s case is part of a larger trend in 2025–2026 involving AI voice imitation disputes.

Recent controversies have involved actors, musicians, and other public figures whose voices were allegedly imitated or cloned by AI systems.

As generative AI tools become more accessible, celebrities, journalists, and public figures are increasingly concerned about how their voices and likenesses might be replicated without their consent.


Legal Representation: High-Profile Counsel

Greene is represented by Joshua Michelangelo Stein, a partner at a firm known for its involvement in high-profile AI copyright litigation, including cases involving authors and major tech companies.

Stein has encouraged the public to listen to the sample audio and draw their own conclusions.


The Unsettled Question: Who Owns a Voice in the Age of AI?

This lawsuit raises a fundamental legal question:

Is a person’s voice protected intellectual property?

While U.S. law recognizes “right of publicity” protections covering name and likeness, voice rights are more complex. Courts have previously ruled in favor of singers and actors whose voices were imitated in advertising.

However, AI-generated voices complicate matters because models are typically trained on many speakers, making it difficult to prove that any one person’s recordings were used or that a resemblance is anything more than coincidence.

If Greene prevails, tech companies may need to secure consent before deploying voices that resemble identifiable people and be far more transparent about how their voice models are trained.


Potential Impacts on the AI Industry

If courts find in favor of Greene, consequences could include stricter rules around voice training data, new licensing obligations, and a wave of similar lawsuits from other public figures.

If Google wins, it could strengthen the legal defense that AI-generated outputs are independent creations rather than direct copies.

Either way, the ruling may set precedent nationwide.


Public Reaction and Industry Debate

The case has sparked online debate among journalists, AI developers, and legal commentators.

Some argue that the resemblance is too close to be coincidence and that broadcasters deserve control over how their voices are used.

Others believe the similarity is unremarkable and that, without stronger evidence, the case is unlikely to succeed.


The Road Ahead

The lawsuit is currently in its early stages. The court will likely examine whether Greene’s recordings were actually used in training, how much weight the forensic analysis deserves, and whether existing right-of-publicity protections extend to AI-generated voices.

The case may take months — possibly years — to resolve.

Meanwhile, the AI industry continues expanding at breakneck speed.


Key Takeaways

David Greene alleges that the male voice in Google’s NotebookLM podcast feature replicates his own. Google denies the claim, saying the voice is based on a paid professional actor. The strongest evidence cited so far is a forensic estimate of 53% to 60% confidence, which critics say falls short of proof. Whatever the outcome, the case is likely to influence how courts treat voice rights in the age of AI.


Why This Story Is Trending Now

Artificial intelligence is no longer experimental — it’s embedded in everyday tools.

As AI-generated audio becomes mainstream, disputes like this highlight a tension between rapid technological innovation and an individual’s right to control their own voice and identity.

Whether Greene’s case succeeds or fails, it underscores a broader reality: the legal system is still catching up with artificial intelligence.
