David Greene Sues Google Over Alleged AI Voice Theft in NotebookLM Podcast Tool
Veteran public radio journalist David Greene has filed a lawsuit against Google, claiming the tech giant unlawfully replicated his voice for its AI product, NotebookLM.
The case, filed in California, adds to a growing wave of litigation challenging how artificial intelligence models are trained — particularly when they appear to mimic the voices, likenesses, or creative outputs of real individuals without consent.
At the center of the dispute is NotebookLM’s AI-generated podcast feature, which Greene says sounds so much like him that it left him “completely freaked out.”
According to the complaint, Greene alleges that the male AI podcast voice featured in Google’s NotebookLM product replicates:
His vocal cadence
His conversational pacing
His tonal inflections
Even filler phrases like “uh” and “like”
Greene says the resemblance was first pointed out by friends, family members, and professional colleagues. After listening himself, he says he was stunned.
“It’s this eerie moment where you feel like you’re listening to yourself,” Greene reportedly told the Washington Post.
For someone whose career has revolved around broadcasting, storytelling, and spoken journalism, Greene argues that his voice is not just a tool — it is his professional identity.
Why This Lawsuit Matters in 2026
This lawsuit is not merely about one journalist and one tech product. It lands at a pivotal moment when AI-generated media is expanding rapidly across:
Podcasts
Audiobooks
News summaries
Political commentary
Advertising
Voice-cloning technology has become increasingly realistic, blurring the line between synthetic and human speech. And with AI podcasting tools growing in popularity, the legal boundaries remain unsettled.
Legal experts say this case could influence:
AI training transparency standards
Voice rights and biometric ownership laws
Corporate disclosure obligations
Compensation frameworks for voice replication
Who Is David Greene?
Before examining the legal dimensions, it’s important to understand Greene’s broadcasting legacy.
Greene is best known for hosting NPR’s flagship morning program “Morning Edition” (2012–2020).
During his tenure, he became one of the most recognizable voices in American public radio.
He has since hosted other audio programs, including:
“Sports in America with David Greene” for PRX and WHYY
“David Greene Is Obsessed” for Campside Media
For more than a decade, his voice has been synonymous with political analysis, global reporting, and sports storytelling.
That recognition, Greene argues, makes the alleged AI replication especially concerning.
What Exactly Is NotebookLM?
NotebookLM is Google’s AI-powered research and content tool that can:
Summarize documents
Generate audio overviews
Create conversational-style podcast segments
Transform written material into spoken dialogue
The product includes a feature that generates a male virtual podcast co-host voice. That voice is what Greene claims mirrors his own.
Google, however, strongly denies the accusation.
Google’s Response: “Baseless Allegations”
In a statement, Google called the lawsuit unfounded.
A company spokesperson stated that:
The male voice used in NotebookLM’s Audio Overviews is based on a paid professional actor hired by Google.
The company denies training its AI model on Greene’s voice recordings and rejects any claim of unauthorized usage.
Google maintains that:
The voice is synthetic
It was professionally developed
It does not use Greene’s recordings
The Legal Filing: What the Complaint Says
The lawsuit was filed in January in Santa Clara County Superior Court — the heart of Silicon Valley.
Key points in the complaint include:
Allegation of unauthorized voice replication
Claim of violation of publicity and voice rights
Assertion that Greene’s professional identity has been compromised
Concern over potential misuse of the AI voice to generate content he would never endorse
Notably, the lawsuit does not provide direct proof that Greene’s voice recordings were used in AI training. However, it references analysis from an AI forensic firm.
AI Forensic Analysis: The 53–60% Match
The complaint cites a third-party AI forensic firm that used proprietary software to compare:
Greene’s recorded voice
The NotebookLM AI voice
The firm reportedly concluded there was a 53% to 60% confidence level that Greene’s voice was used in training the model.
While the firm considers that range “relatively high” for comparisons between a human and artificial voice, critics argue that such percentages do not constitute definitive proof.
Legal observers note that courts may require far more conclusive evidence to establish liability.
Greene’s Fear: Misinformation and Misuse
Beyond the question of replication itself, Greene has publicly expressed fear that AI-generated audio resembling him could be used to:
Spread misinformation
Promote conspiracy theories
Lend credibility to controversial narratives
He reportedly cited a Guardian article about AI podcast tools being used to amplify questionable claims.
His central argument: If something sounds like him, audiences may assume he endorses it.
In an era of deepfakes and misinformation, that fear is not unfounded.
The Broader Pattern: AI Voice Lawsuits Are Growing
Greene’s case is part of a larger trend in 2025–2026 involving AI voice imitation disputes.
Recent controversies have included:
Claims that a ChatGPT voice resembled Scarlett Johansson
Deepfake advertisements allegedly mimicking Taylor Swift
As generative AI tools become more accessible, celebrities, journalists, and public figures are increasingly concerned about:
Voice cloning
Digital impersonation
Unauthorized likeness usage
Legal Representation: High-Profile Counsel
Greene is represented by Joshua Michelangelo Stein, a partner at Boies Schiller Flexner.
The firm is also known for involvement in high-profile AI copyright litigation, including cases involving authors and major tech companies.
Stein has encouraged the public to listen to the sample audio and draw their own conclusions.
The Unsettled Question: Who Owns a Voice in the Age of AI?
This lawsuit raises a fundamental legal question:
Is a person’s voice protected intellectual property?
While U.S. law recognizes “right of publicity” protections covering name and likeness, voice rights are more complex. Courts have previously ruled in favor of singers and actors whose voices were imitated in advertising, most famously in Bette Midler’s 1988 suit against Ford over a sound-alike commercial.
However, AI-generated voices complicate matters because:
They may not use direct recordings
They rely on pattern-based modeling
Similarity may arise statistically
If Greene prevails, tech companies may need to:
Implement stricter training data audits
License recognizable voice characteristics
Provide transparency reports
Potential Impacts on the AI Industry
If courts find in favor of Greene, consequences could include:
Increased compliance costs
New consent frameworks
AI voice model redesigns
Mandatory disclosure of training sources
If Google wins, it could strengthen the legal defense that AI-generated outputs are independent creations rather than direct copies.
Either way, the ruling may set precedent nationwide.
Public Reaction and Industry Debate
The case has sparked online debate among:
Journalists
Podcasters
AI developers
Media ethicists
Some argue that:
AI voices inevitably resemble real people
Similarity does not equal theft
Others believe:
Consent must be required
Public trust is at stake
Media professionals deserve protection
The Road Ahead
The lawsuit is currently in early stages. The court will likely examine:
Training data documentation
Forensic methodology
Expert testimony
Consumer perception evidence
The case may take months — possibly years — to resolve.
Meanwhile, the AI industry continues expanding at breakneck speed.
Key Takeaways
David Greene alleges Google’s NotebookLM copied his voice.
Google denies using his recordings.
AI forensic analysis suggests partial similarity.
The lawsuit could influence voice rights law.
AI voice replication cases are increasing globally.
Why This Story Is Trending Now
Artificial intelligence is no longer experimental — it’s embedded in everyday tools.
As AI-generated audio becomes mainstream, disputes like this highlight a tension between:
Innovation
Personal rights
Corporate responsibility
Public trust
Whether Greene’s case succeeds or fails, it underscores a broader reality: the legal system is still catching up with artificial intelligence.