A radio host's voice is at the center of a legal battle with a tech giant. But is it a case of AI-powered theft or a mere coincidence?
David Greene, a renowned radio personality, is suing Google, claiming the company stole his voice for its AI tool, NotebookLM. Greene, known for his distinctive delivery and cadence, says he was shocked to discover that the tool's AI-generated podcasts sounded eerily similar to his own voice. But here's where it gets controversial: Google denies any connection to Greene. A spokesperson stated that the male voice in NotebookLM belongs to a hired professional voice actor, not to Greene.
The lawsuit highlights a growing concern in the tech industry: the unauthorized use of people's likenesses in AI systems. Voice-mimicry claims like Greene's typically rest on right-of-publicity law, which protects a person's name, image, and voice, rather than on copyright. And this isn't the first time a tech company has faced such allegations. In 2024, OpenAI pulled its AI voice, Sky, after Scarlett Johansson said it sounded uncannily like her own and that she had declined to lend her voice to the product. Separately, the use of copyrighted material to train AI models has sparked numerous lawsuits against major tech companies.
And this is the part most people miss: the line between inspiration and imitation is razor thin. AI systems can learn to replicate human voices, but the question remains: where do we draw the ethical boundary? Is it fair to use the sound of someone's voice without their explicit permission?
Google's response raises more questions than it answers. Did the company truly hire an actor who happens to sound like Greene, or is this a case of AI-generated mimicry? For now, the public is left to wonder and debate. What do you think? Is this a clear-cut case of stolen likeness, or are there nuances to consider? Share your thoughts in the comments below!