Note from David Oxenford: Seth Resler of Jacobs Media yesterday wrote on his Connecting the Dots blog about the ease of synthesizing the voice of a celebrity, and the temptation to use that replicated voice in an on-air broadcast. Last week, in an article on policy issues raised by AI, we mentioned that some states have adopted laws that limit the use of synthesized media in political advertising. In his article, Seth quotes Belinda Scrimenti of my law firm, who points out some of the legal issues that arise from using a synthesized voice even in entertainment programming, and especially in commercials. Belinda has expanded on her thoughts and offers the following observations on the use of synthesized personalities on radio or TV.

The advent of artificial intelligence poses interesting and often challenging legal issues because the law is still “catching up” with the technology. Consider the impact of new AI platforms that can learn a person’s voice and then speak any text submitted to them in that voice. If a user submits 60 seconds of Taylor Swift audio to such a platform, the platform can use that sample to learn to “speak” as Taylor Swift, and the user can then have “her” say whatever the user wants.

While some states are considering or have adopted restrictions on impersonation by AI, many legal concepts developed in traditional celebrity impersonation claims already apply to this kind of synthesized celebrity impersonation. Thus, if a broadcaster’s use of Taylor Swift’s voice (whether taped and edited or impersonated by a human) would violate the right of publicity already found in the law of most states, the use of her AI voice would violate those same rights.

The right of publicity is a right based on state law. Elvis Presley’s estate was one of the forerunners in advancing legislation to protect publicity rights in Tennessee, but laws now exist in most states protecting living individuals’ names, images, likenesses, and other identifying features, including the voice. Because these state laws vary considerably, the risk of a claim against a broadcaster will depend, in part, on where the broadcaster’s signal is heard.

The test for infringement of the right of publicity requires no proof of falsity, confusion, or deception; rather, it turns on the “identifiability” of the person. So, in the case of an AI Taylor Swift voice, if the voice is identifiable as Ms. Swift, the use may constitute a violation, regardless of whether the broadcaster explicitly identifies the “voice” as that of Taylor Swift.

Like most celebrities, Ms. Swift might also have claims for copyright violations – for example, if a series of words or phrases protected by her copyrights were used directly, such as a significant portion of her song lyrics or a frequently used catchphrase. She could also allege trademark or unfair competition violations based on false endorsement or false association with her. Finally, many celebrities also have trademarks registered on or associated with their names.

Of course, as under the traditional law governing the right of publicity, there are First Amendment exceptions for certain types of non-commercial speech. The grounds for an exception can vary, but they generally cover a de minimis use that is commentary, criticism, or parody. Hence, in the Taylor Swift scenario, if a radio DJ conducted an obviously over-the-top interview with an AI Taylor Swift that was clearly humorous, fake, and parodic, then, just as with a human impersonator, the broadcaster might be able to defend the use by analogy to the “fair use” right in copyright law. But “fair use” is a tricky concept to apply, as what is found to be “fair” can vary from court to court. In copyright, the mere fact that the bit in which the voice is used is funny does not necessarily make it fair use; that concept usually requires that the bit make fun of the protected work itself, not simply use the copyrighted material for the sake of comedy (see, for instance, our article here explaining the difference). Similar issues may apply here. While a broadcaster never wants to hear this, if you are considering such a use, consult your attorneys.

Other uses of a celebrity’s voice are near-certain to violate the right of publicity and/or constitute false endorsement. Uses that imply that the celebrity is speaking, particularly when endorsing a product or service, will almost always raise significant legal risks. Here are examples of uses likely to raise issues:

  • A radio station creates a recorded promo for a contest using the celebrity’s voice. (Example: “This is Taylor Swift, and all this week, you can win tickets to my concert at the ACME Pavilion by listening to WKRP.”)
  • A radio station produces a commercial using the voice of a celebrity. (Example: “This is Snoop Dogg, and whenever I’m at a cookout, for shizzle I drizzle some Jack’s BBQ sauce on my burgers. Right Martha?” “That’s right, Snoop.”)

Even if the celebrity is not explicitly identified in the spot, the use of their recognizable voice without permission may well give rise to a legal claim.

One final note – a murkier question is whether the AI voice of a dead celebrity is entitled to the same legal protections. The short answer is: it depends. State laws vary as to whether any protection is granted to celebrities after their death and, if so, the length of that protection. Even if the bit is used in a state where there is no protection for a deceased celebrity, online transmission of the bit may subject you to the law of states where such uses are protected. And even where no right of publicity in the voice exists or it has expired, the estates of many famous celebrities have used trademark law and other means to secure protection. Again, the Elvis Presley estate has actively pursued claims, as have those of James Dean and Marilyn Monroe, among others.

In sum, as a broadcaster, you should not treat AI-created celebrity “voices” any differently than you would a human impersonator’s, and in some cases you may need to be even more careful. If you are doing a TV show and viewers can see that a live impersonator is voicing a celebrity, there is likely far less potential for confusion than if a synthesized version of the actual celebrity is doing the talking. Absent clear parody entitled to some limited First Amendment protection, the use of an AI celebrity voice can subject the broadcaster to claims for violations of the celebrity’s rights. So, just because AI gives you the ability to use synthesized celebrities to enhance your programming does not mean that you should do so. As always, think before you act, and talk to your attorneys about the issues that may be raised by any on-air use you are considering.