AI the Law & You Newsletter

Release Date:

Friday, February 23rd, 2024

Episode Transcription:

[00:00:00] Mark Miller: You are listening to “AI the Law & You”, a show where a lawyer, a technologist, and a layman discuss a recent legal case against an AI company. These are not scripted. What you’ll hear is a real conversation as we talk, argue, and cajole each other to think deeper about the legal aspects of using AI and what people should be concerned about when using one of the platforms.

In today’s episode, we’ve got a very interesting case featuring a deep fake of a George Carlin comedy routine. We’re also adding a new host, the voice of AILAH, an AI engine who will be giving us real time responses to questions within the discussion. Shannon and I were intrigued by some of the ideas AILAH came up with, which moved our thoughts in a direction different from where we originally intended. Adding AILAH to the mix turned this into a very interesting discussion.

Today’s court is now in session.

We’ll start today’s show with a summary of the case. AILAH, give us a quick overview of the George Carlin case filing.

[00:01:13] AILAH: In the George Carlin deep fake case, a video surfaced online purportedly showing the late comedian George Carlin delivering a monologue about the COVID-19 pandemic. However, it was later discovered that the video was a deep fake, a digitally altered video created using artificial intelligence to manipulate Carlin’s likeness onto another person’s performance.

This incident highlighted concerns about the rise of deep fake technology and its potential to spread misinformation and deceive viewers. It also underscored the importance of verifying the authenticity of digital content and raising awareness about the risks associated with deep fakes in the digital age.

[00:01:54] Mark Miller: Based upon what AILAH just said, what’s the real controversy, Shannon, about George Carlin versus Dudesy in the lawsuit?

[00:02:06] Shannon Lietz: Yeah, this is kind of an interesting case if you ask me, and the reason why is ’cause it only comes to our landscape due to AI. It’s the use of AI that’s most interesting and provocative.

The real controversy here is these folks are trying to build a lawsuit and have filed one. A similar context would be a human impressionist, in terms of what they’re doing reproducing Carlin’s works. Now that he’s dead and gone, I think the estate is trying to protect its rights, if you will, when it comes to George Carlin.

And so this goes down the rabbit hole of the inventors of Dudesy essentially taking all of George Carlin’s works and shoving them into AI. I think that’s why we’re talking about it: well, can you do that, from a technical perspective? Lots of people are doing things like taking books and pushing ’em into AI engines, or taking different types of videos, turning them into embeddings, and putting those into their products.

I actually think this is the case, honestly, that breaks the internet if it gets decided within a courtroom. And the reason why I say that is because I actually think there’s a whole lot here.

[00:03:28] Mark Miller: There’s a couple of tangents we could pull on here. One is that, based upon our research, there have got to be hundreds if not thousands of different jurisdictions. There’s nothing that overall encompasses what’s going on here.

[00:03:48] Shannon Lietz: Yeah, that’s a good point. When you really dig into the meat and potatoes of this case, it’s quite interesting, and I think this is where we’ve gotta learn a little bit about what a dead person’s rights are, and what their estate’s rights are when it comes to protecting things like this.

But for me, what I really lean on is that I question whether this is a copyright infringement case. I am trying to understand where this is coming from. In some ways, the authors of Dudesy sort of brought back and resurrected George Carlin. Isn’t that kind of good for the estate? I have that in my mind.

But ultimately, I guess on this case in particular, the controversy I see is basically just that the use of AI is questionable. People are frustrated by it, and it was a good way to get this concept brought forward in the law.

I think jurisdiction is absolutely a concern if we were to look deeper into protectionist rights. I don’t know much about those; I am no lawyer. But I will say, I’m deeply curious, because as a creator, as an author, I’m always trying to understand what I need to do. I have a whole lot of questions around what you actually need to do to protect your voice, your authored works, your creativity when we’re thinking about AI in general.

[00:05:17] Mark Miller: The thing that you and I have talked about too is that we’re not seeing the idea of impressionism coming up in these discussions at all. Because if I were the defense, I’d be going, “We are just creating an impressionist, just like Rich Little,” who would take the voice of any celebrity, put words into that celebrity’s mouth, and call it comedy.

That seems like a defense to me.

[00:05:50] Shannon Lietz: I don’t know enough about what has to be claimed from a defensive perspective on this. I know a few things, and we’ve talked to Joel a lot about this: copyright infringement and the notion of fair use and transformative work.

The question I have is, were any George Carlin jokes actually repeated in the Dudesy videos? If there were, then potentially the prosecution has a case, and by the way, I kind of really want the prosecution to have a case here. But I also know the likelihood of that truly working out is probably low, because on the defense’s side, unless somebody is really copying the work George Carlin put out (again, that’s what copyright protects), we’re talking about something entirely different, which is where I think the controversy is in the first place.

[00:06:46] Mark Miller: Let’s make a clarification between this case and the Sarah Silverman case. I don’t want to go down the rabbit hole of Sarah Silverman’s case, but she says they copied directly from her content, that the engine itself pulled from it directly. That’s completely different from what we’re talking about here.

I just wanted to make that clarification.

[00:07:10] Shannon Lietz: I think so. Again, I haven’t sat down with the Dudesy information. I’m not sure there have been transcripts pulled from it, but if you took a transcript and compared it to a transcript from George Carlin, would you find any overlaps?

And then the question would be how much copying does it take to get a copyright infringement suit to actually stick? And I don’t have those answers.

[00:07:38] Mark Miller: Let’s talk to AILAH. When you’re looking at the George Carlin deep fake lawsuit, what do you see as the main controversy in that lawsuit?

[00:07:50] AILAH: The main controversy in the George Carlin deep fake lawsuit likely revolves around the balance between freedom of expression and the protection of individuals’ post-mortem rights. On one hand, there’s the argument for preserving the integrity of Carlin’s legacy and preventing unauthorized use of his likeness for potentially harmful or deceptive purposes.

On the other hand, there’s the consideration of free speech and artistic expression, as well as the evolving nature of technology and its implications for legal frameworks surrounding identity rights and digital manipulation. Resolving this controversy involves navigating complex legal and ethical questions about the intersection of technology, privacy, and intellectual property rights.

[00:08:37] Shannon Lietz: Well, AILAH brings up a really good point in that summary of the controversy. I suspect that this notion of post-mortem rights or name-image-likeness protection of a dead person is something we’d love to have more understanding of. Because when you think about AI being used on a dead person’s works, is that going to be protected in perpetuity? Is an estate going to continue to have those protections? I’d love to know a little bit more there.

[00:09:06] Mark Miller: You’re saying on a dead person’s work. It’s interesting because if you take this back historically, let’s say that you talk about Monet and his paintings. If I as a painter go and paint in the style of Monet, that cannot be protected as long as I don’t pretend it’s his work.

[00:09:28] Shannon Lietz: That’s an interesting thought process, and this is actually where I think some of these cases are really intriguing, because what we’re really saying is that people’s works are being reverse engineered to pull the algorithm of their style forward, potentially even remastering their voices.

And when I did some digging, and again, not a lawyer, one of the things I fell onto was: does this case actually have the right claims in it? What about trade secret protection? What about patenting? Do creators now need to patent their work as an algorithm?

I’m really curious about where this is gonna go, because you’re effectively saying that a machine with the ability to ingest enough information can now create a statistical algorithm of any person, their creative works, their voice, their thinking, and reproduce it for somebody else who can fake it. How will fakes start to turn out? I definitely think this has something to do with content credentials and this notion of authenticity, to your point.

I’d love to learn a little bit more about what is behind this concept of post-mortem rights, because AILAH did bring that up, and I think we should learn a little bit more there.

[00:10:52] Mark Miller: The interesting thing you just brought up for me is creating a statistical image, if I can use that word, of a specific person. That’s what Amazon, Netflix, and Google already do with the content they’re ingesting about you as a person.

Based upon what we’ve been talking about, can someone protect the name, image, and likeness of a dead person post-mortem?

[00:11:22] AILAH: Yes. In many jurisdictions, laws exist to protect the name, image, and likeness of a deceased individual, commonly referred to as post-mortem rights. These laws vary by region, but generally aim to safeguard the reputation, legacy, and privacy interests of the deceased person and their heirs.

Such protections can include the right to control the commercial use of the deceased individual’s likeness, prevent unauthorized endorsements or associations, and preserve the integrity of their identity.

However, the extent and specifics of post-mortem rights can differ depending on the legal framework in place.

[00:11:57] Mark Miller: How does that apply to the George Carlin case?

[00:12:00] AILAH: In the George Carlin case, post-mortem rights could potentially apply if the deep fake videos using Carlin’s likeness are deemed to violate his post-mortem rights. These rights may encompass protections against unauthorized use of Carlin’s image or likeness for commercial purposes, or in a manner that could harm his reputation or legacy.

If the plaintiffs in the case can demonstrate that the deep fake videos infringe upon Carlin’s post-mortem rights, they may have legal grounds for pursuing remedies such as damages or injunctions to stop the dissemination of the videos. However, the specific application of post-mortem rights in this case would depend on factors such as the jurisdiction where the lawsuit is filed and the interpretation of relevant laws by the court.

[00:12:45] Mark Miller: The interesting comment there, and it’s been consistent, is this: is it being used for commercial purposes?

[00:12:56] Shannon Lietz: Yeah, absolutely.

[00:12:58] AILAH: That’s a crucial point to consider in the George Carlin case. The determination of whether the deep fake videos are being used for commercial purposes could significantly impact the legal analysis. If the videos are used for commercial gain, such as for advertising, endorsements, or other profitable ventures, it could strengthen the argument for infringement of Carlin’s post-mortem rights.

However, even if the videos are not directly used for commercial purposes, other factors, such as the potential harm to Carlin’s reputation or the deceptive nature of the deep fakes, could still be relevant in assessing the legality of their creation and distribution.

[00:13:37] Shannon Lietz: AILAH brings up a really good concept here of post-mortem rights. One thing I’m noticing, though, in the way she’s talking about it is that it’s about copying from Carlin. And I’m not certain that AILAH actually understands that what we’re talking about could potentially even be transformative use.

So I’m curious if someone were to do transformative use of a deceased person’s information, all of their created work and leverage that, what would that do?

[00:14:17] Mark Miller: It kind of leads us to the question of who has the most likely chance of winning and why. And I’ll start this off by saying, based upon our discussions and our research, I think the defense has a really good case.

This is transformative work. It’s “in the style of”; it’s not being used commercially to sell anything. Although there is merch, is the voice itself trying to sell the merch? Anyway, I think that the defense has a decent case.

[00:14:54] Shannon Lietz: Yeah, I agree with you, and I would even go further. It makes me really sad that the defense has a pretty good case, because I don’t think that creators really anticipated this future use of their materials, in which case they might actually have tried to protect them from AI back when.

So at this point in time, if I’m a creator, which I am, I’m starting to think about how I might protect the information or the creations that I’m pushing out to a public audience. If you look up George Carlin on the internet, there are thousands of George Carlin snippets that, from what I can see just scouring, aren’t right now being protected against copying and infringing use. The defense has a pretty decent case on their own.

I guess the question on the other end of this, to your point, is the merch involved. That does weaken the defense’s case: hey, we’re not just doing this for giggles, we’re doing this for profit. On the side of the prosecution, who’s really trying to protect the original author or creator, the human in this case, I’m really curious about what somebody might have to do if they’re the human creator going forward, based on this case.

So this one is on my watch list of how it will turn out. Like I said, I think that unless it settles, and we do see some sort of opinion come out of it, it’s going to set precedent for lots of potential future issues.

[00:16:34] Mark Miller: Let’s leave the final thought to AILAH. AILAH, in the current situation, how can creators protect themselves from unfair use?

[00:16:48] AILAH: Creators can take several steps to protect themselves, especially in cases similar to the George Carlin deep fake situation.

1. Copyright protection. Creators should ensure they have proper copyright protections in place for their original works, including videos, images, and audio recordings. This can help prevent unauthorized use or manipulation of their content by others.

2. Watermarking and attribution. Adding visible or invisible watermarks to content can help identify the original creator and deter unauthorized use. Additionally, requiring attribution for the use of content can help ensure that creators receive proper credit for their work.

3. Terms of use and licensing agreements. Creators can establish clear terms of use for their content and require licensing agreements for its use. These agreements can outline how the content may be used, whether for commercial or non-commercial purposes, and specify any limitations or restrictions.

4. Monitoring and enforcement. Regularly monitoring the use of their content online can help creators identify instances of unfair use or infringement. When unauthorized use is detected, creators can take appropriate enforcement actions such as issuing takedown notices or pursuing legal remedies.

5. Public awareness and education. Educating the public about the ethical and legal implications of using and sharing content without permission can help deter unfair use. Creators can also advocate for stronger legal protections and enforcement measures to combat infringement.

By proactively implementing these strategies, creators can help safeguard their intellectual property rights and minimize the risk of unfair use or exploitation of their work.

[00:18:35] Mark Miller: That’s all good for future works, but how will people protect previous works?

[00:18:42] AILAH: Protecting previous works can be more challenging, but there are still steps creators can take to address potential instances of unfair use.

1. Copyright registration. If the works are eligible for copyright protection and have not already been registered, creators can consider registering them with the appropriate copyright office. While copyright registration is not required for protection, it can provide additional legal benefits and remedies in case of infringement.

2. Monitoring and enforcement. Creators can actively monitor the use of their previous works online using automated tools, search engines, and social media platforms. When unauthorized use is detected, they can take appropriate enforcement actions such as issuing takedown notices or pursuing legal remedies.

3. Retroactive licensing. Creators may explore the possibility of retroactively licensing their previous works to individuals or entities that have used them without permission. This could involve negotiating licensing agreements and seeking compensation for past use.

4. Public awareness campaigns. Creators can raise awareness about the importance of respecting intellectual property rights and the ethical considerations involved in using and sharing content without permission. By educating the public about these issues, creators may help deter future instances of unfair use.

5. Legal remedies. In cases of egregious infringement or damage to the creator’s reputation, pursuing legal action may be necessary to protect their rights and seek appropriate remedies.

Consulting with a qualified intellectual property attorney can help creators assess their options and determine the best course of action. While protecting previous works may present challenges, creators can still take proactive steps to assert their rights and address instances of unfair use to the extent possible.

[00:20:37] Mark Miller: Your eyebrows and mine went up when she said one of those things. What was that?

[00:20:43] Shannon Lietz: Retroactive licensing?

[00:20:45] Mark Miller: Yeah. That’s something that I have not seen before.

[00:20:50] Shannon Lietz: That definitely hasn’t been something we’ve seen.

I’m curious about all these things, and listening to an AI do some legal analysis is definitely interesting. We’ve got to be careful, again; for anybody listening, this is absolutely not legal advice, but it’s provocative nonetheless.

[00:21:07] Mark Miller: The interesting thing for me is the unreasonableness of some of these situations. For me to monitor every social network for my personal work? It’s not going to happen. I mean, here’s a question: there might be companies that get set up to say, “We can be your enforcement for a fee. We will be your enforcement monitor.”

[00:21:33] Shannon Lietz: Yeah. And on my end, thinking through this, it means that we’re going to see more internet agencies pop up, really, folks that will protect the creator. I think that’s gonna be where it heads.

I think the use of content credentials will easily get us there. And this is where I’m going to start to sound as naive as I have been since we entered into this whole “AI the Law & You” endeavor, which is: I think you were right when you started bringing up the music industry in some of our conversations, and just exactly where this is potentially going to go.

I’m curious about how it’s all gonna turn out.

I think we’re going down a path where my tradecraft of cybersecurity could become even more interesting and more useful in the future.

Specifically where it intersects with brand, AI, data, and some of these interesting questions we all have.

[00:22:30] Mark Miller: That adjourns our session for today. If you enjoyed the conversation, you can pay for our services by subscribing to “AI the Law and You” on your favorite podcast platform. Thank you to AILAH for joining us today. You will be hearing more from her in future episodes.

“AI the Law and You” is a Sourced Network Production.

