CES, the Consumer Electronics Show, is this huge tech convention they hold every year in Las Vegas. And of course, since tech and show business are now all the same thing (!) Variety was presenting a bunch of talks there, and invited me for one. It was a great conversation. I got to talk a bit about the movie I'm directing for Netflix, about some of the latest news stories across entertainment and AI, about what the future might or might not look like depending on how we collectively handle these next few years, and of course, more! Hope you enjoy, transcript below.

🔴 TODD: Joe, thanks for coming here today and participating. Really great to have you. So let me just start off by saying, I mean, observing, that you've been a student of, an observer of, and an active participant in the intersection of media and technology for more than two decades.

JOE: True enough, yeah.

TODD: So you started this thing called HitRecord in the mid-2000s, which, by the way, just to level set, that's when YouTube first, you know, put up "Me at the zoo." Um, so tell us just a little bit about HitRecord. What was the original vision for that? It's now part of another company, but you know, how have things panned out on that?

JOE: So HitRecord started as just this hobby that I was doing together with my brother, and it grew into an online community of people making art and media together. It eventually became this VC-backed media tech startup, and eventually we had our exit into MasterClass, like you were just talking about. And nowadays, I use the name HitRecord for the work I'm doing developing film and TV and digital content, mostly around trying to spread helpful messages about the future of technology and humanity. But I actually feel like I learned so much from all the different ways that HitRecord took shape over the years. And it's a big part of why I'm here today, having a conversation like this. Because running HitRecord, I learned not only about the technology itself, but the business of technology and how that industry works. And a lot of the stuff that I've been raising my hand about with regards to AI lately, it's not necessarily so much about the tech itself. I'm actually really optimistic and excited about the technology. But the business incentives driving some of the biggest AI companies, I think, could be leading us down a pretty dark path. And if we talk about it and we understand it, I don't think we have to go down that dark path. I think there's still time to go down something much brighter. And, you know, HitRecord to me has always been kind of fundamentally optimistic. It does connect back to my brother, who I started it with. If you ever got to spend time with him, he's like a deeply positive and caring person, which is probably part of why I can't give up the goofy name HitRecord, because, you know, it always makes me think of him.

TODD: Well, we talked a little bit about it, and when the internet first came out, I'm old enough to remember that, and I was like, oh, the future is so bright, everybody's going to seek truth and goodness, and that's not what happened.

JOE: Well, yeah. I was as optimistic as anybody during the rise of social media. That's what HitRecord came from. And look, so much good stuff happened and continues to happen on social media, but, you know, what did happen is a small handful of these kind of gigantic walled gardens came to dominate what was the Internet, and their advertising business model necessitated these algorithms, these engagement optimization algorithms.
And I think it's those algorithms, frankly, that are causing so many of the damaging side effects that we've seen from social media, whether it's mental health or the backsliding of democracy. And we're about to see that all happen again with AI, but worse, because there's so much more compute power now and so much more money at stake.

TODD: Yeah. So what needs to happen? I know, you know, we talked earlier, industries are regulated. There are laws. You know, you drive over a bridge, you take it for granted that, you know, somebody is going to be liable if the bridge falls down.

JOE: That bridge was built to code, as defined in the law. That's right.

TODD: And you can't sell a car that goes a thousand miles an hour and drive it on the street.

JOE: Every major industry has laws except the tech industry.

TODD: Is there, I mean, what do we need here? Do we need the political will to do this?

JOE: Well, just really recently, Australia, I don't know if you know, enacted this landmark legislation banning social media apps for Australians under 16. Someone just clapped in there, yeah. And, you know, I'm a dad. I have a 10-, an 8-, and a 3-year-old. And we're not anti-tech with our kids. They actually use computers a fair amount for cool things that don't have engagement optimization algorithms built into them.

TODD: But you're in favor of the nanny state. I'm just kidding.

JOE: There you go. Well, Elon, no, that's actually not what I'm in favor of. This is the question, right? And I understand why people kind of don't trust governments to make laws that regulate technology. But look, that... your bridge example is actually a really good example.

TODD: Well, it was your example.

JOE: Well, from earlier. They didn't know that. But we take for granted a lot of how there's an interplay between industry and government. And that's how our society works. And there have to be some laws, some guardrails.

TODD: Now, one of the things you've talked about extensively is that AI, again, is neither good nor evil. It just is. One of the things that needs to happen is that the creators from which it draws its source material need to be compensated. How close are we to getting that? And what are the obstacles to getting there?

JOE: Yeah, it's a really important principle to me as an artist and a creator, but I think it's actually important far beyond the entertainment industry or art or creativity. I think it's important for the very viability of a whole economy. The basic principle is that when a person has an idea or does some work, an AI company shouldn't be allowed to just take what they did, put it in their AI model, make money with it, and not pay the person. That doesn't make sense. And it's important to know a little bit about how these models are built, because it's not always obvious. In fact, the name itself, artificial intelligence, sort of implies that, oh, well, there's this artificial intelligence and it just does these things by itself. But it doesn't. That's not how the models work. The way these large language models are built is they take every book ever written, every movie ever made, every article on Wikipedia, probably every article on Variety, every video on YouTube, everything that all these humans have put their time and energy and labor and perspective and skill and talent into, and they take it without permission. They take it without compensation. And now they're generating trillions of dollars of economic value. And that's not going to work.
So one thing I think is really important is that all of us, whether we're an individual creator or a company, whether it's film and TV or music or journalism or whatever else, we all kind of stand for this principle: that moving forward, AI companies are going to need to offer consent and compensation for the data and content they use to train their models.

TODD: Well, we've seen, you know, Disney, Warner Brothers, Universal, they have started taking legal action. They sued several companies. They said, hey, your thing says type in Darth Vader and it spits out Darth Vader. That's clear copyright infringement. What are your thoughts about that? And then, just last month, Disney did this OpenAI deal, where they're licensing 200-plus characters to their sort of video generator company. And you're going to be able to make Mickey Mouse say, hopefully, brand-safe things. But what do you make of all that? I mean, the traditional Hollywood companies, what's their play here? And, you know, what is the play forward?

JOE: I mean, the first thing I would say is I appreciate the Disney leadership saying that they want to protect creators. We'll see how much that wish comes true. But I would broaden it out beyond this deal between Disney and OpenAI. Like I just said a second ago, almost all of these large language models are built on mass theft. So I would hope that any deal done with one of these AI companies doesn't forgive that past theft. Because I would imagine that probably any deal from one of these companies includes a clause that says you release us from any claims for all of your stuff that we stole over the last number of years. That's something I think we all kind of need to get on the same page and agree: no, let's not forgive that past theft. Because whether it takes a year or five years or more, eventually we are going to arrive at the conclusion that this principle is important, that people deserve to be paid for their work. And at that time, we're going to go back and we're going to get recourse for all the stuff that was stolen. Yes, yes. About AI. About AI. Some people, when we made the announcement, were like, you're making a movie out of AI. And I was like, no, it's about AI.

TODD: Are you using AI in any way in this production?

JOE: It's a really complicated question, because AI is a poorly defined term.

TODD: Yeah, does it mean spell check?

JOE: Yeah, exactly. Right. So what I don't want to do is the thing I just said, where using these tools that are built on mass theft, I don't want to do that.

TODD: So Rachel McAdams is starring in this?

JOE: Rachel McAdams is starring in it.

TODD: Not a likeness of Rachel McAdams.