When Getty Images CEO, Craig Peters, said he didn’t want the content industry to be AI’s “batteries”, the audience gathered on a Wednesday morning in Cannes applauded.
He was, absolutely, preaching to the converted – a room full of creative execs at Mip TV’s Mip Lab/Innovation Summit earlier this month.
But his message was much broader, with a strong call to action.
“How do we engage to make this technology one that lifts up the creative segment of the creative industry, which represents about 10% of global GDP?” he asked.
“How do we make sure that AI is added to what we do that allows us to tell better stories to reach our audiences more effectively?”
“AI will allow for better storytelling. If we use it as such. I think there are positives to come out of any technology. And there are negatives that come out of any technology. It’s in our hands to decide whether we get the positive or the negative,” he said.
Peters outlined the four elements of AI – chips/graphics processing units (GPUs), computing/cloud, talent and training data.
He rephrased the ‘training data’ component as “content”.
More specifically, he said: “It’s our content. It’s our collective content. In The Matrix, we are the batteries behind AI.
“The reality is the people in this room are the batteries behind AI. Right now, your content is training these systems. Has anyone got a royalty check? Checking. No? It’s a pretty quiet room.”
“Fundamentally,” Peters added, “I don’t want us to be the batteries. It’s not a very uplifting idea.” Applause.
So what are the questions we need to be asking and the conversations we need to be having?
“We need to understand the tools that we choose. We want AI to respect our IP, our collective IP. We need to ask, ‘Where did that data come from that it was learned on? How did you source that? Was it ethically sourced?’ These are questions I don’t think are being asked too much today. But it’s growing,” Peters said.
Like other speakers during the Innovation Summit, he warned against technology for technology’s sake.
“Think about how you use AI,” Peters advised.
“Are you unlocking creativity?... Don’t just deploy technology for technology’s sake. We think it’s very useful. We’re introducing these tools… and we’re excited about what AI can bring. But I think we also have to be critical in how it’s being deployed, and be thoughtful about how we engage,” he said.
“We’re not Luddites. We believe in technology. And we believe that it can enhance creativity… So we wanted to embrace it early, but we wanted to build something that was commercially responsive and socially responsive… that can’t produce socially irresponsible content,” he added.
He highlighted AI’s current high-hype cycle.
“I think everybody’s professing that they’re an expert on AI… I think anybody who professes that they are an expert on AI is an idiot.
“We are so early in this. I think every industry is struggling to figure out, is this helpful? Is it hurtful? I think they’re struggling to acknowledge their own ignorance. Or maybe they’re very knowledgeable but the level of probability of outcomes is a mess.”
“There are so many different things happening, and they all can be true. But we don’t know which one we’re living in. And we don’t know which one is going to happen.
“I think that truthfully, when you get people behind closed doors, this is really what they tell you. Everybody is trying to figure it out. It has been thrust on us and it’s been less than two years. And we’re trying to figure it out. I think that’s the reality of just about every industry, except for the ones that are on LinkedIn that are perfect.”
He described AI/tech developments as a “new set of tools” that could enhance creativity and storytelling.
“At the same time, I think it’s going to produce a lot more crap content. And I think it’s going to be incumbent upon all of us to produce content that can stand out, content that can engage an end audience. The magic technology is great, but if the content is shit, it’s still shit.
“So we’re going to have to stand out, we’re going to have to be more thoughtful.”
Peters said a percentage of the revenue earned from content used to train the AI goes back to the creator.
“That’s not just a one time payment, it’s an ongoing payment,” he promised.
His wish list for an AI environment includes transparency and accountability, including disclosure of when content has been generated using AI.
“We don’t think model developers should stand behind indemnification and lack of liability. We should be accountable for developing responsible models. And as those models go out into the universe and get used, we should have some responsibility for that. Those are the principles we’re advocating for.
“It’s not to try to put this in a box or turn back time. We think it’s about having a conversation early about what we want this to be for our society.”