When developing a generative AI strategy, media agencies are weighing not just content generation but also what the technology means for their overall data strategy.
At the Digiday Media Buying Summit in Nashville this week, we talked about how media agencies are leveraging AI in practical ways, such as optimizing existing media. Speaking on stage at the summit, iProspect Chief Growth Officer Amanda Moore said the Dentsu-owned agency uses public information such as weather data to make messaging more personalized without relying on any personally identifiable information.
“We really see this year as the year where we see more practical AI than just generative AI,” Moore said. “It can produce something, but there's a lot of bias there, and you can make some mistakes… This year, we're really trying to figure out what it means to actually act. You're going to start to see very concrete, practical examples of testing and learning with data.”
One of the first things IPG did was develop a platform that gave the agency's 13,000 employees access to a custom version of ChatGPT. This lets employees get familiar with generative AI before they advise clients on it.
According to Graham Wilkinson, chief innovation officer at Kinesso, an IPG company, organizing your brand documentation is a critical step in developing an AI strategy. He said that before thinking about AI outcomes, many brands need to make sure their brand strategy is consistent and their data architecture is in place. Wilkinson said it's important for companies to understand their tone, brand hierarchy, target audience and brand safety standards to create the right framework from which to develop an AI strategy.
“It’s like the 23andMe of brands,” he said. “How do I take a blood sample and tell them exactly what my brand is? Once that's done, we can start thinking about the technical aspects of emulating that.”
Media agencies are also testing how generative AI can help them understand their audiences. For example, one agency built a tool that ingests first-party data to construct a synthetic audience, which planners can then ask questions of. But developing AI tools in-house is resource-intensive: one agency said it took three months and dozens of engineers to build its own privacy-compliant AI environment. Some argue that investment is necessary.
Other agency executives say generative AI applications in audio ads can also undermine the credibility of content. Maria Tullin, senior vice president and managing director of performance audio and podcast strategy at Horizon Media, said that while AI-generated audio in ads may work for generic voice-overs, it doesn't work for host-read ads because it doesn't sound like a real person. She also noted that human voices and AI-generated voices should carry different price points.
“I hate the idea of the voice of the host being AI, because the whole point of podcasting is authenticity and tapping into an audience that loves this host,” Tullin said. “I don't like it opening the door for people to support things without really meaning it.”
Not everyone is avoiding AI-generated voices, though; some say they're effective. A recent study by Veritonic found that Intel's generic AI audio ads increased brand favorability by 10%, while personalized AI audio ads lifted brand favorability by as much as 22 points. According to Veritonic, AI-powered personalization of audio ads also increased purchase intent. The ads were developed in collaboration with Instreamatic and purchased through a campaign run by Dentsu.
Agencies still see AI problems around diversity and representation. Alvin Gray, chief strategy officer at Response Media, said the company has been experimenting with various AI photo apps, but they are still “completely inadequate” when it comes to accurately portraying people. Gray, who is Black, recalled that when he uploaded a photo of himself to one of the apps, the AI-generated image made him “look like an Indian man.” When he tried another app, the AI-generated image made him look Polynesian.
“There's confusion in the data,” Gray said. “So from a representation standpoint, it's really important to make sure the people writing the code understand the data they're capturing.”
Albert Thompson, managing director of digital innovation at Walton Isaacson, said generative AI should not be a “cultural cheat code” for marketers. Like Gray, he pointed to the inadequacies and inaccuracies of the multicultural datasets used in AI models.
Algorithmic bias in programmatic advertising also still needs to be addressed, said Sherine Patrick, lead media strategy advisor at Ops Shop. She gave the example of a programmatic ad block list that includes the word “bomb,” a word that is also used in a culturally positive sense.
“Actually, culturally, we use this word to describe something good,” she said. “Like my bombshell hair. You know what? So if it's on a page, that publisher is blocked and automatically can't get funded.”