As agencies evolve AI tools for influencer vetting, they’re also discovering the tech’s limitations

In the last year, influencer agencies have adopted generative AI applications to reduce the time it takes to organize creator participation in brand campaigns. Client reactions to these solutions have been mixed.

In recent months, agencies working in this area have found a clear application of AI tools in brand safety.

Creator screening can take “a few days or a couple weeks, depending on how deep the analysis is,” said James Clarke of PepsiCo. AI solutions are designed to cut that time to just a few minutes.

While the setups vary between agencies, most combine AI-assisted searches with generative AI apps that analyze social posts. Brands use the systems to identify creators who are a good match, and to disqualify those whose work does not meet the client’s guidelines. In recent years, marketers have become more cautious and sensitive when choosing influencer partnerships; according to agency executives, guardrails could include depictions of drinking, swearing, or political speech.

Aynsley Moffitt, director of growth and product at Open Influence, said that the “majority of clients” limit vetting to six months’ worth of social media activity, though some have asked the company to review content from up to five years ago. “This comprehensive approach helps protect brands while identifying the creators who will be genuine advocates for our clients,” Moffitt said.
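The screening setup the executives describe, a client-specific lookback window plus a set of guardrail categories, can be sketched roughly as follows. The category lists, field names, and keyword matching are illustrative stand-ins; the real systems use generative AI, not keyword matching, to classify posts.

```python
from datetime import datetime, timedelta

# Hypothetical guardrail categories an agency might configure per client.
GUARDRAILS = {
    "alcohol": ["beer", "wine", "cocktail"],
    "profanity": ["damn"],
    "politics": ["election", "senator"],
}

def vet_posts(posts, lookback_days=180):
    """Return posts inside the lookback window that trip any guardrail.

    `posts` is a list of dicts: {"date": datetime, "text": str}.
    The default 180-day window mirrors the six-month vetting scope
    Moffitt says most clients ask for.
    """
    cutoff = datetime.now() - timedelta(days=lookback_days)
    flagged = []
    for post in posts:
        if post["date"] < cutoff:
            continue  # outside the client's vetting window
        text = post["text"].lower()
        hits = [cat for cat, words in GUARDRAILS.items()
                if any(w in text for w in words)]
        if hits:
            flagged.append({"post": post, "categories": hits})
    return flagged
```

A client asking for a five-year review would simply pass a larger `lookback_days`; the guardrail set would be swapped per client.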

Ben Jeffries, co-founder and CEO of Influencer, said: “It is a first line of defence.”

With the time saved, an agency could take on more briefs or manage a greater number of creator relationships, allowing it to service bigger campaigns. “It’s a scale and efficiency play,” Mae Karwowski, founder of WPP’s creator agency Obviously, told Digiday. “The more time we save on the ‘first pass’ of content or the review of creator profiles, the more we can spend on optimization and strategy.” Obviously is not the only agency responding: Viral Nation has offered a similar solution since 2023.

Lightricks, the software publisher behind Facetune, is currently developing SafeCollab, an AI-based creator screening tool. The company began an open beta in November, after working with six brand advertisers, including PepsiCo, through 2024.

According to Corbett Drummey, Lightricks’ vp of brand partnerships, the software scans and analyzes an influencer’s activity using both public social posts and access granted by creators who use Lightricks’ Popular Pays influencer platform. Instagram and TikTok have been tested; X and YouTube are next. The software “basically does a cursory Google internet research for them, summarizes this, and then summarizes the [social] contents that it has indexed,” he explained.

PepsiCo’s Clarke said the SafeCollab beta is one of many ongoing trials of AI in the company’s creator marketing.

“By leveraging new technologies… we’re confident that our teams will move faster, operate efficiently, and increase the effectiveness of creator vetting,” he said. Clarke declined to reveal the “red lines” PepsiCo uses to disqualify creators. Drummey, who declined to give specific financial figures, said that each background report generated by the software costs several hundred dollars. Lightricks’ software is designed as a self-service application for in-house teams, but agencies like Influencer, Obviously, and Open Influence offer similar solutions for vetting creators.

Creator agency Props offers an API-based solution that uses Google’s Gemini model, while Influencer has been using proprietary software built on API access to ChatGPT since October to service its entire client roster.
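An API-based wrapper of this kind can stay model-agnostic by injecting the completion call, so the same vetting logic works over ChatGPT, Gemini, or any other backend. The prompt wording and JSON response shape below are assumptions for illustration; none of the agencies quoted here have published their prompts.

```python
import json

def build_vetting_prompt(caption, guidelines):
    """Assemble the instruction a vetting wrapper might send to an LLM.

    Both the wording and the requested JSON shape are hypothetical.
    """
    return (
        "You review social posts for brand-safety risk.\n"
        f"Client guidelines: {', '.join(guidelines)}.\n"
        f"Post caption: {caption!r}\n"
        'Reply with JSON only: {"risky": true/false, "reason": "..."}'
    )

def vet_caption(caption, guidelines, complete):
    """Classify one caption against client guidelines.

    `complete` is any callable that sends a prompt string to an LLM API
    and returns the raw text of its reply, e.g. a thin wrapper over the
    ChatGPT or Gemini client libraries.
    """
    raw = complete(build_vetting_prompt(caption, guidelines))
    return json.loads(raw)
```

Swapping providers then only means swapping the `complete` callable, not the vetting code.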

Before that, Influencer’s employees had to manually comb through a creator’s feeds to approve them. Jeffries said the AI solution has sped up the work of clearing influencers for partnerships, monitoring campaign outcomes, and informing the agency’s post-campaign reporting.

Automating the creator vetting process has implications beyond saving time. The tendency of large language models to hallucinate responses, or to reproduce the biases in their training data, can complicate decision-making. Megan Matera, director of client satisfaction at Props, said that Ollie, the agency’s image-analysis solution, sometimes misinterprets what it’s being asked to do. In one example, a creator posted a picture of someone drinking a beer in a spa; Ollie flagged the image as “they were bathing with beer.”

Drummey said his team is adding custom filters to SafeCollab because its default settings flagged a large number of posts as potentially risky. Clients had complained that the results were “too alarmist,” he said, and “we are going to have to change the way we display this stuff.” Rather than disqualifying creators outright, the systems flag problematic posts for a human staffer.
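The “flag for a human staffer” step, with per-client filters to damp an over-alarmist default model, amounts to a simple triage over risk scores. The threshold value, field names, and category labels here are hypothetical:

```python
def triage(scored_posts, threshold=0.8, ignore_categories=()):
    """Split model-flagged posts into a human-review queue and an auto-pass list.

    `scored_posts`: list of dicts {"id": str, "risk": float, "category": str}.
    Raising `threshold` or adding client-specific `ignore_categories` is one
    way to tune down a default model that flags too much; posts that remain
    above the bar are routed to a human reviewer, never auto-rejected.
    """
    needs_review, passed = [], []
    for post in scored_posts:
        if post["category"] in ignore_categories or post["risk"] < threshold:
            passed.append(post)
        else:
            needs_review.append(post)
    return needs_review, passed
```

A client who considers, say, political speech acceptable would add that category to `ignore_categories` rather than retrain anything.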

Karwowski, of Obviously, said: “We always have a human team review any work done via AI.” Moffitt, of Open Influence, said that having experienced team members review any AI-generated insights allows the agency to provide real-time feedback if something is off.

There’s an uncomfortable parallel between AI-assisted influencer selection and brand marketers’ overuse of programmatic safety filters, which has been blamed for effectively defunding media organizations.

As influencer marketing becomes more “programmatic,” in both concept and execution, automating creator vetting could risk a repeat of that harm: defunding creators whose activities are deemed outside the acceptable range, without their knowledge. Jeffries suggested that human involvement in AI-powered selection processes will always be required.

“Influencer marketing is not just a media buy, but also a creative one,” he concluded.

https://digiday.com/?p=565343
