Yesterday, social advertising giant Meta announced new updates to its AI-enabled Advantage+ advertising platform as well as continued investment in its underlying infrastructure.
Billed as “the next chapter of business performance for advertisers and agencies” ahead of Advertising Week in New York next week, the updates were AI-focused and extended from creative to customer service.
The customer service element – a customizable AI chatbot that can be hosted on Meta and/or the merchant’s website – is notable in that it extends the data the company can use to inform its AIs, which can ultimately be used to target ads.
Picking up the valuable breadcrumbs of customer service makes sense given the rapidly compressing purchase funnel Meta says it sees: give customers what they want, faster.
According to Meta, key takeaways of the new updates are as follows:
- “Launching Business AI, a turnkey sales concierge: Guide your customers from discovery to purchase with a personalized AI agent on Meta ads, messaging threads, and now on websites.”
- “New GenAI Tools For Video Optimization and Conversion: Inspire shopper confidence with more engaging content and new AI-guided experiences.”
- “Making it Easier To Partner with Creators on Facebook and Instagram: Expanded access to our creator discovery APIs simplifies the process for businesses and agencies to find creator partners.”
- “Introducing Meta AI business assistant: Optimize campaign performance and resolve account issues with a new AI chat experience in Ads Manager and Business Support Home.”
A Meta blog post has all the details. (October 2)
Q&A
Helen Ma, Meta VP of Product Management, and Clara Shih, VP of Business AI, participated in a Q&A with the media after presenting the updates.
They were also joined by a Meta Advantage+ shopping campaigns (ASC) customer: Alex Stark, CMO of beauty brand Ogee.
Questions included:
- Meta’s view on the creator ecosystem.
- Meta’s agentic solution in the marketplace.
- How Gen AI ad stack strategy has evolved.
- Is the AI Sales Concierge the “one stop shop” Meta has discussed previously?
- On Business AI & agents: Marketer use cases.
- AI dubbing and languages for video creative.
- Business AI for SMBs vs. enterprises.
- Improving the virtual “try on” experience.
(Answers are lightly edited for clarity)
What does the future of the creator ecosystem look like on Meta and how should businesses think about plugging in?
Helen Ma, Meta: The future of the creator ecosystem is very bright. We have not one but two scaled apps, Facebook and Instagram, with many diverse creators of all sizes and interests, and it’s continuing to grow.
Creators are at the center of culture and content. And, we’ve been trying to do a lot to support their growth on the platform.
When I came back from Cannes [Lions ad festival this past June], one of my big takeaways was that creators are not just a bolt-on for brands anymore. They’re really a critical lever for marketing – for reaching those customers where the creator’s creative and content really resonates.
From a supporting creators and creator growth standpoint, I’ll go back to some of the announcements that we made today.
We’re really figuring out how we can continue to help creators monetize and make a sustainable living.
We’re starting to test affiliate programs on Facebook. We’re really excited about that, especially in supporting smaller creators as they’re up and coming and finding the right brands to partner with. And we’re also really excited about bringing more value to creators on Instagram with the testing of links in Reels.
And then on creator marketing, partnership ads have been an incredible part of helping brands unleash creator marketing at scale. So we’re going to continue to invest in making that even more powerful, bringing more features and capabilities to market, including some of the ones that I mentioned today.
There are a lot of agentic solutions out on the market today. What’s unique about Meta’s point-of-view and offering?
Clara Shih, Meta: It goes back to some of the items I was sharing earlier. I’d say there are three things that really make Business AI unique.
- The first one is the ability to optimize for sales conversion across the funnel. Ads are part of the picture and the website is the other part. So being able to start with a Business AI sales concierge in your Instagram Reel ad that carries through all the way to consideration and conversion – that’s a unique approach that has been testing well, and we’re very bullish on it.
- The second is the ease of setup and ongoing management. A lot of the business customers we’ve talked to have tried other solutions, and it’s just very hard and expensive to set up and get right. So it was a design challenge for our team: “How do we make this as turnkey as possible? How do we make sure it’s self-adaptive as the business changes?” – as we see those conversations, ad campaigns, and product catalogs evolve for each business, as they inevitably do.
- The last area is around accessibility. And again, we’re making this free when you embed Business AI in a Meta ad, so there’s no additional cost.
And then for the web, we haven’t announced specific pricing yet, but our strategy is to come in at a fraction of the cost so that this can be affordable and accessible to every business globally, regardless of size.
Meta has been testing generative AI solutions for a while now. What’s changed in terms of your approach and priorities as you’ve gotten market feedback on what’s working and what’s not?
Clara Shih, Meta: I will first talk about the vision that we laid out for generative AI last year.
At Advertising Week last year, I talked about the fact that our vision is really to figure out how we can apply Gen AI across the entire ad stack. We really think that there is opportunity to unlock across multiple parts here.
We started with the creative side – anybody who’s used Gen AI knows that content creation and creative is a big part of where it’s really interesting and easy, and where anyone can use it.
And the other reason we started there is that Meta is a very unique and interesting platform in that we’re not just a feed – we’re not just Reels or Stories. We’re all of these things and then some, which creates this opportunity to meet people in lots of different experiences.
But then also, it makes creative more complicated to get right on our platforms.
And so creative felt like a really good starting point for us because of the success of our initial tools like image animation and video expansion. We just saw that there was a lot more opportunity for us to lean in even more on the creative side. And I think we’ve done that.
Formats are the other piece, where we thought about the fact that AI can really help us reimagine interactivity.
Why should the way customers interact with brands be just a tap? It can be a lot more than that. It could bring products to life in a way they never have before. It could help you do the things you’ve always wanted to do, but in a much more efficient and automated way.
So that’s the other piece that we’ve been leaning into – whether it’s the AI CTA stickers or the virtual try-on. I think you’ll see a lot more from us on reimagining some of those customer interactions with brands on our platforms.
Ranking obviously has been the bedrock of where performance has come from, and we’re already doing a lot here to apply Gen AI. The newest foray is on the business tools side – whether it’s the Business AIs or the Meta Assistant that I talked about today.
I think the journey has been that we are unlocking performance in every part of the ad stack. As we test and learn to see what’s working, we build on top.
On the creative side, with the popularity of image animation, we said, “Hey, let’s move to multi-scene video generation.” That’s the next step. With text translations, we were offering that just for the ad copy, and we said, “Let’s move to AI dubbing.” So a lot of this is building on what’s working so that we can amplify it and make it even easier than ever for advertisers and brands.
[Bloomberg has] reported that Meta is testing tools that will let advertisers create marketing material, prompts, and messaging using generative AI. How is that work evolving? Is this turnkey AI sales concierge the “one-stop shop” we’ve been told is coming, or is this more about improving personalization?
Clara Shih, Meta: So there’s a lot that we’re investing in when it comes to AI tools for advertisers and for other teams within businesses that aren’t just focused on advertising.
And so what Helen walked us through earlier today were the ad-specific AI tools. Business AI, you can think of as spanning across those – being the sales concierge that connects with ads but also extends into messaging, organic conversations, and the merchant’s website.
Right now, we have a lot of these investments. And I think the question behind the question is, will these different AI tools come together?
I also lead the Gen AI platform for monetization, and from an infrastructure and architecture standpoint, all of these use cases run on the same orchestration, model evaluation, and retrieval stack. We’ve architected it so that we have scalability and modularity there.
In terms of the business user front end, right now they’re separate because we want to go fast. But there are all kinds of ways that we’ll start to see these different tools come together over time.
A great example of that is… I’ve been partnering with Helen’s team on how we allow the Business AI in the voice Reel to invoke virtual try-on. That’s an example of a creator AI feature that you could start to engage with on the Business AI side as a consumer.
And so we don’t have that yet, but that’s the type of conversations that we’re having. So, please stay tuned.
For Alex at Ogee – looking ahead, how are you thinking about Business AI agents changing your approach to the holiday season, especially things like seasonal promotions? How do you think about agents as part of your toolkit?
Alex Stark, CMO, Ogee: Personalization is key. I saw some stat recently about how many ads people see in a day – I won’t say the number because I’ll get it wrong, but it’s an insane amount – and especially during the holidays, we have to stand out and cut through the noise.
I think for the holidays especially, it’s going to be an incredible tool for us. People can come to our site and ask, “What should I get my family member for this holiday?” Or they can ask that on the ad side as well. And really helping guide people, at an overwhelming time, toward what to purchase is going to be huge for us.
Around the AI dubbing features, can you share more about what languages are available?
Helen Ma, Meta: Right now it is available for Spanish, but we are definitely looking at additional languages to expand to. On the text translation part of it, it is available in the top 10 most spoken languages, if I remember correctly. And that’s also being expanded.
In your overview of Business AI, you mentioned a lot about this being for SMBs. Is it also for enterprises as well?
Clara Shih, Meta: Yes, it’s available to all eligible US businesses. Businesses can go into Meta Business Suite and they’ll see the button as they become eligible, and we’re building to more requirements every month and every week. So we do see some enterprise customers using it.
This is early days, so there’s a laundry list of feature requests that we’ve gotten from businesses of different sizes, especially the bigger ones. We’re in the road-mapping process for next year, so I would expect us to be able to serve a greater number of enterprise customers over time.
Can you explain how try-on-me is different from the virtual try-on experience announced earlier this year?
Clara Shih, Meta: So when we first started, we thought about what is holding people back, or making it hard for them to have the confidence to purchase. One of the big things we saw was twofold.
It was prohibitively expensive for a lot of retailers to shoot their products on a model – you can imagine how many different SKUs are out there. So a lot of products are actually shot just on a plain background. The idea was, if we then put the clothes on a model, that would give people more confidence.
The other piece was – I don’t know about other people – that when I see ads featuring a model who looks pretty different from me, that was the other thought process.
Maybe you see it on this gorgeous 5’9″ white lady, and you say, “Well, I don’t look like that, so I actually don’t know if that piece of clothing will look that way on me.” And so the original starting point was, “Let’s give people options around virtual AI models of all skin tones, shapes, and sizes. You can choose one that looks more like you or the person you’re purchasing for.”
And then you can, sort of, try it on. I think the next evolution of that is what we’re testing today: try-on-me. The point is, can we go further than that and really help you imagine it on you?
That’s the experience we’re going to test today, which is… if you give us permission and upload a photo that you’re confident in, we can put the clothing directly on that photo and give you a much better sense of what it will look like if you purchased it and were able to wear it right away.

