
Hi and welcome back to another edition of The Keyword. This week we're diving into research from the Meta AI team that may become an exciting future product update. But for now, it leaves some clues on how to run better ads!
— Kole
P.S. This week my startup, Wilow, became a Meta-approved Tech Partner. The software is in closed beta for now… but if you'd like to be first in line for the next group of users, you can join the waitlist!
Feature

Meta's AI research lab just published a model, TRIBE V2, that predicts how the human brain reacts to video, audio, and text.
It's a neuroscience paper based on fMRI scans. But the findings might help with your next creative brief:
Combine video, sound, and text in each ad, because the brain's response is 50% stronger when all three are present than when only one is.
Include text overlays because the brain relies on reading to process the logical reasons to complete a purchase.
Place your most exciting visual hook as early as possible because there is a ~5-second delay between a person seeing something and their brain reaching its peak response.
Your video, audio, and text should play different roles, each providing complementary but distinct information. In other words, a talking head with matching captions doesn't stimulate the brain to the maximum.
All of this might be intuitive. But now you have some science to back up your creative requests!
This is an early model. The key developments to watch are whether Meta pairs TRIBE V2 with GEM, the model that decides which ads get shown to which users, or lets advertisers get feedback from TRIBE before launching creative.
Sponsored by Granola
AI Meeting Notes (without the annoying meeting bots)
↳ Granola transcribes your computer's audio directly, without meeting bots joining your call.
More Headlines
🛒 Meta partners with Stripe to enable checkout on Facebook
Advertisers will soon have the option to sell directly on Facebook without redirecting shoppers to their website. Stripe powers the payments in the background using its Agentic Commerce Protocol, while Meta integrates the checkout.
💡 Meta Updates Instagram with Adaptive Ranking Model
The new system uses advanced processing to track more engagement signals for each user in real time. The change aims to make the ads users see better match what they are interested in.
💰 ChatGPT Advertisers Have Spent $100M in Six Weeks
Following the initial launch, OpenAI has lowered the minimum spend to $50,000 (from $200,000) and has brought on a creative partner, Smartly, to focus on how the ads appear to users. Advertisers can register interest for ChatGPT ads here.
📦 Reddit Debuts Collection Ads x Shopify Integration
Collection Ads show a lifestyle image with shoppable product tiles pulled from an advertiser's catalog into a full-page Reddit ad unit.
💻 Shopify Opens Its AI Shopping Network to Non-Shopify Brands
Brands outside Shopify can now list products in Shopify’s catalog and sell through AI platforms like ChatGPT, Microsoft Copilot, Google AI Mode, and Gemini. Existing Shopify merchants can also use Agentic Storefronts to manage all AI sales from one dashboard without extra fees or integrations.
📲 Spotify Tests Swipeable Carousel Ads
The new carousel ads let brands show up to six swipeable cards, each with its own image, link, and pricing. Spotify is testing the format in the U.S. and UK, alongside updates to Sponsored Playlists and Ads Manager tools that allow split testing and automated bidding.
🧠 Amazon Pitches Ad Plans for Rufus AI Assistant
A leaked pitch deck shows Amazon plans to move Rufus ads out of open beta, introducing cost-per-click payments for advertisers. The deck highlights 250 million active users and claims Rufus interactions lead to 60% higher purchase completion.
Recommendations
✦ Granola: The best AI Meeting Notetaker
Double the length of your free trial with our link
✦ Wynter: A side-hustle for B2B marketers
Get paid up to $50 for completing 2-10 min feedback surveys
Note: We may receive sponsorship payment or affiliate commission for links above
You’re caught up
🎊



