In our experience managing mobile ad spend across Meta's platform, we've seen firsthand how misunderstanding the ad auction leads to wasted budgets and broken scaling strategies. Most advertisers think the auction is just about who bids the highest. It's not.
The reality is far more nuanced, and understanding the mechanics gives you a structural advantage over competitors who are just throwing money at the problem.
Page Contents
- How does Meta's ad auction actually work?
- What is Meta's Estimated Action Rate and how is it calculated?
- What is Meta's learning phase and why does it matter?
- Why does creative quality affect ad delivery on Meta?
- How does Meta's algorithm learn which users to target?
- Does putting all creatives in one ad set hurt Meta auction performance?
- What role does emotional resonance play in winning Meta's ad auction?
- Can AI-generated creatives improve your Meta auction performance?
- Related Reading
How does Meta's ad auction actually work?
Meta's ad auction determines which ad to show each user by calculating a Total Value score for every competing ad. The formula is: Total Value = Bid × Estimated Action Rate × Ad Quality. The highest Total Value wins the impression, not the highest bid.
This means an advertiser bidding $5 with a 3% estimated conversion rate and high ad quality can beat an advertiser bidding $15 with a 0.8% conversion rate and mediocre quality. Meta designed it this way because showing relevant ads keeps users on the platform longer, which is worth more to Meta than extracting maximum CPMs from a single auction. That said, choosing the wrong bidding strategy still wastes budget on low-value installs even when you understand the auction mechanics.
- Bid: The maximum amount you're willing to pay for your target action (install, purchase, etc.)
- Estimated Action Rate (EAR): Meta's prediction of how likely this specific user is to take your desired action
- Ad Quality: A composite score based on engagement signals, post-click behavior, feedback (hide/report rates), and content quality
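The arithmetic behind the $5-beats-$15 example above is easy to sketch. The quality multipliers below are hypothetical (Meta does not publish the scale of its quality score); they are normalized so 1.0 is average, purely to illustrate how the multiplication plays out:

```python
# Worked example of the Total Value auction (illustrative numbers only).
# Total Value = Bid x Estimated Action Rate x Ad Quality.

def total_value(bid: float, ear: float, quality: float) -> float:
    """Score one ad in the auction; the highest score wins the impression."""
    return bid * ear * quality

# Advertiser A: lower bid, strong creative (high hypothetical quality multiplier)
a = total_value(bid=5.0, ear=0.03, quality=1.2)
# Advertiser B: triple the bid, weak creative (mediocre quality multiplier)
b = total_value(bid=15.0, ear=0.008, quality=0.9)

print(f"A: {a:.3f}  B: {b:.3f}  winner: {'A' if a > b else 'B'}")
# A: 0.180  B: 0.108 -> the lower bidder wins the impression
```

Note that advertiser A wins even before the quality multiplier is applied (5 × 0.03 = 0.15 vs 15 × 0.008 = 0.12); quality widens the gap further.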
Why this matters for your CPI
Because Total Value is multiplicative, improving any single component has an outsized impact. In our experience, improving creative quality (which lifts both EAR and the quality score) can meaningfully reduce CPI without changing bids at all. This is why creative strategy isn't just a 'nice to have.' It's a core auction mechanic.
What is Meta's Estimated Action Rate and how is it calculated?
The Estimated Action Rate (EAR) is Meta's prediction of the probability that a specific user will take your optimization event (install, purchase, etc.) after seeing your ad. It's calculated using a massive machine learning model trained on billions of data points across the entire Meta ecosystem.
Meta's model considers hundreds of signals: the user's past behavior (what ads they've clicked, what apps they've installed, what they've purchased), the time of day, their device, their recent browsing patterns, and critically, the creative being shown.
Two ads in the same campaign can have wildly different EARs for the same user because the model predicts action rate at the creative level, not just the campaign level.
The Bayesian optimization layer
Under the hood, Meta uses a form of Bayesian optimization to update these predictions in real time. Each impression outcome (did the user convert or not?) updates the model's posterior probability distribution. Early on, with limited data, the model has high uncertainty and explores broadly.
As conversions accumulate, the confidence intervals narrow and the model exploits what it has learned. This is fundamentally why you need conversion volume to scale, and why Meta's Conversions API signal quality is critical: on iOS, ATT opt-out rates mean the SDK alone can miss a substantial share of post-install events. The model literally cannot optimize without sufficient signal.
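Meta's actual model is proprietary, but the explore-then-exploit behavior described above can be illustrated with a minimal Beta-Bernoulli sketch: each impression outcome updates a posterior over the conversion rate, and the posterior's uncertainty shrinks as data accumulates.

```python
# Minimal Beta-Bernoulli sketch of Bayesian conversion-rate estimation.
# This is a toy illustration, not Meta's actual model: it shows only how
# posterior uncertainty narrows as impression outcomes accumulate.
from math import sqrt

class ConversionEstimate:
    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        self.alpha, self.beta = alpha, beta  # Beta(1, 1) = uniform prior

    def update(self, converted: bool) -> None:
        # Each observed outcome updates the posterior counts
        if converted:
            self.alpha += 1
        else:
            self.beta += 1

    @property
    def mean(self) -> float:
        return self.alpha / (self.alpha + self.beta)

    @property
    def std(self) -> float:
        # Posterior standard deviation: high early on, narrowing with data
        a, b = self.alpha, self.beta
        return sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))

est = ConversionEstimate()
for outcome in [True, False, False, False, True] * 20:  # toy outcome stream
    est.update(outcome)
print(f"mean={est.mean:.3f}, std={est.std:.3f}")  # uncertainty has narrowed
```

With few observations the standard deviation is large (the model explores); after 100 outcomes it has collapsed toward the observed rate (the model exploits).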
What is Meta's learning phase and why does it matter?
The learning phase is the period when Meta's algorithm is actively exploring which users, placements, and creative combinations drive conversions for your ad set. Meta officially states that an ad set needs approximately 50 conversion events per week to exit the learning phase, and in practice we've found that consistently hitting or exceeding that threshold tends to produce more reliable results.
During the learning phase, performance is volatile. CPIs can swing significantly day over day because the algorithm is still exploring. This is normal and expected.
The worst thing you can do is panic and make significant changes (budget shifts over 20%, new creatives, audience changes) because each significant edit resets the learning phase.
This is a key consideration when thinking about campaign structure for Meta app ads, since consolidating ad sets helps you hit that 50-event threshold faster and exit learning phase sooner.
- 50+ optimization events per week per ad set is the target to exit learning
- Budget changes over 20% in a single day can reset learning
- Adding new creatives or changing optimization events triggers a partial or full reset
- Ad sets stuck in 'Learning Limited' are a signal to consolidate or increase budget
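The rules in the list above amount to a simple decision procedure. A hypothetical helper (the function name and return labels are ours, not Meta's) makes the logic explicit:

```python
# Hypothetical helper encoding the learning-phase rules listed above.
# Thresholds (50 events/week, 20% budget change) come from the article text.

def learning_phase_status(weekly_events: int,
                          budget_change_pct: float,
                          edited_creatives_or_event: bool) -> str:
    if edited_creatives_or_event:
        return "reset"            # new creatives or optimization event change
    if abs(budget_change_pct) > 20:
        return "reset"            # significant budget edits restart learning
    if weekly_events >= 50:
        return "exited"           # enough signal to leave the learning phase
    return "learning_limited"     # consolidate ad sets or increase budget

print(learning_phase_status(weekly_events=60, budget_change_pct=10,
                            edited_creatives_or_event=False))  # exited
```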
Why does creative quality affect ad delivery on Meta?
Creative quality directly affects two of the three components in Meta's Total Value equation: Estimated Action Rate and Ad Quality score. A better creative doesn't just get more clicks; it wins more auctions at lower prices because it raises your Total Value without requiring higher bids. In our experience, accounts that test new creative concepts monthly tend to outperform those running just 3-5 creatives.
The ad quality feedback loop
Meta tracks post-impression signals like engagement rate, video watch time, click-through rate, hide/report rates, and post-click conversion rates. Ads that users engage with positively get higher quality scores, which means they win more auctions, which means they get more data, which means the algorithm learns faster.
Poor creatives enter a death spiral: low engagement leads to lower quality scores, fewer impressions, less data, and eventually the ad set flatlines. When this happens and your Meta CPI starts spiking unexpectedly, creative fatigue is a common culprit, with CTR declining meaningfully in a matter of days as audiences become oversaturated.
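Catching that fatigue early is mostly a matter of comparing recent CTR to the creative's own baseline. A rough check might look like this (the 30% drop threshold and three-day windows are illustrative, not a Meta benchmark):

```python
# Hypothetical fatigue check: flag a creative when its recent CTR falls
# well below its early baseline. Thresholds here are illustrative.

def is_fatigued(daily_ctr: list[float], drop_threshold: float = 0.30) -> bool:
    if len(daily_ctr) < 6:
        return False  # not enough history to judge
    baseline = sum(daily_ctr[:3]) / 3   # average of the first three days
    recent = sum(daily_ctr[-3:]) / 3    # average of the last three days
    return recent < baseline * (1 - drop_threshold)

# CTR sliding from ~2.0% to ~1.3% over a week trips the flag:
print(is_fatigued([0.021, 0.020, 0.019, 0.016, 0.013, 0.011]))  # True
```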
Psychology-driven creative outperforms
This is why the creative angle matters so much. As Bastian Bergmann of Solsten discussed on the Mobile User Acquisition Show, psychology-based creative changes can dramatically outperform algorithmic optimization alone, which is precisely why the number of creatives per ad set matters: each angle needs enough room to find its audience without creating audience confusion.
For Solitaire Klondike, shifting copy from 'train your brain' to 'hardest solitaire game' based on player psychological profiling improved IPM from 0.97 to 2.4. That kind of lift in engagement feeds directly into both the EAR and quality score components of the auction.
How does Meta's algorithm learn which users to target?
Meta's algorithm uses a form of explore-exploit optimization. It starts by showing your ad to a broad sample of users within your targeting parameters, observes who converts, and then progressively narrows delivery toward similar users. This is why conversion events are the fuel that powers Meta's targeting engine.
The algorithm builds what’s essentially a multi-dimensional user embedding. Users who convert get mapped in feature space, and the model looks for other users nearby in that space. This is the same principle behind Meta’s lookalike audience functionality, where seed audiences are used to find similar high-value users. Early conversions have an outsized influence on who the algorithm targets next.
This is one reason why broad targeting often outperforms interest targeting on Meta for apps with sufficient budget, because it gives the algorithm maximum room to find the right users rather than constraining it with human assumptions about who will convert.
- The algorithm explores broadly during the first 50 conversions, then narrows
- Each conversion teaches the model what a 'good user' looks like for your app
- Creative selection influences which users respond, which shapes the entire targeting loop
- Garbage data (low-quality conversions) teaches the algorithm to find more low-quality users
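The "nearby in feature space" idea behind that loop is just similarity search. A toy sketch (the feature vectors and user names are made up; real embeddings have hundreds of dimensions) shows how past converters define a centroid that ranks candidate users:

```python
# Toy sketch of embedding-based targeting: users who converted define a
# centroid, and candidates closest to it get prioritized next.
# Vectors and user names here are invented for illustration.
from math import sqrt

def cosine(u: list[float], v: list[float]) -> float:
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

converters = [[0.9, 0.1, 0.8], [0.8, 0.2, 0.9]]          # users who installed
centroid = [sum(col) / len(col) for col in zip(*converters)]

candidates = {"user_a": [0.85, 0.15, 0.85], "user_b": [0.1, 0.9, 0.2]}
ranked = sorted(candidates, reverse=True,
                key=lambda u: cosine(centroid, candidates[u]))
print(ranked)  # user_a resembles past converters, so it ranks first
```

This also makes the last bullet concrete: if low-quality conversions feed the centroid, the ranking pulls in more users who look like them.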
Does putting all creatives in one ad set hurt Meta auction performance?
Yes. This practice, known as 'asset stuffing,' is one of the most common mistakes we see. When you dump all your creatives into a single ad set without thematic separation, you prevent the algorithm from cleanly identifying which audience segments respond to which creative angles.
As covered in a deep dive on the perils of asset stuffing, the core problem is that different creatives appeal to fundamentally different user segments, and stuffing them together prevents the algorithm from accumulating the ~50 focused conversion events needed to exit the learning phase cleanly. A gameplay-focused ad attracts a different user than an emotional story-driven ad.
When they're all in one ad set, the algorithm tries to find a single audience that responds to everything, which is usually nobody. The solution is to separate creatives thematically into distinct ad sets so the algorithm can match the right creative angle to the right audience segment.
What role does emotional resonance play in winning Meta's ad auction?
Emotional resonance is one of the most underleveraged levers for auction performance. Ads that trigger genuine emotional responses (curiosity, surprise, tension, even frustration) generate higher engagement rates, longer watch times, and better click-through rates. All of these signals feed directly into Meta's quality and EAR components.
The team at Tactile Games (Lily's Garden) discovered this when they analyzed the competitive landscape and found that the vast majority of competing ads clustered around the same narrow emotional registers. By deliberately exploring sadness, anger, and anxiety in their creatives, they stood out in the auction and drove stronger engagement.
As Gonzalo Fasanella shared on the Mobile User Acquisition Show, they focused on emotional resonance precisely because users scroll away from ads within 30 seconds. You need to hook them emotionally, not just visually.
Can AI-generated creatives improve your Meta auction performance?
AI-generated creatives can improve auction performance, but only when used strategically within frameworks like Dynamic Creative Optimization for mobile apps, where Meta Advantage+ Creative supports up to 10 images or videos and automatically tests combinations. The volume advantage is real: more creative variants means more chances to find winning hooks and angles. However, there are critical pitfalls that can actually hurt performance.
The three biggest risks are: producing low-quality output because the inputs (briefs, audience understanding) were poor; hitting a local maximum by only iterating on past winners instead of exploring new angles; and underestimating hidden testing costs, since more creative volume requires proportionally larger test budgets to reach statistical significance.
At RocketShip HQ, we use AI to accelerate production of creative variants within proven thematic frameworks, not as a replacement for strategic creative thinking.
- Garbage in, garbage out: AI creatives without audience and format consideration produce noise
- Local maxima risk: only iterating on winners leads to diminishing returns
- Hidden testing costs: more creatives means proportionally more test budget is needed for valid results
- Best use: AI for variant production within human-defined strategic frameworks
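The "hidden testing costs" bullet is worth quantifying. The standard two-proportion sample-size formula shows why small CTR differences between creative variants require large test budgets (the CTR values below are illustrative):

```python
# Rough per-variant sample size for detecting a CTR difference between
# two creatives, using the standard two-proportion formula at ~95%
# confidence and ~80% power. Input CTRs are illustrative.
from math import ceil

def sample_size_per_variant(p1: float, p2: float,
                            z_alpha: float = 1.96,
                            z_beta: float = 0.84) -> int:
    """Impressions needed per creative to detect the difference p1 vs p2."""
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar)
    return ceil(numerator / (p1 - p2) ** 2)

# Separating a 1.0% CTR creative from a 1.2% one takes tens of thousands
# of impressions per variant; each extra variant multiplies the bill.
print(sample_size_per_variant(0.010, 0.012))
```

This is why doubling creative volume without doubling test budget just produces more inconclusive tests, not more winners.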
Meta's ad auction rewards relevance, not just budget. The Total Value equation means that creative quality, emotional resonance, and proper campaign structure are structural advantages that compound over time.
If you want to win more auctions at lower costs, invest in understanding the mechanics and let the algorithm do what it does best with the right inputs. At RocketShip HQ, this framework drives every dollar we deploy for our clients.
Looking to scale your mobile app growth with performance creative that delivers results? Talk to RocketShip HQ to learn how our frameworks can work for your app.
Not ready yet? Get strategies and tips from the leading edge of mobile growth in a generative AI world: subscribe to our newsletter.
Related Reading
- Meta Ads for mobile apps: the complete playbook (comprehensive guide covering why Meta commands over 25% of total mobile ad spend globally)
- Does Broad Targeting Outperform Interest Targeting on Meta?
- Running Meta Ads for Mobile Apps
- What Campaign Structure Should You Use for Meta App Ads?
- Optimize Meta Campaigns for ROAS