
What Is a View-Through Conversion? A 2026 Attribution Guide for Marketers

View-through conversions look enormous in your dashboard — but how many are real? Learn the exact windows Meta, TikTok, and Google use, post-iOS 14 SKAdNetwork caveats, and when to replace VTC with MMM and incrementality testing.

[Figure: view-through vs. click-through attribution timeline, showing certainty gradients and measurement windows]

View-through conversions show up in your Meta Ads Manager dashboard looking enormous. Hundreds of them. Maybe thousands. And almost none of them came from someone clicking your ad.

That's the view-through conversion mechanic: a user sees your ad, doesn't click, then converts within a defined window — and the platform credits the impression. The question isn't whether this happens. It does. The question is whether that credit means anything, how much to discount it, and when to stop using it as a KPI altogether.

TL;DR: A view-through conversion (VTC) is credited when a user sees an ad without clicking, then converts within a platform-defined window (typically 1–7 days). VTCs are directionally useful for upper-funnel creative and brand campaigns — but dangerous as standalone KPIs because they're largely unverifiable, platform-reported, and inflated by iOS 14+ attribution gaps. Pair them with Marketing Mix Modeling (MMM) and geo-holdout incrementality tests before citing them to leadership.

Step 0: What you're actually measuring before you optimize

Before interpreting any view-through conversion number, find the ad. Pull the creative that's accumulating those VTCs and look at what it's actually doing — who ran it, for how long, at what frequency, and against what audience. An impression recorded by a 0.5-second scroll-past is not the same as a 15-second video completion, even though both can generate a view-through conversion.

On adlibrary, you can pull the full creative timeline for any active brand — see how long a specific ad has been running, which markets it's targeting, and whether competitors running similar upper-funnel formats are leaning on VTC as their primary signal or using it as one indicator among several. That context matters before you decide whether your VTC number is a legitimate signal or a platform-assigned credit that doesn't survive scrutiny.

What is a view-through conversion, precisely

A view-through conversion records a sale, lead, or other defined action attributed to an ad impression — with no click in the path. The user saw the ad. Some time later, they converted. The platform connects the two events and adds one to your VTC count.

This differs from a click-through conversion in a critical way: there is no causal action in between. A click establishes intent. A view establishes exposure. These are not equivalent signals.

The third variant worth understanding is the engaged-view conversion, which Meta introduced as a middle ground: a user watched at least 10 seconds of a video (or to 97% completion on shorter clips) without clicking, then converted. This sits between pure VTC and click-through — it requires demonstrated attention rather than passive delivery.

Multi-touch attribution models more broadly (last-click, data-driven, position-based) are useful background but not the focus here. Consistent UTM parameters setup matters more directly: if UTM tracking isn't uniform across your upper-funnel placements, you lose the ability to cross-reference VTC against Google Analytics behavior data, a check that often reveals how many view-through-attributed users actually came back through organic search before purchasing.

View-through windows: Meta, TikTok, and Google defaults

Each platform sets its own default window. These windows are not neutral choices — they determine how many conversions get attributed, and a longer window means more credit flows back to the platform.

Meta Ads Manager defaults

Meta's default attribution window is 7-day click, 1-day view. This means:

  • A click gets 7 days of attribution credit
  • A view gets 1 day

You can adjust this in Ads Manager under Campaign Settings → Attribution Setting. Options include 7-day click only, 1-day click + 1-day view, 7-day click + 1-day view (default), and 7-day click + 7-day view.

A 7-day view option exists, but most accounts run on the 1-day view default. If someone changed that setting and didn't document it, your VTC numbers can be dramatically different from what you think you're measuring.
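The window mechanics above can be sketched as a small function. This is an illustrative simplification with assumed touch types and window lengths only; real platforms layer deduplication, cross-device matching, and modeled credit on top of the date math:

```python
from datetime import datetime, timedelta

def attribute(touch_type, touch_time, conversion_time,
              click_window_days=7, view_window_days=1):
    """Return True if a conversion falls inside the attribution
    window for the given touch type ('click' or 'view').

    Illustrative only -- real platforms apply extra rules on top."""
    days = click_window_days if touch_type == "click" else view_window_days
    elapsed = conversion_time - touch_time
    return timedelta(0) <= elapsed <= timedelta(days=days)

seen = datetime(2026, 1, 10, 9, 0)    # impression, no click
bought = datetime(2026, 1, 12, 9, 0)  # converted 2 days later

# Under Meta's default (7-day click / 1-day view), this view gets no credit:
print(attribute("view", seen, bought))                       # False
# Widen to a 7-day view window and the same conversion is credited:
print(attribute("view", seen, bought, view_window_days=7))   # True
```

The same function shows why an undocumented window change matters: identical user behavior flips from uncredited to credited with one parameter.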

TikTok Ads Manager defaults

TikTok defaults to 7-day click, no view-through for most objectives — but enables view-through attribution for Video Views and Reach objectives. When active, the default view window is 1 day. You can find current attribution settings in TikTok Ads Manager under Campaign → Attribution.

TikTok's engaged-view model operates separately: a 6-second video view is the threshold for their version of engaged-view attribution on certain campaign types.

Google Ads defaults

Google's view-through conversions apply specifically to Display Network and YouTube campaigns. The default window is 30 days, significantly longer than Meta or TikTok. A user who saw your display banner in early January and bought in late January gets credited.

Google also distinguishes between view-through and engaged-view: for YouTube, a user who watched 30 seconds of a skippable ad (or the whole ad if shorter) and converted is counted differently from someone who just saw the pre-roll for 5 seconds before skipping.

The 30-day window on Display makes Google's VTC numbers the most inflated by default. An account running heavy Display retargeting can accumulate thousands of view-through conversions per month that overlap almost entirely with organic search converters.

Post-iOS 14: SKAdNetwork and AdAttributionKit caveats

The view-through conversion problem got structurally worse after iOS 14. Apple's SKAdNetwork (and its successor, AdAttributionKit, introduced with iOS 17.4) changed how mobile app attribution works — and the knock-on effects hit web attribution too.

What SKAdNetwork changed

SKAdNetwork is Apple's privacy-preserving attribution framework. It sends conversion values from the OS to ad networks without exposing user-level data. The framework has a built-in delay (24–48 hours minimum before postback, up to 35 days for fine-grained values) and a strict hierarchy of credit — one network gets the win.

View-through attribution under SKAdNetwork requires the user to see the ad for at least 2 seconds. The ad network must register a view impression before the conversion happens. This creates a meaningful bar — but it also means that view-through credits in SKAdNetwork are less inflated than non-SKAdNetwork VTCs, because Apple enforces the impression registration.

AdAttributionKit (iOS 17.4+)

AdAttributionKit replaced SKAdNetwork as the primary framework for iOS app attribution from iOS 17.4 onward. It extended the framework to web-to-app flows and introduced re-engagement attribution. The view-through window under AdAttributionKit is configurable by the ad network within Apple's limits — but the 2-second minimum view requirement carries over.

The practical effect: if you're running Meta app install campaigns on iOS, your VTC numbers in Meta Ads Manager are modeled estimates (from Meta's Conversions API and their probabilistic signals), while AdAttributionKit postbacks are the privacy-compliant signal. These two numbers rarely match. Sometimes they're off by 40–60%.

The Conversions API (CAPI) gap

Post-iOS 14, Meta pushed advertisers toward CAPI to recover lost signal. CAPI helps with click-through attribution recovery — sending server-side events that would otherwise be blocked by ITP or ATT. But CAPI doesn't eliminate the view-through attribution problem; it can actually increase VTC counts because server-side events get matched against Meta's modeled user graph, and that matching can credit views that wouldn't have been captured pixel-only.

The attribution error documented in the Meta Ads performance dip article shows this concretely: CAPI + modeled attribution can generate VTC inflation that looks like performance when it's actually double-counting.

When view-through conversion is signal vs noise

This is where most guides go vague. Here's a more direct framework.

VTC as signal (use it)

View-through conversions carry genuine signal when:

  1. The campaign goal is brand awareness or upper-funnel reach. If you're running video to cold traffic with no click objective, VTC is often the only attribution you'll get. Discounting it entirely makes the campaign look like it did nothing — which may not be true.

  2. The product has a long consideration cycle. A B2B SaaS tool or a $2,000 consumer purchase doesn't convert on first exposure. Someone who saw your ad, spent three weeks researching, and then bought — that view may have genuinely started the path. One-day windows probably miss this; seven-day windows probably catch it, with noise.

  3. You can cross-reference with MMM. If your Marketing Mix Model shows incremental lift in the same cohort where VTC is high, you have corroborating evidence. VTC alone isn't enough; VTC + MMM agreement is meaningful.

  4. Frequency is high. An audience that saw your ad 6+ times before converting is different from one that saw it once. High-frequency VTC in a small, well-defined audience is more credible than low-frequency VTC across a broad lookalike audience. See Meta ad benchmarks by industry for how frequency norms vary by vertical — what's "high frequency" in apparel differs significantly from financial services.

VTC as noise (discount it)

View-through conversions are effectively noise when:

  1. The campaign is lower-funnel retargeting. Someone in your retargeting pool was going to convert anyway. A view-through conversion credit on a retargeting ad is correlation, not causation.

  2. The window is 7+ days with no frequency control. A 7-day view window across a broad audience can capture people who saw your ad once, forgot about it, saw a competitor's ad, read three reviews, and then bought. That's not your VTC. That's their purchase journey.

  3. VTC count is larger than click-through count by 5x or more. At that ratio, you're almost certainly over-crediting. The platform is giving itself credit for conversions that would have happened regardless.

  4. The conversion event is not a purchase. VTC on lead gen forms, page views, or video completions is essentially meaningless — the user had to do nothing except exist in the attribution window.

The ROAS calculator is useful here: if you strip VTC from your conversion count and recalculate, how does your ROAS change? If it drops from 4x to 1.2x, you have a VTC inflation problem masquerading as performance.
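The stripped-VTC recalculation is a one-liner worth automating. The spend, order value, and conversion counts below are invented, chosen so the full-credit and click-only figures land on the 4x and 1.2x from the example above:

```python
def roas(revenue_per_conversion, spend, click_convs, view_convs,
         vtc_weight=1.0):
    """Blended ROAS with view-through conversions weighted by vtc_weight.

    vtc_weight=1.0 counts VTC at face value; 0.0 strips it entirely;
    0.1-0.3 matches the discounting discussed in this guide."""
    credited = click_convs + vtc_weight * view_convs
    return credited * revenue_per_conversion / spend

spend = 10_000
aov = 80                   # assumed average order value
clicks, views = 150, 350   # click-through vs view-through conversions

print(roas(aov, spend, clicks, views))                   # 4.0  (VTC at full value)
print(roas(aov, spend, clicks, views, vtc_weight=0.0))   # 1.2  (VTC stripped)
print(roas(aov, spend, clicks, views, vtc_weight=0.15))  # VTC at 15% of face value
```

If the spread between the first two numbers is this wide in your own account, the VTC bucket, not the ads, is carrying the reported performance.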

[Figure: upper-funnel vs. lower-funnel view-through conversion signal matrix, showing when the metric is reliable versus noisy]

The replacement stack: MMM, incrementality, and triangulation

The goal isn't to stop measuring upper-funnel impact. It's to stop using a single, platform-reported, non-auditable number as the primary evidence that your ad spend worked.

Marketing Mix Modeling (MMM)

MMM uses regression analysis on aggregated sales data across channels to estimate the contribution of each media input — without relying on user-level tracking. Because MMM is privacy-safe and works on aggregate data, it sidesteps the iOS 14 signal loss problem entirely.

Meta's open-source Robyn and Google Meridian are the two most-cited MMM frameworks in 2026. Robyn is an R-based model released by Meta's Marketing Science team; Google Meridian is Python-based and built for Bayesian MMM with prior calibration. Both can ingest your spend and revenue data and return channel-level attribution coefficients that don't rely on platform-reported conversions at all.

The limitation: MMM requires at least 12–18 months of clean historical data to produce reliable coefficients, and it operates at the channel level — not the creative or campaign level. You learn "Meta video drove X% of revenue" but not "that specific UGC ad drove it."
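To make the regression idea concrete, here is a deliberately naive sketch: ordinary least squares on synthetic weekly spend and revenue, with invented channel coefficients. Real frameworks like Robyn and Meridian add adstock transforms, saturation curves, seasonality, and Bayesian priors; this only shows the core "aggregate spend in, channel coefficients out" mechanic:

```python
import numpy as np

# Synthetic weekly data: two channels plus a baseline, all numbers invented.
rng = np.random.default_rng(0)
weeks = 52
meta_spend = rng.uniform(5_000, 15_000, weeks)
search_spend = rng.uniform(8_000, 20_000, weeks)
baseline = 40_000
# "True" (invented) revenue process the regression should recover:
revenue = (baseline + 2.1 * meta_spend + 3.4 * search_spend
           + rng.normal(0, 2_000, weeks))          # noise term

# OLS: revenue ~ intercept + meta + search
X = np.column_stack([np.ones(weeks), meta_spend, search_spend])
coef, *_ = np.linalg.lstsq(X, revenue, rcond=None)
intercept, meta_roi, search_roi = coef

print(f"baseline ~ {intercept:,.0f}, "
      f"Meta ~ {meta_roi:.2f}x, Search ~ {search_roi:.2f}x")
```

With a year of weekly data and modest noise, the fit recovers coefficients close to the planted 2.1x and 3.4x, which is exactly the channel-level (not creative-level) resolution the limitation above describes.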

Geo-holdout incrementality tests

A geo-holdout test splits your market into two geographic groups — one sees your ads, one doesn't. After a defined period, you compare conversion rates between the two groups. The difference is your incremental lift.

This is the cleanest measurement approach available without a randomized controlled trial. It removes all attribution model assumptions because you're comparing actual behavior, not platform-reported credits.

The practical constraint: geo tests require a minimum spend level to generate statistical significance (rough floor: $50k–100k over 4–6 weeks per test), and you need markets that are genuinely separable. Overlapping DMAs or national campaigns make clean geo splits difficult.
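The lift arithmetic itself is simple; the hard part is the clean geo split and the spend floor. A minimal sketch with invented conversion counts, using a standard two-proportion z-test for significance (stdlib only):

```python
from math import sqrt
from statistics import NormalDist

def geo_lift(test_conv, test_n, ctrl_conv, ctrl_n):
    """Incremental lift and two-sided z-test p-value for a geo holdout.

    test_* = markets that saw ads, ctrl_* = holdout markets.
    All inputs are aggregate counts -- no attribution model involved."""
    p_t, p_c = test_conv / test_n, ctrl_conv / ctrl_n
    pooled = (test_conv + ctrl_conv) / (test_n + ctrl_n)
    se = sqrt(pooled * (1 - pooled) * (1 / test_n + 1 / ctrl_n))
    z = (p_t - p_c) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    lift = (p_t - p_c) / p_c
    return lift, p_value

# Invented example: 100k users per side, 1,380 vs 1,200 conversions.
lift, p = geo_lift(test_conv=1_380, test_n=100_000,
                   ctrl_conv=1_200, ctrl_n=100_000)
print(f"incremental lift = {lift:.1%}, p = {p:.4f}")
```

The difference between the two groups is the incremental lift; no window settings or view credits enter the calculation at any point.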

Meta offers Conversion Lift studies as a managed option — they run the holdout experiment for you. The reported "incremental conversions" from a Meta Lift study is a more trustworthy number than view-through conversion count, because it's based on a control group comparison, not a platform attribution model.

Triangulation: the only safe interpretation

No single measurement tells the full story. Analysts following the media buyer workflow typically check three or four signals before flagging a campaign as working. The framework that holds up in 2026:

  1. Platform-reported conversions (click-through primary, VTC as supplementary upper-funnel indicator — heavily discounted at 0.1–0.3x weighting if mixing into a blended number)
  2. Marketing Efficiency Ratio (MER) — total revenue divided by total ad spend, no attribution model required. If MER improves when you scale a channel, that channel is probably working.
  3. MMM coefficients — channel-level incremental contribution, updated quarterly
  4. Geo lift tests — campaign-level incrementality, run 2–4x per year on highest-spend channels

When all four directionally agree, you have a defensible case. When VTC is high but MER is flat and MMM shows no incremental lift — the VTC is noise.
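The MER check in step 2 and the discounted blend in step 1 are both trivial to compute, which is part of their appeal. A sketch with invented monthly figures:

```python
def mer(total_revenue, total_spend):
    """Marketing Efficiency Ratio: total revenue / total ad spend.
    No attribution model required."""
    return total_revenue / total_spend

def blended_conversions(click_convs, view_convs, vtc_weight=0.2):
    """Platform conversions with VTC discounted into the 0.1-0.3x
    range recommended above before blending with click-through."""
    return click_convs + vtc_weight * view_convs

# Invented monthly numbers for illustration:
print(mer(480_000, 120_000))          # 4.0
print(blended_conversions(900, 2_500))  # 900 clicks + 20% of 2,500 views
```

Tracking MER alongside the discounted blend makes divergence visible early: a rising blended count against a flat MER is the "VTC is noise" pattern described above.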

The campaign benchmarking workflow maps this triangulation approach to the actual reporting cadence most media teams follow: weekly platform checks, monthly MER reviews, quarterly MMM updates.

Why view-through attribution keeps getting cited anyway

The honest answer: because platforms benefit from it.

A platform that credits view-through conversions reports a higher ROAS for your campaigns. Higher ROAS means you reinvest more budget. This isn't a conspiracy — it's incentive alignment. Every ad platform has an inherent interest in attribution models that credit more conversions to itself.

Triple Whale popularized the "Blended ROAS" and "True ROAS" framing precisely because DTC brands were discovering that Meta-reported ROAS and actual business performance were diverging. Their model discounts view-through conversions significantly (typically 10–20% of face value) and weights click-through much more heavily. The improve ROAS guide walks through how to build this blended model without relying on a specific third-party tool — including how to treat viewability data as a quality filter on impressions before they even enter your VTC count.

The optimization event you choose also determines how inflated your VTC count gets. Optimizing for Purchase generates fewer VTCs than optimizing for Add to Cart or Initiate Checkout — because the purchase event is rarer, so there's less opportunity for the platform to credit views against it.

Looking at patterns across thousands of in-market ads in adlibrary's ad intelligence database, brands running direct-response creatives with a clear CTA tend to show dramatically higher click-through rates and correspondingly lower VTC-to-CTC ratios. That makes their VTC numbers more credible, not because the metric is more accurate, but because fewer conversions are being silently absorbed into the view-attribution bucket. You can trace this pattern through ad timeline analysis: creatives that run for 14+ days without rotation tend to accumulate disproportionate view-through credits as frequency climbs and click-through intent drops.

How to set VTC windows deliberately in Meta Ads Manager

Most accounts run on Meta's default (7-day click, 1-day view) without ever reviewing it. Changing this setting mid-campaign resets your view-through conversion history, making trend comparison unreliable — so audit before launch. Here's how to set attribution windows intentionally:

  1. Campaign level: In Ads Manager, open an existing campaign → Edit → Attribution Setting. You'll see the current window.
  2. Account level default: Go to Business Settings → Data Sources → Pixels → your pixel → Attribution. This sets the fallback for all campaigns.
  3. Reporting column: Add "View-through conversions" as a column in your Ads Manager breakdown to see VTC separately from click-through. Never evaluate a campaign using the combined "Conversions" metric without knowing its composition.
  4. Compare windows: Use the Attribution Settings comparison tool in Ads Manager to see how your results change across different windows simultaneously. A campaign that looks profitable on 7-day view and barely breaks even on 1-day click deserves investigation.

For Advantage+ campaigns, Meta controls creative selection and delivery algorithmically — but attribution window settings still apply. If you're running Advantage+ Shopping Campaigns (ASC), check the attribution window before pulling performance numbers, because ASC's broad audience reach increases the pool of impressions eligible for view-through credit.

The Meta ads learning phase is also relevant here: during the learning phase, Meta's algorithm is actively optimizing delivery, which can temporarily inflate VTC as the system casts a wide net. Don't evaluate VTC numbers from campaigns that haven't exited the learning phase.

Frequently Asked Questions

What is a view-through conversion?

A view-through conversion is recorded when a user sees an ad (without clicking it) and then converts — makes a purchase, fills out a form, or completes another defined action — within a specified attribution window, typically 1 to 7 days. The platform attributes that conversion to the ad impression even though no click occurred.

How is view-through different from click-through attribution?

Click-through attribution requires the user to click the ad before converting. View-through attribution requires only that the ad was delivered and seen. Click-through establishes a causal action; view-through establishes exposure. Most measurement frameworks treat them differently, weighting click-through conversions at full value and discounting view-through by 70–90%.

Why are my view-through conversions so high after iOS 14?

Post-iOS 14 ATT (App Tracking Transparency), Meta and other platforms lost significant user-level signal on iOS. To compensate, they expanded their use of modeled attribution — estimating conversions based on aggregate signals and probabilistic matching rather than deterministic tracking. This modeling process tends to assign more view-through credits because the system can't verify that a click occurred, so it falls back on the last-known impression. Pairing CAPI with your pixel helps recover deterministic click-through signal but doesn't reduce modeled VTC.

Should I include view-through conversions in my ROAS calculation?

Only with explicit discounting. A commonly used approach: include VTCs at 10–20% face value in blended ROAS calculations. Never count VTC at 1:1 with click-through conversions unless you have geo-holdout evidence that those views genuinely drove incremental purchases. The ROAS calculator lets you model different VTC weighting scenarios before committing to a reporting framework.

What is the difference between view-through and engaged-view conversion?

An engaged-view conversion requires a minimum level of video engagement before the user converts — typically 10+ seconds watched (Meta) or 6 seconds (TikTok). Standard view-through attribution has no engagement threshold; delivery alone qualifies. Engaged-view is a more credible signal because it requires demonstrated attention, but it's still not equivalent to a click.


View-through conversion is a loan the ad platform takes against reality — directionally useful for upper-funnel creative decisions, dangerous as a standalone KPI. The only way to know whether those views actually moved revenue is to measure incrementally: geo holdouts, MMM, or at minimum a triangulated MER read.

Before citing VTCs to leadership, pull the metric apart: what window, what campaign type, what VTC-to-CTC ratio. The number that survives that audit is the one worth defending.
