
How to Build a ChatGPT Ads Reporting Dashboard from Scratch

A complete blueprint for building ChatGPT ads reporting when OpenAI only provides impressions and clicks. Covers the 4-layer measurement stack, UTM setup, GA4 configuration, CRM integration, Looker Studio dashboards, attribution models, and incrementality testing.

Sofia · 14 min read

What OpenAI actually reports (and what’s missing)

OpenAI’s ad reporting dashboard gives you two metrics: impressions and clicks. That is the entirety of the native data. There are no conversion events, no revenue attribution, no return on ad spend, and no breakdown by device, geography, or demographic.

Compare that to what Google offered in 2000 when AdWords first launched – basic click and impression counts with minimal segmentation. More than two decades later, Google Ads provides hundreds of dimensions and metrics. ChatGPT ads are at the beginning of that same journey, which means advertisers need to build the measurement infrastructure themselves.

Here is exactly what OpenAI does and does not report:

Available:

  • Impressions
  • Clicks

Missing:

  • Conversion tracking
  • Query-level / prompt-level data
  • Demographics and audience segments
  • Device breakdown
  • Placement reporting
  • ROAS / revenue attribution
  • Post-view / view-through attribution

This gap is not a reason to avoid ChatGPT ads. It is a reason to build your own reporting infrastructure. The advertisers who invest in measurement now will have a compounding advantage as the platform matures, because they will be optimizing against real business outcomes while competitors are still guessing based on impressions and clicks.

The 4-layer measurement stack

A complete ChatGPT ads measurement system has four layers. Each layer captures data that the others miss, and together they give you 85–95% visibility into the actual return on your ad spend.

Layer 1: UTM parameters. Every destination URL on every ChatGPT ad gets tagged with structured UTM parameters. This is the foundation. UTMs let your analytics platform (GA4) recognize ChatGPT as a distinct traffic source and attribute on-site behavior to specific campaigns and creatives.

Layer 2: GA4 event tracking. GA4 captures what happens after the click – page views, scroll depth, form submissions, demo requests, purchases, and any other conversion event you define. With UTM data flowing in, you can isolate ChatGPT traffic and build conversion funnels specific to that channel.

Layer 3: CRM pipeline attribution. For B2B companies and high-consideration purchases, the real value of a lead is not the form fill – it is the pipeline it generates. Your CRM tracks leads from initial capture through MQL, SQL, opportunity, and closed-won stages, tagged by original source.

Layer 4: Post-purchase surveys. Surveys capture the “dark funnel” – users who saw your ChatGPT ad, were influenced by it, but converted through a different channel like branded search or direct visit. Adding “ChatGPT” as an explicit source option on your post-purchase survey fills the attribution gap that digital tracking cannot.

Each layer feeds into your centralized dashboard. UTMs provide the clickstream data. GA4 provides the conversion events. Your CRM provides the revenue data. Surveys provide the self-reported influence data. When all four layers are connected, you can calculate true cost per acquisition and blended ROAS with confidence.

UTM setup for ChatGPT ads

UTM parameters are the connective tissue between your ChatGPT ad spend and your analytics platform. Without consistent UTM tagging, ChatGPT traffic shows up in GA4 as “unattributed” or gets lumped into “direct” – both of which make performance analysis impossible.

Use this structure for every ChatGPT ad destination URL:

?utm_source=chatgpt&utm_medium=cpc&utm_campaign={cluster}&utm_content={creative-id}

  • utm_source=chatgpt – identifies the traffic source as ChatGPT
  • utm_medium=cpc – classifies it as paid traffic
  • utm_campaign – maps to your topic cluster or campaign name (e.g., “project-management” or “spring-2026-launch”)
  • utm_content – identifies the specific creative variant (e.g., “headline-v2-desc-v3”)

Example destination URLs:

  • https://yoursite.com/demo?utm_source=chatgpt&utm_medium=cpc&utm_campaign=crm-for-startups&utm_content=free-trial-v1
  • https://yoursite.com/pricing?utm_source=chatgpt&utm_medium=cpc&utm_campaign=accounting-tools&utm_content=flat-fee-v2
  • https://yoursite.com/lp/remote-teams?utm_source=chatgpt&utm_medium=cpc&utm_campaign=pm-remote-teams&utm_content=slack-integration-v1

Create a shared spreadsheet template where your team logs every ad creative alongside its full UTM-tagged URL. This prevents inconsistencies (like one person using “chatgpt” and another using “chat-gpt”) that fragment your data. Include columns for creative ID, headline text, description text, destination URL, UTM campaign, UTM content, launch date, and status.

Tag every URL before launch. Retroactively adding UTMs means lost data for every click that happened before the fix. There is no way to recover untagged traffic in GA4.
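The tagging convention above is easy to enforce if URLs are generated rather than hand-typed. A minimal sketch follows; `build_utm_url` is a hypothetical helper (not an OpenAI or GA4 API), and the canonical constants encode the source/medium values this guide standardizes on.

```python
from urllib.parse import urlencode, urlparse

# Canonical values -- one source of truth prevents the "chatgpt" vs
# "chat-gpt" drift that fragments GA4 reporting.
UTM_SOURCE = "chatgpt"
UTM_MEDIUM = "cpc"

def build_utm_url(base_url: str, campaign: str, creative_id: str) -> str:
    """Append the standard ChatGPT-ads UTM set to a destination URL."""
    params = {
        "utm_source": UTM_SOURCE,
        "utm_medium": UTM_MEDIUM,
        "utm_campaign": campaign,
        "utm_content": creative_id,
    }
    # Respect URLs that already carry a query string.
    sep = "&" if urlparse(base_url).query else "?"
    return base_url + sep + urlencode(params)

url = build_utm_url("https://yoursite.com/demo", "crm-for-startups", "free-trial-v1")
```

A script like this can also feed the shared tracking spreadsheet, so the logged URL and the live URL can never disagree.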

GA4 configuration

With UTMs in place, GA4 can recognize ChatGPT as a distinct traffic source. But you need to configure GA4 properly to get actionable data out of it.

Set up ChatGPT as a recognized source

By default, GA4 does not recognize “chatgpt” as a known traffic source. It will classify your UTM-tagged traffic under the generic “Unassigned” channel group unless you create a custom channel group.

In GA4 Admin, go to Channel Groups and create a custom channel that matches source = chatgpt and medium = cpc. Name it “ChatGPT Ads” or “ChatGPT CPC.” This ensures ChatGPT traffic appears as its own line item in every channel-level report, not buried inside “Other.”

Define custom conversion events

Set up GA4 events for every meaningful on-site action. The specific events depend on your business model, but common ones include:

  • generate_lead – form submission, demo request, or quote request
  • sign_up – account creation or free trial start
  • purchase – completed transaction with revenue value
  • add_to_cart – product added to cart (e-commerce)
  • scroll_depth_75 – user reached 75% of page (content engagement)

Mark each event as a key event (GA4’s current term for conversions) under GA4 Admin > Events. This lets you build conversion-rate reports filtered by ChatGPT traffic.

Build a ChatGPT-specific exploration report

Create an Exploration report in GA4 that filters exclusively for source = chatgpt. Include dimensions like campaign, content (creative ID), landing page, and device category. Include metrics like sessions, engaged sessions, engagement rate, conversions, and conversion rate.

This report becomes your operational view for ChatGPT ad performance. You can see which campaigns drive traffic, which creatives drive engagement, which landing pages convert, and where users drop off in the funnel.

Map the conversion funnel

For ChatGPT ads, the typical funnel has four stages:

  1. Ad click – user clicks the ChatGPT ad (tracked by OpenAI)
  2. Landing page visit – user arrives on your site (tracked by GA4 via UTM)
  3. Micro-conversion – user takes an engagement action like scrolling, clicking a CTA, or starting a form (tracked by GA4 events)
  4. Macro-conversion – user completes the primary goal like a purchase, demo request, or signup (tracked by GA4 events)

Build a Funnel Exploration in GA4 using these four steps, filtered to source = chatgpt. The funnel visualization shows exactly where ChatGPT traffic drops off and where the conversion bottlenecks are.
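The same drop-off analysis the Funnel Exploration visualizes can be computed directly. The stage counts below are hypothetical, purely to make the arithmetic concrete:

```python
# Hypothetical counts for the four-step ChatGPT funnel described above.
funnel = [
    ("ad_click", 417),
    ("landing_page_visit", 390),
    ("micro_conversion", 120),
    ("macro_conversion", 20),
]

def stage_rates(stages):
    """Step-to-step conversion rate for each stage after the first."""
    rates = {}
    for (_, prev_count), (name, count) in zip(stages, stages[1:]):
        rates[name] = round(count / prev_count, 3)
    return rates

rates = stage_rates(funnel)
# A gap between ad clicks and landing page visits usually means
# slow-loading pages or broken UTM tagging -- worth checking first.
```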

CRM integration

GA4 tells you what happens on your website. Your CRM tells you what happens after the website – which leads become qualified, which enter the sales pipeline, and which generate revenue. For B2B companies, the CRM is where you measure real ROI.

Tag leads at the point of capture

When a lead submits a form on your site, pass the UTM parameters into your CRM record. Most form tools (HubSpot Forms, Typeform, Gravity Forms) can capture hidden UTM fields automatically. The lead record should include source = chatgpt, the campaign name, and the creative ID. This lets you trace every CRM record back to the specific ChatGPT ad that generated it.
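On the server side, the same UTM values can be parsed back out of the landing URL and mapped onto CRM properties. A sketch with illustrative field names — these are not any specific CRM’s schema, so map them to whatever your source/campaign properties are actually called:

```python
from urllib.parse import urlparse, parse_qs

def crm_fields_from_landing_url(landing_url: str, email: str) -> dict:
    """Extract UTM values into hidden-field-style CRM properties.

    Field names here are illustrative placeholders.
    """
    q = parse_qs(urlparse(landing_url).query)

    def first(key: str) -> str:
        return q.get(key, [""])[0]

    return {
        "email": email,
        "original_source": first("utm_source"),
        "campaign": first("utm_campaign"),
        "creative_id": first("utm_content"),
    }

lead = crm_fields_from_landing_url(
    "https://yoursite.com/demo?utm_source=chatgpt&utm_medium=cpc"
    "&utm_campaign=crm-for-startups&utm_content=free-trial-v1",
    "jane@example.com",
)
```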

Track pipeline stages

Follow ChatGPT-sourced leads through your standard pipeline stages:

  • Lead – initial form submission or signup
  • MQL (Marketing Qualified Lead) – meets your marketing qualification criteria
  • SQL (Sales Qualified Lead) – accepted by sales for follow-up
  • Opportunity – active deal in your pipeline with estimated value
  • Closed-Won – deal closed, revenue recognized

At each stage, you can calculate stage-specific conversion rates for ChatGPT leads. This is where the quality story emerges. If ChatGPT leads convert from MQL to SQL at 40% while Google leads convert at 25%, that changes the cost-per-lead comparison entirely.
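The stage-rate comparison is simple arithmetic over CRM exports. The counts below are hypothetical, chosen only to mirror the 40% vs. 25% example above:

```python
# Hypothetical stage counts per channel, pulled from CRM reports.
pipeline = {
    "chatgpt": {"mql": 50, "sql": 20},
    "google":  {"mql": 200, "sql": 50},
}

def mql_to_sql_rate(channel: str) -> float:
    counts = pipeline[channel]
    return round(counts["sql"] / counts["mql"], 2)

# Despite a higher cost per lead, the channel with the better
# MQL-to-SQL rate may produce cheaper *qualified* pipeline.
```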

Calculate pipeline influence

Pipeline influence measures the total dollar value of deals that ChatGPT ads touched at any point in the buyer’s journey. Create a CRM report that shows total pipeline value where original_source = chatgpt, broken down by stage. This gives your executive team a concrete revenue number tied to ChatGPT ad spend.

Compare lead quality across channels

Build a CRM report that compares ChatGPT-sourced leads against Google and Meta on the metrics that matter: MQL-to-SQL rate, SQL-to-Opportunity rate, average deal size, and sales cycle length. Early data suggests ChatGPT leads tend to be more qualified because users describe their needs in detail before clicking, which acts as a natural pre-qualification filter. Your CRM data will confirm whether this holds true for your specific business.

Building the Looker Studio dashboard

Looker Studio (formerly Google Data Studio) is the best free option for building a unified cross-channel dashboard. It connects natively to GA4 and supports 800+ data sources through built-in and third-party connectors.

Step 1: Connect GA4

Add GA4 as a data source in Looker Studio. Select the property that receives your ChatGPT UTM-tagged traffic. GA4 is a native connector, so no third-party tools are needed. Once connected, you have access to all sessions, events, and conversions segmented by source, campaign, and content.

Step 2: Add Google Ads and Meta Ads

To compare ChatGPT performance against Google and Meta, you need those platforms’ data in the same dashboard. GA4 captures on-site behavior from all channels, but you also want platform-level spend and impression data.

Use a connector like Supermetrics or Windsor.ai to pull Google Ads and Meta Ads data directly into Looker Studio. Both tools support scheduled data refreshes, so your dashboard stays current automatically. Windsor.ai supports 70+ marketing data sources, while Supermetrics covers most major ad platforms including Google, Meta, LinkedIn, and TikTok.

Step 3: Blend data on date and campaign

Use Looker Studio’s data blending feature to join your GA4 data with your Google Ads and Meta Ads data. Blend on the date dimension first, then add campaign as a secondary join key where naming conventions align. This creates a unified dataset where you can compare spend, clicks, conversions, and ROAS across all three platforms side by side.
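The blend itself happens inside Looker Studio, but the join logic is worth making concrete. A pure-Python sketch with hypothetical rows and field names, joining on the same (date, campaign) keys:

```python
# Hypothetical daily rows from two sources, keyed on (date, campaign) --
# the same join keys the Looker Studio blend uses.
ga4_rows = [
    {"date": "2026-03-01", "campaign": "crm-for-startups",
     "sessions": 60, "conversions": 3},
]
spend_rows = [
    {"date": "2026-03-01", "campaign": "crm-for-startups", "spend": 720.0},
]

def blend(ga4, spend):
    """Left-join platform spend onto GA4 rows by (date, campaign)."""
    spend_by_key = {(r["date"], r["campaign"]): r["spend"] for r in spend}
    out = []
    for r in ga4:
        row = dict(r)
        row["spend"] = spend_by_key.get((r["date"], r["campaign"]), 0.0)
        row["cpa"] = row["spend"] / row["conversions"] if row["conversions"] else None
        out.append(row)
    return out

blended = blend(ga4_rows, spend_rows)
```

A left join from GA4 matters: days with sessions but no recorded spend should still appear in the dashboard rather than silently dropping out.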

Step 4: Build scorecards for key metrics

Add scorecard widgets for the metrics your team checks first: total spend, total conversions, blended CPA, blended ROAS, and cost per lead by channel. Use conditional formatting to highlight when a metric is above or below target.

Step 5: Add trend charts

Add time-series charts for impressions, clicks, CTR, conversions, and CPA, with each platform as a separate series. This lets you spot trends, seasonality, and the impact of creative changes or budget shifts over time. Include a date range selector so stakeholders can toggle between daily, weekly, and monthly views.

Cross-channel comparison template

The most valuable view in your dashboard is a cross-channel comparison table that puts ChatGPT alongside Google and Meta with consistent metrics. This is the table your CMO will look at first.

Metric         | ChatGPT  | Google Ads | Meta Ads
Impressions    | 83,300   | 250,000    | 500,000
Clicks         | 417      | 7,500      | 5,000
CTR            | 0.50%    | 3.00%      | 1.00%
CPC            | $12.00   | $4.00      | $2.00
LP CVR         | 4.8%     | 3.2%       | 2.1%
Leads          | 20       | 240        | 105
Cost / Lead    | $250     | $125       | $95
Pipeline Value | $120,000 | $960,000   | $315,000
Revenue        | $36,000  | $192,000   | $63,000
ROAS           | 7.2x     | 6.4x       | 6.3x

This template is illustrative. Your actual numbers will vary. The important thing is the structure: start with top-of-funnel metrics (impressions, clicks, CTR) where Google and Meta look stronger due to volume, then follow through to bottom-of-funnel metrics (pipeline value, revenue, ROAS) where ChatGPT’s higher intent signal often closes the gap or takes the lead.

Build this table in Looker Studio using blended data sources so it updates automatically. Add a date filter so you can compare performance across time periods and spot trends.
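As a sanity check on the dashboard formulas, the derived metrics in a column like the sample ChatGPT one can be recomputed from raw inputs. The numbers here come straight from the illustrative table above:

```python
# Raw inputs from the sample ChatGPT column (illustrative figures).
impressions, clicks, cpc = 83_300, 417, 12.00
leads, revenue = 20, 36_000

# Derived metrics the dashboard should reproduce.
spend = clicks * cpc            # total spend
ctr = clicks / impressions      # click-through rate
cost_per_lead = spend / leads   # blended cost per lead
roas = revenue / spend          # return on ad spend
```

Recomputing derived metrics like this catches blend mistakes early, such as spend double-counted across a join.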

Attribution models that work

Attribution determines how you assign credit for conversions across touchpoints. The model you choose changes the story your data tells, so picking the right one matters.

Last-click attribution

Last-click gives 100% of the conversion credit to the final touchpoint before the conversion. It is the simplest model and the easiest to implement.

Advantage: straightforward, easy to explain to stakeholders, no modeling required. Disadvantage: it misses ChatGPT’s influence when users discover you through a ChatGPT ad but convert later through branded search or a direct visit. If your sales cycle is longer than one session, last-click will systematically undervalue ChatGPT ads.

Time-decay attribution

Time-decay gives more credit to touchpoints closer to the conversion, with credit declining for earlier touchpoints. A common implementation uses a 7-day half-life: a touchpoint 7 days before conversion gets half the credit of a touchpoint on conversion day.

Advantage: captures multi-touch journeys while still weighting recent interactions more heavily. This works well for ChatGPT ads because it recognizes the initial discovery role without ignoring the closing touchpoint. Disadvantage: more complex to implement and explain.
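The 7-day half-life weighting is straightforward to compute. A minimal sketch, with hypothetical channel names and touchpoints expressed as days before conversion:

```python
# Time-decay credit: a touch 7 days before conversion earns half the
# weight of a touch on conversion day.
HALF_LIFE_DAYS = 7

def time_decay_credit(touchpoints):
    """touchpoints: list of (channel, days_before_conversion)."""
    weights = [(ch, 0.5 ** (days / HALF_LIFE_DAYS)) for ch, days in touchpoints]
    total = sum(w for _, w in weights)
    credit = {}
    for ch, w in weights:
        credit[ch] = credit.get(ch, 0.0) + w / total
    return credit

# A journey: ChatGPT ad 7 days out, branded search on conversion day.
credit = time_decay_credit([("chatgpt", 7), ("google_brand", 0)])
```

Here the ChatGPT touch carries weight 0.5 against the branded search’s 1.0, so ChatGPT receives a third of the credit instead of zero under last-click.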

Position-based (U-shaped) attribution

Position-based attribution uses a 40/20/40 split: 40% credit to the first touchpoint, 40% to the last touchpoint, and 20% distributed across all middle touchpoints. This model explicitly values both discovery and closing.

Advantage: gives proper credit when ChatGPT ads introduce users who later convert through another channel. If a user discovers you through a ChatGPT ad, researches on your site, then converts through a Google brand search, ChatGPT gets 40% credit instead of 0% under last-click.
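A sketch of the 40/20/40 split, assuming an ordered list of touchpoints. How to handle a journey with no middle touches is a design choice; here the leftover 20% is split between first and last:

```python
def position_based_credit(channels):
    """40% first touch, 40% last touch, 20% across the middle."""
    credit = {ch: 0.0 for ch in channels}
    if len(channels) == 1:
        credit[channels[0]] = 1.0
        return credit
    credit[channels[0]] += 0.4
    credit[channels[-1]] += 0.4
    middle = channels[1:-1]
    if middle:
        for ch in middle:
            credit[ch] += 0.2 / len(middle)
    else:
        # Two-touch journey: split the middle 20% between the ends.
        credit[channels[0]] += 0.1
        credit[channels[-1]] += 0.1
    return credit

# Discovery via ChatGPT, research via organic, close via brand search.
credit = position_based_credit(["chatgpt", "organic", "google_brand"])
```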

Recommended path

Start with last-click attribution during your first 60 days on ChatGPT ads. It gives you a conservative baseline and is simple enough that your team can act on the data immediately. After 60 days, when you have enough conversion data to see multi-touch patterns, migrate to time-decay attribution. Time-decay is the best balance of accuracy and simplicity for most advertisers and captures ChatGPT’s role in multi-touch journeys without overcomplicating the model.

Incrementality testing

Attribution models tell you which channels get credit for conversions. Incrementality testing tells you whether those conversions would have happened anyway. This is the gold standard for proving that ChatGPT ads are actually driving new business, not just claiming credit for conversions that were already going to happen.

Geo-holdout tests

Select two comparable markets (similar population size, demographics, and historical conversion rates). Run ChatGPT ads in one market and withhold them in the other. After four to six weeks, compare conversion rates between the two markets. The difference is the incremental lift attributable to ChatGPT ads.

Geo-holdout is the most rigorous incrementality test available. The key is selecting markets that are genuinely comparable. Use historical data to match markets on revenue, traffic volume, and seasonal patterns before starting the test.
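The lift arithmetic itself is simple; the market figures below are hypothetical, purely to show the calculation:

```python
# Hypothetical geo-holdout result after a six-week test.
test_market    = {"visitors": 40_000, "conversions": 520}  # ads running
control_market = {"visitors": 41_000, "conversions": 410}  # ads withheld

def incremental_lift(test, control):
    """Relative lift in conversion rate, test vs. matched control."""
    cr_test = test["conversions"] / test["visitors"]
    cr_control = control["conversions"] / control["visitors"]
    return round((cr_test - cr_control) / cr_control, 3)

lift = incremental_lift(test_market, control_market)
```

Before trusting a lift number, check that it exceeds the normal week-to-week variation between the two markets in the pre-test period; otherwise the "lift" may just be noise.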

On/off tests

A simpler alternative: run ChatGPT ads for two weeks, then pause them for two weeks, then run them again. Compare conversion rates during the on and off periods. Also monitor branded search volume and direct traffic during the off period – if both decline when ChatGPT ads are paused, that is evidence of incremental demand generation.

On/off tests are less rigorous than geo-holdouts because external factors (seasonality, competitor activity) can influence results. Run at least two on/off cycles to increase confidence in the findings.

Survey-based incrementality

Add a “How did you hear about us?” question to your post-purchase or post-signup flow with “ChatGPT” as an explicit option. This captures self-reported influence that digital attribution cannot track – users who saw a ChatGPT ad, remembered your brand, and later converted through a different channel.

Survey data is inherently less precise than controlled experiments, but it is cheap to implement and provides directional signal immediately. Combine survey responses with your attribution data for a more complete picture.

Reporting cadence

Different metrics matter at different time horizons. Checking ROAS daily is noise. Checking CTR monthly is too slow. Here is the cadence that works.

Daily (during learning phase, first 2–4 weeks)

  • Impressions – is your ad being served?
  • Clicks – are users engaging?
  • CTR – is the creative resonating?
  • Landing page visits (GA4) – is traffic arriving properly tagged?

Daily monitoring during the learning phase catches problems early: broken UTMs, landing page errors, or creative that is not generating engagement. Focus on traffic and engagement metrics. Do not make optimization decisions based on daily conversion data – the sample sizes are too small.

Weekly (once campaigns stabilize)

  • Conversions – how many leads, signups, or purchases?
  • Cost per conversion – are the unit economics working?
  • Creative performance ranking – which headline and description combinations are winning?
  • Landing page conversion rate – are certain pages outperforming others?

Weekly is the right cadence for optimization decisions. You have enough data to identify trends without overreacting to daily fluctuations. Pause underperforming creatives and increase budget on winners.

Monthly (strategic review)

  • CRM pipeline attribution – what pipeline value did ChatGPT ads generate?
  • Blended ROAS across channels – how does ChatGPT compare to Google and Meta?
  • Incrementality test results – is ChatGPT driving net-new demand?
  • Creative lifespan – are top creatives showing fatigue?
  • Budget allocation – should you shift spend toward or away from ChatGPT?

Monthly reviews are for strategic decisions: budget reallocation, channel mix changes, and executive reporting. This is where your Looker Studio dashboard and CRM data come together to tell the full ROI story.

Track performance with Lapis

Lapis helps you close the measurement gap from the creative side. When you generate ad creatives through Lapis, every asset comes with UTM-ready export URLs, making it easy to maintain consistent tracking across campaigns without manual URL building.

Lapis also generates creatives for Google, Meta, LinkedIn, and TikTok from the same prompt. This cross-platform creative generation means your ads share consistent messaging and visual identity across channels, which makes your cross-channel performance comparisons more meaningful. When creative variables are controlled, differences in performance metrics reflect genuine channel differences rather than creative inconsistencies.

For forecasting, Lapis provides performance projections across platforms so you can model expected results before committing budget. This helps you set realistic targets for your ChatGPT ads dashboard and avoids the common mistake of comparing a new channel against mature channels without accounting for learning-phase performance.

Try Lapis for free and start building your cross-channel creative library with consistent tracking built in.

For a detailed guide on measuring return on ChatGPT ad spend, read our ChatGPT ads ROI measurement guide. For campaign optimization strategies once your dashboard is live, see the ChatGPT ads optimization playbook. And for a comprehensive overview of the platform, start with our complete guide to ChatGPT ads.

Frequently Asked Questions

How do I track ChatGPT ad conversions?
Since OpenAI does not provide conversion tracking, use UTM parameters on destination URLs, GA4 event tracking for on-site actions, CRM source tagging for leads, and post-purchase surveys with ChatGPT as an explicit option. Together these four layers give you 85–95% visibility into actual conversions and ROI.
What metrics should I monitor for ChatGPT ads?
Daily: impressions, clicks, CTR. Weekly: landing page conversion rate, cost per conversion, creative performance ranking. Monthly: CRM pipeline attribution, blended ROAS across channels, incrementality test results.
How do I build a cross-channel ad dashboard with ChatGPT data?
Use Looker Studio to connect GA4 for ChatGPT traffic (via UTMs), then add Google Ads and Meta via third-party connectors like Supermetrics. Blend data on date and campaign dimensions to create a unified cross-channel view.
How do I set up GA4 to track ChatGPT ad traffic?
Add utm_source=chatgpt to all destination URLs. In GA4, create a custom channel group that recognizes chatgpt as a source. Set up custom events for key conversion actions and build an exploration report that filters specifically for ChatGPT traffic.
What attribution model works best for ChatGPT ads?
Start with last-click for simplicity during your first 60 days. Then migrate to time-decay attribution, which gives more credit to recent touchpoints while still recognizing earlier influences. This captures ChatGPT's role in multi-touch journeys better than last-click.
How do I prove ChatGPT ad ROI to my team?
Run incrementality tests: geo-holdout tests comparing markets with and without ChatGPT ads, or on/off tests pausing ChatGPT for two weeks and measuring the impact on branded search and direct traffic. Combine with CRM pipeline data showing ChatGPT-sourced lead quality.
How often should I review ChatGPT ad reporting?
Daily during the learning phase (first 2-4 weeks), focusing on traffic and engagement metrics. Weekly once campaigns stabilize, evaluating conversion and cost metrics. Monthly for strategic review of pipeline attribution, cross-channel ROAS, and budget allocation decisions.
What tools do I need for ChatGPT ads analytics?
GA4 for on-site tracking, Looker Studio for dashboard visualization, a CRM (HubSpot, Salesforce) for pipeline attribution, Supermetrics or Windsor.ai for cross-platform data connectors, and Lapis for performance forecasting across platforms.
