Why Data Drives Marketing
Marketing without data is expensive guesswork. Marketing with data is a system you can improve. This module establishes why measurement matters — and how the best marketers think about it.
The company that doubled revenue by measuring less — more carefully
In 2021, a SaaS company was tracking 47 different marketing metrics across 6 dashboards. Every Monday, the marketing team presented a 30-slide deck full of graphs. Impressions, clicks, opens, followers, share of voice, bounce rate, dwell time, engagement rate, reach, frequency — all of it.
The CEO made a decision that felt counterintuitive: cut the dashboard to four numbers.
- Marketing qualified leads (MQLs) per month
- Cost per MQL
- MQL-to-customer conversion rate
- Customer acquisition cost (CAC)
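The four numbers reduce to simple arithmetic over three inputs: spend, MQLs, and customers won. As a sketch (the figures below are invented for illustration):

```python
# Illustrative calculation of the four dashboard metrics.
# All input figures are made up for the example.
marketing_spend = 48_000   # total marketing spend for the month (£)
mqls = 320                 # qualified leads generated this month
new_customers = 40         # customers won from those MQLs

cost_per_mql = marketing_spend / mqls         # £150.00
mql_to_customer_rate = new_customers / mqls   # 12.5%
cac = marketing_spend / new_customers         # £1,200

print(f"MQLs: {mqls}")
print(f"Cost per MQL: £{cost_per_mql:.2f}")
print(f"MQL-to-customer conversion: {mql_to_customer_rate:.1%}")
print(f"CAC: £{cac:.2f}")
```

Note that the fourth number is derivable from the others; the value of the dashboard is that everyone watches the same four figures, computed the same way every month.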
Everything else was audited against a simple question: Does this number help us make a decision we couldn't make without it? If not, it was removed.
Six months later, the team had stopped optimising for vanity metrics. Budget had shifted from channels that looked impressive to channels that generated the four numbers. Revenue doubled.
More data isn't better data. The right data, measured consistently, is.
The measurement hierarchy
Not all metrics are equal. They exist in a hierarchy of usefulness:
- Business outcomes: revenue, customers acquired, CAC
- Marketing outcomes: qualified leads, conversion rates
- Channel metrics: email open rate, click-through rate
- Activity metrics: impressions, clicks, followers, reach
The trap: Most marketing teams spend 80% of their time looking at activity metrics at the bottom — the ones easiest to see, easiest to move, and least connected to business results.
The discipline: Work from the top down. Define your business outcome. Identify which marketing outcomes drive it. Track channel metrics only to diagnose why outcomes are moving. Measure activity metrics only when they're connected to channel metrics through a proven chain of causation.
The analytics mindset
There are two ways to use marketing data:
Reporting: Describing what happened. "Our email open rate was 34% last month."
Analysis: Understanding why it happened and what to do about it. "Our email open rate dropped 8 points when we increased send frequency from weekly to twice-weekly — and revenue per email also dropped. The data suggests we're trading quality of engagement for volume of sends. We should revert to weekly and monitor."
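The tradeoff in that analysis can be sketched as a quick calculation. The figures below are hypothetical; the point is comparing revenue per recipient per week, not the numbers themselves:

```python
# Hypothetical cadences: revenue per recipient per send drops
# when send frequency doubles, so total weekly revenue can fall.
weekly = {"sends_per_week": 1, "revenue_per_email": 0.50}       # £ per recipient per send
twice_weekly = {"sends_per_week": 2, "revenue_per_email": 0.22}

def weekly_revenue(cadence):
    """Revenue per recipient per week under a given send cadence."""
    return cadence["sends_per_week"] * cadence["revenue_per_email"]

print(weekly_revenue(weekly))        # prints 0.5
print(weekly_revenue(twice_weekly))  # prints 0.44: more sends, less revenue
```

This is the difference between the two modes: reporting would stop at the open rate; analysis multiplies through to the number the business cares about.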
Most marketing "analytics" is actually reporting. True analysis generates hypotheses, tests them, and informs decisions. The distinction matters because reporting creates work without creating insight.
The five questions of marketing analysis:
- What happened? (Reporting)
- Why did it happen? (Diagnosis)
- What will happen if we do nothing? (Forecast)
- What should we do? (Recommendation)
- How will we know if it worked? (Measurement plan)
A genuine analysis answers all five. A report answers only the first.
The data quality problem
Bad data is worse than no data. It creates confidence in wrong conclusions.
Common data quality failures in marketing:
Attribution gaps: A customer clicked a Google ad, later saw a Meta ad, then searched the brand name and purchased. Google Ads, Meta Ads, and Google Analytics may each claim the sale. The total "revenue" across platforms might be 3× actual revenue — not because each is lying, but because each is telling its own version of the truth.
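The overlap can be made concrete with a quick sum. The platform figures below are hypothetical, but the mechanism is the one described above: each platform claims the full sale for its own touchpoint.

```python
# Hypothetical platform-reported revenue for the same month.
# Each platform attributes the whole sale to itself, so claims overlap.
platform_claims = {
    "Google Ads": 30_000,
    "Meta Ads": 28_000,
    "Google Analytics": 32_000,
}
actual_revenue = 31_000  # from the payment processor, the source of truth

claimed_total = sum(platform_claims.values())  # 90,000
inflation = claimed_total / actual_revenue     # roughly 2.9x actual
print(f"Claimed: £{claimed_total:,} vs actual: £{actual_revenue:,} "
      f"({inflation:.1f}x actual)")
```

The practical rule that follows: reconcile platform-reported revenue against the payment processor before trusting any per-channel ROI figure.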
Tracking gaps: An estimated 10–25% of website visitors may be invisible to standard analytics for typical audiences — tech-savvy or European consent-required audiences can reach 30–40% (per various industry analyses including Ahrefs and browser privacy reports, 2022–2023; verify as privacy defaults continue to evolve) — due to ad blockers, privacy settings, and iOS restrictions. Your "total users" in GA4 is an undercount.
Definition inconsistency: Your marketing team calls someone a "lead" when they fill in any form. Your sales team calls them a "lead" only if they've expressed purchasing intent. Six months of "lead" data is meaningless without a shared definition.
Metric gaming: Once a team is measured on a metric, they find ways to improve it without improving the underlying reality. "Impressions" can be gamed. "Unique visitors" can be inflated. "Leads" can be padded with low-quality submissions.
The solution: Define metrics precisely, measure them consistently, and audit your tracking stack quarterly. Data you can't trust is worse than no data.
There Are No Dumb Questions
"How do I know which metrics to track?"
Start from the business goal and work backwards. If the goal is "acquire 100 new customers this quarter at a CAC under £200," the metrics you need are: customers acquired, total marketing spend, and CAC. To understand why CAC is where it is, you need channel-level conversion data. To optimise individual channels, you need campaign and creative metrics. The decision drives the metric — not the other way around. Don't track something just because your analytics tool offers it.
"How often should I look at my marketing data?"
It depends on the metric. Daily for anomaly detection (sudden spend spikes, deliverability drops). Weekly for performance review (channel KPIs, campaign metrics). Monthly for strategic analysis (trend analysis, budget allocation). Quarterly for business outcomes (CAC trend, LTV development, ROI by channel). Looking at monthly trends daily produces noise. Looking at daily metrics monthly loses the ability to catch problems early.
Vanity vs. Actionable Metrics
Building your measurement plan
Before running any campaign or publishing any content, define your measurement plan: the objective, the success metric, the diagnostic metrics that explain movement in it, the tracking setup, and the review cadence.
Example measurement plan — email nurture sequence:
- Objective: Convert trial users to paid subscribers
- Success metric: Trial-to-paid conversion rate
- Diagnostic metrics: Email open rate, click rate, revenue per email, unsubscribe rate per email
- Tracking setup: Email platform revenue tracking linked to subscription system; UTM parameters on all CTA links
- Review cadence: After each email in sequence completes 14-day window; sequence performance reviewed monthly
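For teams that keep plans alongside their tooling, the template can be encoded as a small data structure. This is a sketch of one possible shape, populated from the example plan above, not a standard format:

```python
from dataclasses import dataclass

# A minimal measurement-plan template; field names mirror the example plan.
@dataclass
class MeasurementPlan:
    objective: str
    success_metric: str
    diagnostic_metrics: list
    tracking_setup: str
    review_cadence: str

plan = MeasurementPlan(
    objective="Convert trial users to paid subscribers",
    success_metric="Trial-to-paid conversion rate",
    diagnostic_metrics=[
        "email open rate", "click rate",
        "revenue per email", "unsubscribe rate per email",
    ],
    tracking_setup="Email platform revenue tracking linked to subscription "
                   "system; UTM parameters on all CTA links",
    review_cadence="After each email completes its 14-day window; "
                   "sequence reviewed monthly",
)
print(plan.success_metric)
```

Writing the plan down in a fixed shape makes the "success metric" field impossible to skip, which is the point of defining it before launch.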
Build Your Marketing Measurement Plan
Back to the SaaS team
Six months after cutting from 47 metrics to 4, the Monday morning deck was gone. In its place: a single shared dashboard with four numbers, updated automatically.
The conversation changed. Instead of "our Instagram engagement was up 14%," it became "MQL cost is up 23% this month — where did the shift happen?" That's a question you can act on.
Revenue doubled not because the marketing got better overnight, but because the team stopped optimising for metrics that didn't connect to outcomes. Cutting the dashboard was the first real marketing decision they made.
Key takeaways
- The measurement hierarchy separates signal from noise. Business outcomes → marketing outcomes → channel metrics → activity metrics. Work top-down; don't let activity metrics substitute for business outcomes.
- Reporting and analysis are different things. Reporting describes what happened. Analysis explains why and recommends action. Most marketing "analytics" is reporting with a chart.
- Bad data creates confident wrong decisions. Attribution gaps, tracking limitations, and inconsistent definitions make bad data common. Audit your data quality before you trust it.
- Define success before the campaign starts. A measurement plan written in advance forces clarity about what you're trying to achieve — and gives you something honest to measure against.
- Track fewer things, more carefully. The SaaS company that cut its dashboard from 47 metrics to 4 and doubled revenue is not an outlier. Clarity beats comprehensiveness.
Knowledge Check
1. A marketing director reports that the team's social media following grew 40% this quarter, email open rates are up 12 points, and the blog received 80,000 page views. The CEO asks: "How many new customers did we acquire?" The director doesn't know. What does this illustrate?
2. An e-commerce brand's Google Ads reports £85,000 in revenue this month. Meta Ads reports £62,000. GA4 reports £94,000 total revenue. The actual revenue from the payment processor is £97,000. What explains the discrepancy between platform figures and actual revenue?
3. A marketing manager is presented with the following data for a campaign: 2.1M impressions, 18,000 clicks (0.86% CTR), 900 form submissions (5% conversion rate), 45 sales conversations booked (5% submission-to-meeting rate), 9 customers closed (20% meeting-to-close rate). Which metric most directly measures marketing success?
4. A content team is tasked with improving the company's "engagement rate" as their primary KPI. After six months, engagement rate has improved 40% — but the sales team reports that the quality of inbound leads has declined. What is the most likely cause?