The 0.4x multiple tax on bad data



Hey there Reader!

I spent last weekend in Fort Lauderdale with Greg Isenberg and a room full of founders who are quietly building the future.

The topics ranged from taste and human connection to the bleeding edge of AI agents, Claude Code, and open source tooling that most people won't discover for another 12-18 months.

The knowledge shared in that room isn't the top 1% of what's possible with AI right now. It's the top 0.1%.

That comes with a responsibility to put it to work. The gap between what's possible and what most businesses know is possible is wider than I've ever seen. And it's accelerating.

For PE firms and sponsor-backed companies preparing for exits: this is the unlock. AI that actually works, applied to the data and diligence challenges that make or break valuations. The next twelve months will separate the prepared from the exposed.

What an exciting time to be alive. I've had plenty of chances to put new tools and connections into practice during the big freeze in the mid-Atlantic.

Watch this space!

This Week in Mid-Market PE & Data

Mid-market hesitation giving way to renewed deal appetite for 2026 - Caberhill’s PE survey shows buyers eager to deploy near-record dry powder, but creativity in structuring is now key to bridging valuation gaps

McKinsey: AI and data science teams top of mind for PE investors - Building best-in-class data science teams and AI-enabled value creation initiatives are becoming portfolio-wide priorities as exit backlogs grow

70% of PE firms will use more data in diligence, 68% will use more tech vs. people - West Monroe survey shows the shift from risk mitigation to value creation is driving automation and analytics adoption across the deal lifecycle

Now this week's deep dive - how bad data actually plays out during an exit.

DEEP DIVE

The 0.4x Multiple Tax You’re Paying for Bad Data

Most companies think they have clean data.

They have a modern ERP. Regular financial audits. Dashboards upon dashboards. The board loved the quarterly updates.

Then they hired a banker to prep for exit. Two weeks into diligence prep, the banker asked a simple question: “Can you show me cohort-level retention by customer segment?”

The answer came back five days later. With caveats. And footnotes explaining why certain segments weren’t comparable.

That hesitation cost them 0.4x on their multiple. On a $50M business, that’s $2M left on the table.

The math is simple. The execution is not.

GF Data analyzed 360 transactions and found something most operators miss: sellers who commissioned sell-side Quality of Earnings reports alongside data quality assessments commanded 7.4x EBITDA multiples, versus 7.0x for those who didn't.
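For scale, here's that gap in back-of-envelope terms. A minimal Python sketch, assuming (purely for illustration) a $50M-revenue business at a roughly 10% EBITDA margin, which is consistent with the $2M figure above:

# Back-of-envelope impact of the 7.4x vs. 7.0x gap from the GF Data study.
# Assumption (illustrative only): $50M of revenue at a ~10% EBITDA margin.
revenue = 50_000_000
ebitda = revenue * 0.10                  # ~$5M

ev_prepared = 7.4 * ebitda               # sell-side QoE plus data quality work
ev_unprepared = 7.0 * ebitda             # everyone else

print(f"Left on the table: ${ev_prepared - ev_unprepared:,.0f}")
# Left on the table: $2,000,000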

That 0.4x difference isn’t about having better numbers. It’s about buyer confidence in the numbers you show them.

Because here’s what happens during diligence when your data doesn’t hold up:

Scenario 1: The Revenue Recognition Question

Buyer asks: “Walk me through how you recognize revenue for multi-year contracts.”

Good answer: “Here’s the policy. Here’s the source system. Here’s how it flows to financials. Here are three customer examples showing it in practice.”

Red flag answer: “Let me check with our accounting team and get back to you.”

The good answer takes 20 minutes. The red flag answer triggers three more questions and a special diligence workstream.

Scenario 2: The Customer Cohort Request

Buyer asks: “Show me year-over-year retention by customer segment and acquisition channel.”

Good answer: Segmentation exists. Data is clean. Answer arrives in 48 hours with confidence intervals noted.

Red flag answer: “We don’t track it that way, but we can pull something together.”

Translation: Your data model wasn’t built for this question. The buyer now wonders what other questions you can’t answer.
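If you're wondering what "tracking it that way" looks like in practice, here's a minimal pandas sketch. The file and column names (orders.csv, customer_id, segment, acquisition_channel, order_date) are placeholders, not a prescription for your schema:

import pandas as pd

# Illustrative only: assumes an orders extract with one row per transaction.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])
orders["year"] = orders["order_date"].dt.year

# Active customers per (segment, acquisition channel, year).
active = {key: set(ids) for key, ids in
          orders.groupby(["segment", "acquisition_channel", "year"])["customer_id"]}

def yoy_retention(segment, channel, year):
    """Share of the prior year's customers in this bucket who bought again."""
    prior = active.get((segment, channel, year - 1), set())
    current = active.get((segment, channel, year), set())
    return len(prior & current) / len(prior) if prior else None

# Example call; the segment and channel labels are placeholders too.
print(yoy_retention("Enterprise", "Outbound", 2024))

The point isn't this exact query. It's that the question should be answerable from your data model without a special project.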

Scenario 3: The Source System Audit

Buyer asks: “How many systems feed your revenue reporting, and can you show me the reconciliation logic?”

Good answer: “Three core systems. Here’s the data lineage documentation. Here’s the monthly reconciliation process. Last discrepancy was $4,200 in Q2, documented and resolved.”

Red flag answer: “Our finance team handles the reconciliation manually each month. It usually balances.”

That word “usually” just cost you negotiating leverage.
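Getting from "usually" to "documented and resolved" doesn't require a big platform. A minimal sketch of an automated monthly check, assuming revenue extracts from your billing system and general ledger (file names, columns, and the tolerance are placeholders):

import pandas as pd

# Illustrative only: monthly revenue per the billing system vs. the GL.
billing = pd.read_csv("billing_revenue.csv")   # columns: month, revenue
gl = pd.read_csv("gl_revenue.csv")             # columns: month, revenue

recon = billing.merge(gl, on="month", suffixes=("_billing", "_gl"))
recon["variance"] = recon["revenue_billing"] - recon["revenue_gl"]

TOLERANCE = 500  # whatever threshold you and your auditors agree to document
exceptions = recon[recon["variance"].abs() > TOLERANCE]

# Every row here needs a documented explanation before diligence starts.
print(exceptions[["month", "revenue_billing", "revenue_gl", "variance"]])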

Why this keeps happening

Most mid-market companies build their data infrastructure to run the business, not to sell the business.

That’s rational. Until it’s not.

You optimize for operational reporting - dashboards that tell you if you’re hitting plan, if payroll cleared, if inventory is moving. Those systems work fine for Monday morning leadership meetings.

But exit diligence asks different questions. Buyers want cohort analysis. Customer lifetime value by segment. Revenue bridge detail at SKU level. Margin decomposition that reconciles to GAAP financials.

If you can’t produce those answers cleanly and quickly, buyers do one of three things:

  1. Discount the purchase price to account for integration risk
  2. Structure earnouts to shift risk back to you
  3. Walk away entirely if the data raises too many flags

All three options cost you money.

What good looks like

I’m not talking about perfect data. That doesn’t exist.

I’m talking about defensible data. Data you can stand behind during questioning. Data that comes with lineage, documentation, and confidence.

Here’s the practical test: If your banker asks your finance team a detailed analytical question, how long does it take to get an answer you’d bet your valuation on?

If the answer is 48 hours or less, you’re probably ready.

If the answer is “let me check with IT” or “we’ll need to pull that manually,” you’re not.

The work that matters

Three months before you talk to a banker, run your own diligence:

1. Audit your source systems. Map every system that feeds financial reporting. Document the connections. Know where the gaps are before a buyer finds them.

2. Test the cohort questions. Ask your team to produce the analysis buyers will request: customer retention, revenue bridges, margin decomposition, cohort-level economics. If they struggle, fix it now.

3. Document your data lineage. Show how data flows from operational systems to financial reporting. Include reconciliation logic. Note any manual steps. This becomes your answer to “how do you know these numbers are right?”

This work isn’t glamorous. It doesn’t show up in EBITDA. Your leadership team won’t applaud you for it.

But it protects your multiple. And that’s worth more than most operational improvements you could make in the same timeframe.

The question that matters

Six months from now, if a buyer asks your CFO to explain a variance in your customer retention data, will they have the answer in 48 hours?

If not, you’re paying the 0.4x tax. You just don’t know it yet.

FINAL SEND OFF

Thanks for reading! If this resonates or if there’s something you think we can help with, just reply to this email. We read everything that comes through.

Graeme

Graeme Crawford

CEO at Crawford McMillan

Helping PE firms protect and grow company valuations with clean, reliable data.

CRAWFORD McMILLAN
Professional data consultancy with 25+ years of experience

The content provided in this newsletter, including discussions regarding data architecture, operational efficiency, valuation readiness, and business strategy, is intended strictly for informational and educational purposes only. Decisions regarding capital allocation, investments, acquisitions, or business strategy should always be made in consultation with qualified professional financial, legal, and investment advisors.

600 1st Ave, Ste 330 PMB 92768, Seattle, WA 98104-2246
If you wish to change the emails you receive, please visit Preferences. If you wish to no longer receive communication via the newsletter, please Unsubscribe here.