If Your Dashboard Looks Impressive, It’s Probably Useless
The messy systems you already use have more information value than your new tech stack.
The House You Regret Building
Every D2C/info product brand that's survived past $10M has analytics/reporting that works.
Maybe it's messy. Revenue checked in Shopify. Ad spend pulled from Facebook. Creative performance tracked in spreadsheets.
But it works. They've used these same numbers to grow from zero to hero.
The information value of what they tracked is self-evident in their growth.
Then a new genius CMO (fractional, of course) says, "We need proper analytics."
And here's where everything goes wrong. For the company, of course.
Instead of asking "What made our current system work?" we ask "What should a modern analytics stack look like?"
Instead of building on top of what works, we chase what looks sophisticated.
Instead of enhancing what got us here, we try to replace it entirely.
Six weeks later, beautiful dashboards are built. New tools integrated and paid for...
The decision maker takes one look and says: "This isn't what I wanted."
And they go back to checking Shopify every morning. While still paying the bill for analytics tools no one uses.
The Fundamental Mistake
I mean, these guys are not idiots. Try building a business and see for yourself. I have ultimate respect for them.
But we're all suckers at some level, aren't we?
How does the saying go? The biggest suckers for marketing are marketers.
It's the expert paradox. I mean, he's a fractional CMO who has worked for 28 startups, some of them 11 figures.
How could he not know better?
So we build what looks beautiful, what makes our business look more complex, what our peers would be amazed by... not what has information value.
Those messy spreadsheets? They tracked metrics with proven information value — numbers that actually changed decisions and drove the business to $10M.
The new dashboard? It tracks everything that's technically trackable. CAC, LTV, cohorts, attribution models. Impressive? Yes. Useful? Give me a break.
What We Forgot About Measurement
To understand why this keeps happening, we need to understand what measurement actually is.
Douglas Hubbard, author of How To Measure Anything, defines it beautifully (it feels really weird to call a definition of measurement beautiful): "A quantitatively expressed reduction of uncertainty based on one or more observations."
Not perfection. Not elimination of all doubt. Just reduction of uncertainty.
And here's the kicker: That reduction only has value if it changes a decision.
Hubbard calls this "information value" — the economic worth of reducing a specific uncertainty. It's calculated by asking: If I knew this number, what would I do differently? And what's the value of making that better decision?
In his research, Hubbard discovered something startling: "Most of the variables in a business case had an information value of zero." They don't actually influence any decision.
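To make "information value" concrete, here's a toy sketch in Python. The budget-shift scenario and every number in it are invented for illustration; the idea, comparing your best decision with the information against your best decision without it, is straight from Hubbard.

```python
# A toy sketch of Hubbard-style information value (the scenario and all
# numbers are hypothetical). Question: should we shift budget to a new
# channel whose payoff is uncertain?

def value_of_information(p_works: float, gain: float, loss: float) -> float:
    # Without more information, you commit to one choice up front:
    ev_shift = p_works * gain - (1 - p_works) * loss
    ev_stay = 0.0  # baseline: keep budget where it is
    best_without_info = max(ev_shift, ev_stay)

    # With (perfect) information, you shift only in the worlds where it works:
    best_with_info = p_works * gain

    return best_with_info - best_without_info

# Genuine uncertainty about an expensive bet: worth measuring.
print(value_of_information(0.5, 100_000, 80_000))    # 40000.0

# You'd shift budget regardless of what the number says: near-zero value.
print(value_of_information(0.99, 1_000_000, 1_000))  # 10.0
```

If knowing the number wouldn't change the decision, the measurement is worth almost nothing, no matter how precisely you compute it.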
The Vanity Metrics Trap
Think about your own business:
If CAC doubled tomorrow, what specifically would you do?
If your attribution model showed different channel mix, what would you shift?
If that cohort analysis revealed something surprising, what decision would change?
If the answer is "nothing" or "I'm not sure," then you're looking at zero information value.
Now think about those messy metrics that got you to $10M:
That Shopify revenue check every morning? It tells you if yesterday's changes worked
That Facebook ad spend checked manually? It triggers immediate budget decisions
That creative performance spreadsheet? It shapes tomorrow's creative brief
Ugly? Yes. But each one reduces uncertainty about a specific decision you make daily.
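And because that morning Shopify check has proven information value, it's worth making cheap rather than replacing. A minimal sketch, assuming a Shopify Admin API access token (SHOP and SHOPIFY_TOKEN are placeholder environment variables, and the API version is just an example):

```python
# Automating the "check Shopify every morning" ritual: same number, zero clicks.
import os
from datetime import date, timedelta

import requests

SHOP = os.environ["SHOP"]            # e.g. "my-store" (placeholder)
TOKEN = os.environ["SHOPIFY_TOKEN"]  # private-app access token (placeholder)

yesterday = date.today() - timedelta(days=1)
resp = requests.get(
    f"https://{SHOP}.myshopify.com/admin/api/2024-01/orders.json",
    headers={"X-Shopify-Access-Token": TOKEN},
    params={
        "status": "any",
        "created_at_min": f"{yesterday}T00:00:00Z",
        "created_at_max": f"{yesterday}T23:59:59Z",
        "fields": "total_price",
        "limit": 250,  # one page only; real use would follow Link headers
    },
    timeout=30,
)
resp.raise_for_status()
revenue = sum(float(o["total_price"]) for o in resp.json()["orders"])
print(f"{yesterday}: ${revenue:,.2f}")
```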
The sophisticated analytics we built to replace them? They might show perfect attribution across 17 touchpoints. But if that doesn't change how you allocate budget tomorrow, it has zero information value.
We threw away measurements that worked for measurements that look good.
The Function We Forgot
Measurement exists to serve decision-making under uncertainty.
Every measurement should reduce uncertainty about a specific decision. Not all uncertainty — just enough to make a better choice than you would without it.
Your messy system understood this intuitively. Each spreadsheet, each manual check, each "unprofessional" tracking method evolved because it answered a specific question that mattered.
The modern analytics stack? It answers every question except the ones you actually ask.
The Workflow Reality Nobody Talks About
Here's the irony: Even if those sophisticated metrics HAD information value, you probably couldn't measure them anyway.
Because measurement doesn't happen in a vacuum. It depends entirely on how your business actually operates.
Want to track creatives based on the emotions they try to elicit? Someone needs to tag creatives with emotions. Want clean LTV? You need consistent customer IDs. Want multi-touch attribution? Every touchpoint needs proper tracking. Cross-domain too!
When we design "proper analytics," we assume these workflows exist. We assume systematic tagging. We assume clean data. We assume consistent tracking.
But here's what actually exists:
Creative names like "v2_final_FINAL_actualfinal_steve_approved"
Customer records split between Shopify, email platform, and that CSV from 2019
Campaign tracking that sometimes has UTMs, sometimes doesn't
You can't measure the ROI of creative emotions if no one ever tags them. You can't calculate accurate LTV if customer data is chaos. You can't show attribution if half your campaigns aren't tracked.
The sophisticated dashboard isn't just solving the wrong problem — it's assuming a company that doesn't exist.
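You can put numbers on that gap before designing anything. A rough sketch of a data-readiness audit, where the CSV exports, column names, and the naming-convention regex are all hypothetical stand-ins for whatever your systems actually produce:

```python
# A rough data-readiness audit: before designing any dashboard, measure
# how much of the workflow the dashboard assumes actually exists.
import pandas as pd

campaigns = pd.read_csv("campaigns.csv")   # one row per campaign URL
customers = pd.read_csv("customers.csv")   # merged export from all systems
creatives = pd.read_csv("creatives.csv")   # one row per creative asset

# 1. Attribution assumes UTMs. How many links actually have them?
has_utm = campaigns["url"].str.contains("utm_source=", na=False)
print(f"UTM coverage: {has_utm.mean():.0%}")

# 2. LTV assumes one ID per customer. How many emails are duplicated?
dupes = customers["email"].str.lower().duplicated().mean()
print(f"Duplicate customer records: {dupes:.0%}")

# 3. Creative analytics assumes systematic naming, not "v2_final_FINAL".
tidy = creatives["name"].str.match(r"^\d{4}-\d{2}_[a-z]+_[a-z]+", na=False)
print(f"Creatives following a naming convention: {tidy.mean():.0%}")
```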
The Expensive Discovery Pattern
Here's when you discover these workflow problems:
With third-party tools like Triple Whale or Hyros:
Week 1: Onboarding and integration setup.
Week 2-3: Learning the interface and platform.
Week 4: Try to map your data to their model.
Week 5: Discover your workflow doesn't match their assumptions.
Week 6: Support tickets, workarounds, duct tape solutions.
Week 7: Half the features don't work because your data doesn't fit.
You pay monthly fees to discover the tool assumes a workflow you don't have.
With in-house engineering:
Week 1-2: New hire onboards, learns your stack.
Week 3-4: They design the "proper" data model.
Week 5-6: Build beautiful pipelines and dashboards.
Week 7: Connect to real data. Discover the business doesn't operate how they assumed.
Week 8+: Endless revisions that never quite match reality.
You invested in headcount to build something technically correct but practically useless.
With agencies:
Week 1: Discovery calls and requirements gathering.
Week 2-3: Build the pipelines.
Week 4: Deploy and connect real data.
Week 5: Discover nothing matches the assumptions.
Week 6: Scramble to fix what can't be fixed.
Week 7: Deliver a dashboard that works perfectly... for a company that doesn't exist.
You paid for expertise to learn that your workflow doesn't support the analytics you thought you wanted.
The pattern is always the same: You discover the mismatch between analytics and workflow only after significant investment — in time, money, or both.
The Fix: Test Before You Build
What if you could discover all these problems in Week 1?
What if before writing a single line of code, before building any pipelines, before paying for any tools, you knew exactly what would and wouldn't work?
You can. But it requires flipping the entire process.
Build the Model Before the House
Instead of building real pipelines and hoping they match reality, we build fake dashboards that reveal reality.
Here's how:
Week 1: Start With What Works
First, we map what actually exists. Not what should exist. What does.
Those Shopify checks every morning? Document them. Those spreadsheets the whole company uses every day? Understand them. Those manual processes? Map them.
Every messy system that got you to $10M contains wisdom. It evolved for a reason. It serves a specific decision. Don't dismiss it — decode it.
Learn the business bottom-up, without the epistemic arrogance of top-down.
Week 2: Build the Future with Fake Data
Now we build your entire dashboard. Every chart you think you want. Every metric you've dreamed of. Every filter and breakdown.
But here's the twist: It's all fake data.
The dashboard looks real. It feels real. You can click through it, filter it, interact with it. But the numbers are made up. Simulated. Fictional.
Here's what most people don't realize: Building a dashboard is easy. A competent analyst can mock up a complete dashboard in a few hours. The charts, the filters, the visualizations — that's the simple part.
What's hard? Building the pipelines to feed real data into it. The ETL processes. The data cleaning. The API connections. The transformation logic. That's where 90% of the time and cost goes.
But we don't do any of that. Yet.
Instead, we populate your dream dashboard with synthetic data that behaves like real data would. Same patterns, same relationships, just fictional numbers.
This is like an architect's 3D walkthrough. You can experience your future analytics before wasting weeks on pipeline engineering.
We spend a few days on what usually takes weeks — because we skip the expensive part until we know it's worth building.
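For a flavor of what "synthetic data that behaves like real data" means, here's a minimal sketch. The ROAS level, the weekend dip, and every number below are invented; the goal is plausible patterns and relationships, not accuracy:

```python
# Simulate numbers that behave like the real ones would (spend drives
# revenue, weekends dip, noise everywhere) so the dashboard can be
# clicked through before any pipeline exists.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
days = pd.date_range("2024-01-01", periods=90, freq="D")

spend = rng.normal(5_000, 800, len(days)).clip(min=0)            # daily ad spend
weekend_dip = np.where(days.dayofweek >= 5, 0.8, 1.0)            # sales drop on weekends
revenue = spend * rng.normal(3.2, 0.4, len(days)) * weekend_dip  # ~3x ROAS plus noise

fake = pd.DataFrame({
    "date": days,
    "spend": spend.round(2),
    "revenue": revenue.round(2),
    "roas": (revenue / spend).round(2),
})
fake.to_csv("fake_metrics.csv", index=False)  # feed this to the dashboard
```

Point the dashboard at fake_metrics.csv and every chart renders exactly as it would with production data.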
Week 3: The Truth Surfaces
You click through your "dashboard." You try to use it like you would in real life.
And suddenly, reality hits:
"Wait, this creative emotion breakdown... who's going to tag emotions?" "This customer journey view... our data doesn't connect like that." "This attribution model... half our campaigns don't have tracking."
Every workflow gap — the ones that normally surface after six weeks of engineering — appears immediately.
But here's the difference: You haven't built anything yet. These aren't expensive fixes to production systems. They're just notes on a prototype.
Week 4: The Negotiation
Now comes the valuable conversation:
"To get this metric, your team needs to change how they tag creatives. Will they do that?" "To track this properly, we need consistent UTMs. Can we enforce that?" "To calculate accurate LTV, we need to clean up customer data. Is that worth it?"
Sometimes the answer is yes — the information value justifies changing the workflow.
Sometimes it's no — the workflow is sacred, so we adjust the analytics to match reality.
Sometimes it's "let's start simple" — we build what's possible now and add complexity only after proving value.
This conversation happens BEFORE building anything, not after wasting weeks of time and money.
Why This Works: Observable Consequences
Hubbard teaches us that if something matters, it must have observable consequences.
Our fake dashboard forces this question early: "If this metric changes, what do you do differently?"
If the answer is concrete and valuable, we've found real information value. We build it.
If the answer is vague or "nothing," we've identified vanity metrics. We skip them.
If the answer is "we would if we could track it properly," we've surfaced a workflow gap. We fix it or work around it.
Every decision gets made with clarity, before any code is written.
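You can even run this triage mechanically. A small sketch, where the metrics and the decisions attached to them are illustrative examples drawn from the scenarios above:

```python
# Each proposed metric gets two answers: what decision changes when the
# number moves, and whether today's workflow can actually produce it.
proposed_metrics = {
    # metric: (decision it changes, trackable with the current workflow?)
    "daily_revenue": ("pause or scale yesterday's changes", True),
    "creative_emotion_roi": ("rewrite tomorrow's creative brief", False),
    "17_touchpoint_attribution": (None, False),  # nobody named a decision
}

for metric, (decision, trackable) in proposed_metrics.items():
    if decision is None:
        verdict = "skip: vanity metric, zero information value"
    elif not trackable:
        verdict = "workflow gap: change the process or adjust the metric"
    else:
        verdict = "build it"
    print(f"{metric} -> {verdict}")
```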
The Result: Analytics That Actually Works
When you build this way:
You keep what works: That Shopify check stays. We just make it easier.
You add only what's valuable: New metrics only if they change decisions.
You match reality: The dashboard fits your actual workflow, not some theoretical ideal.
You avoid waste: No abandoned dashboards. No unused features. No wasted pipelines.
Most importantly: The decision maker doesn't say "This isn't what I wanted" because they already approved exactly what they're getting.
The New Rules
Never build pipelines before prototypes
Never build prototypes before understanding what already works
Never assume workflows that don't exist
Start with information value. Test with fake data. Build only what survives contact with reality.
The Resolution
Remember: Every business that survived to $10M already has measurements that work.
Don't throw them away for vanity metrics. Don't assume workflows that don't exist. Don't build before testing.
Because in the end, the best measurement system isn't the most sophisticated one.
It's the one that actually reduces uncertainty about decisions you make every day.
And you already have the seeds of that system. You just need to nurture it, not replace it.