When a new client engagement starts with "we need better dashboards," I know we are going to spend the first month not building dashboards.
The dashboard trap
Dashboards feel like progress because they are tangible. A row of sparklines and a big number in the top-left give you something to point at in a meeting. But most of the time, the dashboard request is a symptom of a deeper problem: the team does not know what questions matter.
You can tell by how these projects go. The CEO wants a dashboard. The analyst builds one with 40 metrics. The CEO says it is too busy. The analyst builds one with 10. The CEO says it is not detailed enough. Eventually someone declares it "good enough," the dashboard gets opened twice in the first week, and then never again.
What happened? The team did not actually have a decision that the dashboard was supposed to inform. Without a decision, every metric feels equally important, which is the same as saying no metric is important.
The question-first approach
Every dashboard project should start by answering: "What decision will this dashboard change?" If you cannot name a specific decision — "whether to increase paid spend next month," "whether the new pricing is working," "whether to cut a product line" — you are not ready to build a dashboard. You are ready to have a harder conversation about what you are trying to run.
Once you know the decision, the dashboard design flows from it. You need the 3-5 signals that actually inform that decision, and nothing else. The sparklines and stat cards are filler; the decision is the product.
What most dashboards should be instead
For most operating questions, a dashboard is the wrong artefact. Better alternatives:
- A weekly written update. Three paragraphs explaining what moved, why it moved, and what you recommend. More effective than any dashboard because it forces interpretation, which is where value lives.
- An alert. If you only care when metric X crosses threshold Y, you do not need a dashboard — you need an email.
- A one-time analysis. Many "ongoing monitoring" requests are actually one question the person wants answered once. Answer it, and free up the 40 hours it would have taken to build a self-service dashboard.
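To make the alert alternative concrete, here is a minimal sketch of a threshold check in Python. The metric name, values, and threshold are hypothetical; in practice the check would run on a schedule and the message would go to email or Slack rather than stdout.

```python
from typing import Optional

def check_metric(name: str, value: float, threshold: float) -> Optional[str]:
    """Return an alert message if the metric crosses its threshold, else None."""
    if value > threshold:
        return f"ALERT: {name} is {value:,.2f}, above threshold {threshold:,.2f}"
    return None

# Hypothetical example: daily ad spend vs. budget
alert = check_metric("daily_ad_spend", 5200.0, 5000.0)
if alert:
    print(alert)  # in practice: send via email or a chat webhook, not print
```

The whole thing is a dozen lines, runs on a cron schedule, and only demands attention when metric X actually crosses threshold Y — which is exactly the contract the dashboard version never delivers.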
When a dashboard is actually right
Dashboards genuinely make sense for a small set of use cases: real-time operational monitoring (your ad ops team needs to see spend vs budget right now), recurring decisions with stable structure (weekly promotion performance review), and compliance/audit needs (a record-keeping requirement).
Everything else should probably be a written update, an alert, or a conversation.
The best analytics teams we work with build fewer dashboards than their peers and have higher impact because of it. They have reoriented around producing decisions, not producing views. It is a small framing change with outsized consequences.