Over the years, I’ve written and spoken a lot about reporting, data analytics and business intelligence, but this time, I want to pull together some of the most common reasons I see reporting projects fail before they even make it to the screen. These insights come directly from my personal experience, both past and recent, across organisations of all sizes.
So, this is not about technical issues or challenges. It’s about the things we often overlook at the very start, the things that, if left unchecked from the beginning, mean succeeding with the solution was always going to be an uphill battle.
No Clear Purpose or Business Question
I find it frustrating how often I see this happen. In so many of the BI and analytics projects I get involved in, there’s a clear pattern: we skip straight to the data, the visuals or the tech. We take the high-level request and run with it. We don’t pause to dig deeper into what the business actually needs or what we’re actually trying to achieve.
And look, I’m not here to kill the momentum. That energy is a good thing. But we’ve got to focus that drive. We need to stop for a moment and ask: what’s the actual purpose of this report? Where are we trying to go, and why?
Because let’s be honest, we are about to invest serious time here. This will take multiple days, across multiple people. We might have to pause other workstreams, shift resources, commit budget. So before we get too far ahead, let’s ask the basic but critical question:
Does this BI solution help us increase revenue? Reduce cost? Mitigate risk? Stay compliant? Work more efficiently? Something!? Anything?!
I’m not saying we need to change the world, but even a simple report still needs a clear purpose. Who’s using it? What’s it helping them achieve?
If we can’t tie it back to a real business outcome, we risk building something that looks great… but delivers very little and ends up not being used. Hence, failing before we begin. Many past projects come to mind (one Loyalty Solution in particular) where we rushed straight into development without ever stopping to ask why we were doing it, and sure enough, once the solution was delivered, the questions started coming in... Why isn’t it being used? What was the point?
When I build a new reporting solution, I always start with the business goal we are trying to achieve or the business challenge we are trying to overcome. Not the KPIs, not the data and certainly not the tech or visuals. Life is a lot easier when you have a clear purpose, because it helps you focus on what actually matters rather than on everything. A good example is key performance indicators (KPIs). If you try to gather them before understanding the goal or challenge, you’ll get flooded with everything. They should evolve from the purpose; that way you’ll be more focused and more selective.
I once had someone say to me “But what if this process makes us realise we don’t even need the report? Doesn’t that mean less work for you?” It did make me laugh, but here’s the truth... if that’s the outcome, we’ve already delivered business value. No report is better than a pointless one. We save people’s time, avoid unnecessary spend and focus our energy where it actually matters.
To add to that though, every organisation needs data. But what they often don’t have is the right conversation early on. The approach I follow usually sparks discussions that hadn’t happened before. We might realise the original report isn’t needed, but we take a step back, rethink where we're trying to go and come back with a sharper focus and a much stronger purpose.
BI Projects Fail Because Technology Gets All the Attention
Another major reason BI reports fail before they’re even built is the amount of focus on the tech.
By the way, I do get it... we all love new tools, features, AI capabilities, performance tricks, and talking about the latest shiny thing in the Microsoft ecosystem. And yes, these things matter, they really do. But in so many projects I’ve seen, the technology becomes the star of the show. We obsess over pipelines, models and data visualisation, and forget that none of it matters if the report doesn’t actually help people make better business decisions. Isn’t that what business intelligence is for? To improve the decision-making process? To be data-driven?
What’s the point of investing the big £££ in a data platform, only to produce reports that don’t deliver at the end? Where is the ROI?!
The reality is, tech is the enabler, not the goal. You can automate manual processes, centralise data, write incredible DAX… but if we don’t understand how any of that impacts the business, how it helps someone do something better, haven’t we missed the point?
I previously blogged about this, but I think it’s super relevant here. I remember a conversation with a group coming from different industries (retail and manufacturing). We were talking about the benefits of implementing a data solution. Some answers? “Automating manual reports” and “consolidating all data”. Good improvements, no doubt. But those are technical improvements, not business benefits. They certainly improve efficiency, but the real question is “what impact does that efficiency have on the business?”. So, my mind goes to questions like these:
- In retail, can we increase profit by reducing stockouts or optimising inventory across stores?
- In manufacturing, are we cutting downtime and preventing equipment failures with better insights?
- In healthcare, can we improve patient outcomes by better allocating resources and avoiding unnecessary treatments?
- In finance, are we actively reducing fraud risk or avoiding costly compliance penalties?
Even though this blog is relevant to any BI tool, I will of course stick to a Power BI example:
- The Tech-First Approach: A retailer asks for a sales performance dashboard. The BI team asks “what do you want” and “how do you want to see it”. From this, they build a report showing total revenue, profit margins and product performance. But when the business users see it, they ask: “How does this help us hit our business objectives?”
- The Business-First Approach: A retailer asks for a sales performance dashboard. Instead of starting with Power BI, we have a structured conversation with the audience, that is, the actual end-users of the solution. We ask tailored questions to derive answers to “where are you trying to go”, “why are you trying to go there”, “how do you know if you’re moving towards it”, “if you aren’t, what might be causing it”, “what actions can you take to fix it”, “what’s the business process to do this”. Notice… no tech talk! So, we discover their real drivers. The result? A solution that not only shows figures but also aligns to the business’ core objectives and challenges, therefore driving real action.
The Wrong Audience In, the Wrong Report Out
Here’s another reason BI reports fail before they’re even built, and it’s a big one. We don’t involve the right people and we try to serve too many at once. Let’s break it down.
I’ve written about this before (many times), but it deserves its place in this blog. It’s one of the biggest and most consistent reasons reporting projects fall apart in my opinion. How can we expect to build a solution that actually delivers value if we don’t involve the people who are supposed to use it?
Think about it, if I told you I was building a product, but not involving any of the people who’d actually use it… you’d think I was mad. Yet for some reason, when it comes to reporting and dashboards, this is treated like a normal way of working. It’s not.
Instead, we get told things like:
- "We know what the end users need"
- "They’re too junior to involve"
- "They’re too new"
- "They are tied down to other business priorities"
- "We cant align calendars"
But here’s the problem: the people who claim they can give requirements on behalf of the actual audience often don’t understand the day-to-day. They might have a high-level view, sure. But they’re not the ones dealing with the real challenges or trying to hit specific targets. They don’t fully know what the users need to see or do.
But this isn’t just about involving them to collect requirements. Involving your audience builds TRUST, and trust is everything when it comes to adoption. I can’t tell you how many conversations I’ve had where someone says their users don’t trust the data, the tool or the team behind it… and every time I dig deeper, I find that more often than not the users were never brought into the process.
And that’s just one side of the coin. The other issue? Trying to serve too many audiences in a single solution. Now, to be clear, I am talking about reports and dashboards.
I’ve been doing this long enough to spot such reports instantly, the ones trying to be everything to everyone. You’ve seen them too, cluttered layouts, no white space, multiple scroll bars, conflicting KPIs. It’s a mess and it happens because we try to capture requirements from different audiences that do not have a common purpose.
If Sales wants store-level performance and Inventory wants to minimise stockouts, those are two separate conversations. Trying to force them into the same room to gather shared requirements, well, you are setting yourself up for failure. I’m not saying you can’t have the same KPIs on a single canvas, but that’s an entirely different blog. What will happen? One team usually dominates, the other leaves frustrated and you (and the report) end up taking the blame.
So, what do we do instead? Identify the true audience of the solution right from the start. If there are multiple audiences, don't just group them all together for gathering requirements. Facilitating one workshop with a group of people with a common purpose is hard enough. Speak to the end-users, keep them involved, build trust.
Asking the Wrong Questions = Building the Wrong Report
Okay, so we’ve got the right people in the room. Now what? The next mistake? We ask the wrong things. Even if the audience is involved, the wrong line of questioning means you're still heading toward a report that misses the mark before development even begins.
Too often, we fall back on surface-level questions like:
- What do you want to see?
- What charts would you like?
- Have you got an existing report?
I’m not saying the above questions always fail, but from running many data storytelling and requirements-gathering workshops, I’ve found they usually lead to conversations around tech, features, visualisation, branding, etc. instead of a solution to a problem.
We need to go deeper:
- What decisions are you struggling with right now?
- What does success look like for you?
- What actions can you take based on this information?
- How do you know if you’re moving towards your goal?
- If a KPI is in the red (or the green), what is the likely cause?
These questions shift the conversation. Instead of asking what they want, we explore why they need it. And when you get that part right, the rest of the design becomes so much easier, because you’re not just creating a report, you’re offering real value. We want to create solutions that offer actionable insights.
On the topic of asking questions, I must emphasise the importance of feeling comfortable asking WHY! I promise, if you are working with a good bunch of people, they will appreciate you asking these questions. If an organisation is looking for real value, for solutions that make a difference, they don’t want someone to just come in and accept everything. So, feel comfortable asking why and challenging the initial request. Also, don’t hesitate to ask users to repeat something if you didn’t understand it.
Drive Adoption - Not Thinking About the After
We spend weeks, if not months, working on the perfect solution and then what? We publish the report, tick the box and move on. But what about the people who actually have to use it? What support do they get? What guidance do they have? How do we know it’s even being used in the right way?
In reality, this is the time to get very much involved. Put yourself in your end-users’ shoes and think of it from their perspective. In fact, you should be doing this right from the beginning, as it will prompt you to build helpful experiences into the solution. For example, home pages with helpful information, a data dictionary, on-screen experiences (what I like to call guided learning) and Power BI Template files. All of these may seem small at first, but they offer the extra guidance and confidence end users need.
Beyond that, we should be thinking about adoption and familiarisation right from the start. Here are some additional items I would recommend. Please note, this is not a full list, just the ones I feel are most relevant.
Don’t pull the plug on the old solution
When you are rolling out a new solution, some resistance is completely normal, especially from users who’ve been doing things the same way for one, three or five-plus years. Who wouldn’t feel uneasy? That’s why it’s so important to talk to your end users early and often. Reassure them. Listen to them. The earlier you build that trust, the easier the transition will be.
I remember supporting a utilities company in the UK a few years back. There was huge pushback against Power BI and everyone assumed it was just down to the users being “too tied to their old platform.” But when I sat down with them and actually listened, the real issue surfaced... they believed Power BI couldn’t do what their existing tool could. But it could, the capabilities were there, the conversation just hadn’t been had.
Make go-live exciting (and definitely don't forget it)
Once the solution’s ready, don’t just quietly publish it and move on... make go-live an actual moment. Bring people together and show the value. I always tell my clients, this is the time to BUILD EXCITEMENT!
But to be clear, go-live isn’t the end, it’s just the start. Adoption is a journey and learning doesn’t stop the moment a report is published. Use the go-live session to walk users through the report, of course, but also through what support looks like after launch. Things like drop-in clinics, your internal Power BI community and portal, where to ask questions and the helpful features we built into the solution, like guided learning (on-screen tutorials), data dictionaries, template files and more. These sorts of sessions genuinely make people feel comfortable reaching out to ask questions and provide feedback.
Offer end users the training they need
Another reason so many Power BI reports fail before they’re even built? We plan for delivery, but not for upskilling the users. It’s shocking how often training is treated as optional, a “nice to have” at the end of the project. The reality is that even the most well-designed solution will stall if end users don’t know how to use it or don’t feel confident engaging with it. And expecting people to “figure it out” with free resources while juggling meetings, deadlines and their day jobs is unrealistic. The organisation must carve out the necessary time for the end users... I can’t stress this enough. Training isn’t just about upskilling, it’s about building trust, driving usage and making sure the solutions you spent weeks creating don’t end up being ignored - it plays a real role in successful BI. We must empower the end users.
Set up a Business Intelligence Community Portal
We don’t create the support ecosystem users need to thrive. This somewhat aligns to the points above, but a well-structured Community Portal is a powerful tool you can build to support the adoption of any BI initiative long-term. It becomes the go-to space for users to find answers, share knowledge and stay engaged, all without relying on the core reporting team for every little question. It helps fill the support gap. I do this regularly for Power BI: creating a dedicated Power BI community portal. It’s not just a dumping ground for files, it’s a place that creates a sense of belonging. When you provide access to FAQs, starter templates, data access guidelines, video walkthroughs, recordings of internal events and clinic schedules, you make it easier for users to navigate Power BI confidently and you promote a culture where people use the platform in the right way.
Conclusion
If there’s one thing I want you to take away from this blog, it’s this... many BI projects fail long before they ever reach development, not because of broken data or poor BI tools, but because the foundations aren’t there to begin with. We rush in without a clear purpose. We rely on assumptions instead of engaging the real audience. We ask shallow questions that lead to surface-level answers and, too often, we forget that adoption doesn’t happen automatically. It needs to be planned just like the report itself or any other data initiative.
It’s easy to get swept up in tools, features and dashboards. But if we don’t pause to focus on the fundamentals, the business goals, the people using the solution and the decisions we’re trying to support, then the best we’ll deliver is just another report that looks nice… and gets ignored. Try some of the above and I’m confident it will take you further down the path of a successful BI project.