“An expert is a person who has made all the mistakes that can be made in a very narrow field.” - Niels Bohr  

Listen below to learn the pitfalls of dashboard design from Blue Margin's data engineers. 

The Goal of BI Dashboards 

Over 11 years of business, working with 200+ clients in our narrow field of expertise, we’ve made and observed most of the mistakes companies make when designing dashboards. The ultimate goal of automated dashboards is to create company-wide accountability that empowers employees and powers the value creation plan. Therefore, business intelligence dashboards that fail to drive changes in behavior or impact business outcomes will produce mediocre results.  

“Anyone can watch some YouTube videos and learn how to create a report in Power BI. But creating a report that actually improves outcomes takes a lot more training and experience.”    - Brick Thompson, CEO, Blue Margin

 

How to Build an Effective Power BI Dashboard 

For dashboards that move the needle, avoid: 

  • “Order taking” – The most common approach to report development is to interview leaders on which metrics matter most, then build a report accordingly. Order-taking almost always results in ongoing iterations, what we call “tail-chasing.” Instead, report developers should be more consultative: understand the goal of the dashboard and the users’ natural process, or “narrative,” for making decisions, then build a dashboard that represents that narrative. The result is adoption and impact. 
  • Distracting “bells and whistles” – Regression lines, sparklines, and graphic-design extras can detract from a report’s intuitiveness and hinder user adoption. (Read our white paper on the topic.)
  • Failing to apply data dimensions – Large, flat tables make it difficult to create metrics, connect data, and build reports. The data model should conform to the business model (see the sketch below the list for one way to carve dimensions out of a flat table). 
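
As a minimal sketch of that last point, the DAX calculated tables below carve two dimension tables out of a single flat sales table so the model can take a star-schema shape that mirrors the business. The table and column names (SalesFlat, CustomerID, ProductID, and so on) are hypothetical stand-ins, not a prescription:

    -- Hypothetical: derive dimension tables from one wide, flat table.
    Dim Customer =
    DISTINCT (
        SELECTCOLUMNS (
            SalesFlat,
            "Customer Key", SalesFlat[CustomerID],
            "Customer Name", SalesFlat[CustomerName]
        )
    )

    Dim Product =
    DISTINCT (
        SELECTCOLUMNS (
            SalesFlat,
            "Product Key", SalesFlat[ProductID],
            "Product Name", SalesFlat[ProductName]
        )
    )

Relate each dimension back to the fact table on its key column, and metrics, filters, and reports become much easier to build.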

Listen to our podcast episode, How Not to Build a Dashboard, below for more insight.  



Full Transcript 

Brick Thompson: 0:04 

Welcome to The Dashboard Effect Podcast. I'm Brick Thompson 

Caleb Ochs: 0:07 

And I'm Caleb Ochs. 

Brick Thompson: 0:08 

Hey Caleb. So interesting angle we're going to do today on this one. The title of our podcast today is How Not to Build a Dashboard. And I think the premise here is, anyone can watch some YouTube videos, and learn how to create a report, say in Power BI. But creating a good report that actually improves outcomes takes a lot more training and experience. And so we're going to cover some of the things to watch out for, things not to do. 

Caleb Ochs: 0:40 

So don't do these things. 

Brick Thompson: 0:41 

Exactly. Okay. What's the first thing not to do? 

Caleb Ochs: 0:47 

First thing, this is high level, and this touches a lot of aspects of the report, is don't just take the order, 

Brick Thompson: 0:54 

Right. Yeah. Go ahead. You were going to finish your thought there. 

Caleb Ochs: 0:59 

You don't work at McDonald's. 

Brick Thompson: 1:00 

(Laughs) Okay, so by "don't take an order", this is actually a term we use around our office a lot, that means don't just accept, sort of at face value, superficially, a request from someone for a report. So as an example, a business person comes to you and says, "What I need is a page with a pie chart, and it'll have 20 slices. And then over to the right, it'll have a couple of cards that give me data. And when I click on one of those cards, it'll take me to a whole other report." Now, it's possible that's a good report design, but I tried to make it bad. (laughs) 

Caleb Ochs: 1:41 

Yeah, that's not good. 

Brick Thompson: 1:42 

I'm gonna guess it's not going to meet the goal well. And so rather than take an order, we try very hard, and actually train our people, to focus in on what's the goal of the report? What questions is the business user going to be answering with that report? And what behaviors are they hoping will come out of the use of that report? 

Caleb Ochs: 2:06 

Yeah. I mean, one way or another, you're paying for this report, right? So you want to make sure that it's worthwhile and going to give you some sort of benefit. And if you're just doing a pie chart because it looks cool, or maybe that's what you're used to, that probably won't get you there. 

Brick Thompson: 2:24 

Yep, I'm gonna refrain from giving my toe analogy. All right, number two on my list of how not to build a dashboard is adding lots of bells and whistles to your report or your dashboard. And by this, I mean things like fitting in every extra chart you can, sparklines all over the place, regression lines... If something doesn't really help the report be intuitive and do the two things I mentioned when we were talking about "the order", answering the question intuitively and driving behaviors, leave it off. 

Caleb Ochs: 3:02 

And what we found is that you have to constantly return to that, right? Even after you build the initial version, iterations are going to come at you really fast. So you've got to be able to keep going back to, "Wait, remember, why do we need this? What's the goal here? What's this report for?" And be strict about what makes the cut and what doesn't, because it becomes really easy. And, you know, I think that's actually one of the problems with these "self-service" (in air quotes) BI visualization tools, is that they make it easy to do those types of things. And so then it becomes very tempting to just do it, just to make people happy, right? 

Brick Thompson: 3:46 

Yeah, exactly. That's exactly right. And in fact, I think one of the reasons that people do end up adding lots of bells and whistles is they're really trying to give good value to the user. They're not trying to make it complicated or make it less effective. They're actually trying to make it better. And all of us sort of intuitively think that adding more stuff, more flashing lights and more KPIs is going to make it better. And in fact, it's often the opposite. You often make it better by taking things away. And we do that. We'll build a report and remove things from it right when it's finished because they don't actually add to it but rather clutter it. 

Caleb Ochs: 4:24 

Sometimes less is more. 

Brick Thompson: 4:25 

Right. So don't add the background image. Don't add the extra slicers, extra graphics on the page, unless they help the report to be more effective. 

Caleb Ochs: 4:36 

Exactly. 

Brick Thompson: 4:37 

All right. What's your next item not to do? 

Caleb Ochs: 4:41 

Yeah, don't do this. Again, don't do this. So, using colors purely for aesthetics, really, rather than to provide good information. I've seen this many times: there's a graphic that circulates in the data visualization community, I think it's from Stephen Few or one of those guys, where it's an Excel column chart and each column is a different color. And it says, you know, here's one way to look at it. And then there's another one, the same chart, only every column is gray except one that's red. So it calls your attention to that one, like, this is where you need to pay attention. I think that captures pretty well what I'm trying to say here. Don't just use colors for color's sake. Be very intentional about what you're using, and where. 

Brick Thompson: 5:38 

Yeah, that's a great example, I forgot about that. I've seen that one. In fact, as a general rule, we reserve the colors red and green, and sometimes yellow as well, to be used only to indicate, you know, on target, or ahead of previous year, or something like that, or behind. And don't use them for sort of graphic-design purposes. 

Caleb Ochs: 6:02 

Yeah, you want those colors to indicate something, like "your attention needs to be here." There's plenty of use for other colors on the report, just to color charts and things. But you want to keep those reds and greens for things that you want to call out. 
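
One common way to put that into practice in Power BI is a DAX measure that returns a color code only when something needs attention, applied through a visual's conditional formatting ("Field value" format style). The measure names and thresholds below are hypothetical placeholders, not a prescription:

    -- Hypothetical: red only when revenue is behind target, neutral gray otherwise.
    Revenue Status Color =
    IF (
        [Total Revenue] < [Revenue Target],
        "#C00000",    -- behind target: call attention here
        "#BFBFBF"     -- on or ahead of target: stay quiet
    )

Used as the data color on a column chart or the font color of a card, a measure like this keeps red meaning exactly one thing everywhere it appears.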

Brick Thompson: 6:16 

Yeah, okay. All right. Well, speaking of graphic design, my next thing in how not to build a dashboard is adding lots of graphic-design elements. And I already mentioned background images; they're kind of a pet peeve of mine. It seems like they'll be cool, but they definitely take away from the cleanness and readability. Also, just stay away from things you can do because the tool lets you, and it sort of looks cool, but it just adds visual noise, like 3D effects or non-standard shapes. Yeah, you could probably put your data card in a triangle if you wanted to, but have a reason for that. And in most cases, probably don't do that. 

Caleb Ochs: 6:57 

Yeah. I mean, I've seen so many reports, even stuff that, you know, I've built in the past where I was trying to be cool with it or do something different or, you know, just be creative. And it looks so cool at the time. And now you look at it, you're like, oh my god, what the hell was I thinking? So you really want to, you know, build reports that are going to give your best shot at standing the test of time, right? So you're not going to look at it in three months and be like, this is horrible. Alright, it might be cool when you do it at first, but you're just gonna get sick of it. And so are the people that use it. So keep it clean and simple. 

Brick Thompson: 7:35 

And I do think it's worthwhile to do nice design so that it's pleasing to the eye, and the user feels like "Oh, this is a professional tool that I'm engaging with." Just be careful of the gimmicky stuff. 

Caleb Ochs: 7:47 

Right, right, like a dark background or something. Yeah, you definitely want to make it look nice. That's a good point. We're not saying don't make it look good, but don't go crazy with it because it, it'll look really bad soon, like a pink porcelain toilet. 

Brick Thompson: 8:06 

I love my pink porcelain toilet. What are you talking about? Alright, let's see, what's the next one for you? 

Caleb Ochs: 8:16 

Skipping the date dimension. The date dimension is so incredibly important that it can't be overstated. You need to have that, right? It's not only for aesthetics and being able to do good time-intelligence functions, but it also improves the stability of the report and the data model, and gives everything a way to connect together based on dates. It's just so incredibly important. You have to have your date table in your model. 

Brick Thompson: 8:48 

Yeah. In fact, when Power BI came out years ago, seven years ago, whatever that was... eight years ago, it didn't include a date table, so you had to build one yourself. And in fact, I think we still use our own custom date tables just because we like our own columns. But it's so important that Power BI makes it so you just hit a button and get a date table now. 
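
For anyone building their own, a bare-bones custom date table can be sketched as a DAX calculated table along these lines (the date range and columns are just an illustrative starting point, not our production table):

    -- Minimal custom date dimension; extend the range and columns as needed.
    Dim Date =
    ADDCOLUMNS (
        CALENDAR ( DATE ( 2018, 1, 1 ), DATE ( 2030, 12, 31 ) ),
        "Year", YEAR ( [Date] ),
        "Month Number", MONTH ( [Date] ),
        "Month", FORMAT ( [Date], "mmm yyyy" ),
        "Quarter", "Q" & QUARTER ( [Date] )
    )

Mark it as a date table and relate it to every fact table's date column so time-intelligence functions and cross-table filtering behave predictably.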

Caleb Ochs: 9:10 

Right? I mean, it shows how important it is. 

Brick Thompson: 9:12 

Yeah. All right. Next one for me is back to the report's visual design: skip hidden features. Don't include "bonus" hidden features. This sort of goes along with adding bells and whistles, but don't add drill-throughs that aren't obviously discoverable and don't serve an obvious need. It's kind of cool, and it's tempting to do it all over the place, but make sure it has a really good need. So, an example would be clicking on a column that goes through to a detail report for the data represented in that column. There are a lot of use cases where that makes sense. There are also places where it doesn't add anything and actually clutters things. Another thing might be being careful about how you're using cross-filtering. You can define which visuals on a report affect which other visuals when you cross-filter. Be careful about that, and don't make it mysterious or do something tricky there. Make it easy and obvious and intuitive for the user, always. 

Caleb Ochs: 10:15 

Right? Yeah, there's a lot of times where you might have two visuals on a page that are purposely not supposed to interact with each other. Make that obvious that they're not supposed to be connected. Otherwise, somebody who's used to using Power BI, or another visualization tool that's interactive like that, will just be confused. Like, why is this not working? They'll think it's a bug or something. 

Brick Thompson: 10:38 

Right. And I've used reports, good reports, that actually had good tooltips. But it was sort of a mystery whether a tooltip was there. I found it by mistake on top of a card. And so make it obvious if you're going to do something like that. 

Caleb Ochs: 10:54 

Yeah. Agreed. 

Brick Thompson: 10:55 

All right. Next one for you. 

Caleb Ochs: 10:57 

All right. So in Power BI, there's a term called "implicit measures," which is, if you just drag an integer column, for example, onto a chart, it may automatically sum or average or count that column, or the values in that column. Don't do that. Don't use those. Make sure that your measures are explicitly defined using good DAX, keeping your performance top notch. 
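
As a small illustration of the difference (table and column names are hypothetical): instead of dragging Sales[Revenue] onto a visual and letting Power BI create an implicit "Sum of Revenue," define the measure yourself and build on it.

    -- Explicit base measure, defined once and reused everywhere.
    Total Revenue = SUM ( Sales[Revenue] )

    -- Explicit measures compose cleanly; implicit aggregations don't.
    Revenue YTD =
    CALCULATE ( [Total Revenue], DATESYTD ( 'Dim Date'[Date] ) )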

Brick Thompson: 11:26 

Yeah, I mean, when I see that in more amateur built reports, it often leads to debugging issues. And that's one of the first things that I'll look for if I'm trying to debug something, is look for an implicit measure. 

Caleb Ochs: 11:40 

Yeah. And you know, they're not the end of the world, but they're definitely a telltale sign of someone who doesn't really know what they're doing, right? So just write your measures explicitly. 

Brick Thompson: 11:49 

And I can think of times I've built little reports for myself where I've used them, just to be lazy. And I almost always regret it, because reports end up being something you use every day, and now you're going back and trying to fix all that stuff. So just do it from the start. Explicitly define your measures. 

Caleb Ochs: 12:03 

Yeah, I've done that, too. And every time I do it, I'm like, "Oh, this is working great." And then I'm like, "Oh, wait, I've got to do something different to this, rather than just a straight sum." So then I have to end up writing the measure anyway. And, I don't know, just save yourself some time and do it up front. 

Brick Thompson: 12:18 

Yeah. Well, that kind of leads me to my next one, which is -- how not to build a dashboard -- don't use random names, or just sort of creative names, for tables and columns and measures and KPIs. You want to have a naming convention. Because when you come back to this in a year, or 18 months, or two years, you want it to be really easy to understand what's going on in the data model. And I've made this mistake too, where I'm being lazy. I think I might only use this report once. Slap it together. I don't think about it. I don't follow my normal naming conventions. The report ends up getting picked up, and now a lot of people are using it, and now it's a pain to maintain. 

Caleb Ochs: 12:55 

Yeah, this reminds me of when Power BI first came out: the measure names and the column names that you had in your data model were what was going to show on the report. There was no way to change that. And that caused some really weird stuff. We sometimes had a measure named "period", like just a period symbol, because we wanted it to not really show up in the table and just look like a column, because it was obvious. But now, since you can rename those measures and columns on the chart itself, be descriptive with your names, especially your measures and columns. We have our own naming convention that we follow religiously in terms of reporting. But in your measure naming, make sure that you spell it out. Programmers, and people in technology, are "weird": they like to shorthand stuff, and that can usually come back to bite you. 
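
As a hypothetical before-and-after, here is the same calculation under a lazy shorthand name and under a descriptive, convention-following one:

    -- Hard to decipher 18 months from now:
    rev_m = CALCULATE ( [Total Revenue], DATESMTD ( 'Dim Date'[Date] ) )

    -- Self-explanatory and consistent with the rest of the model:
    Revenue MTD = CALCULATE ( [Total Revenue], DATESMTD ( 'Dim Date'[Date] ) )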

Brick Thompson: 13:48 

Yeah, definitely. All right, my next one -- and you'll have to check me on this -- I advocate don't use a manual data refresh. So when you finish the report, make sure that it's set up to have some kind of scheduled refresh, so that people aren't looking at old data. Oh, and that reminds me of another one I should have put on the list: always have a freshness date on your report. 

Caleb Ochs: 14:11 

Oh, that's a good one. How did that one not make the list? 

Brick Thompson: 14:13 

I don't know how I forgot it. 

Caleb Ochs: 14:15 

Well it did now. 

Brick Thompson: 14:16 

Yeah. So the user can see very obviously, usually near the title of the report, when the data was last updated. And even better, what the most recent data point in the report is, in terms of date and time. 
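
One lightweight way to do both (hypothetical names, and a rough pattern rather than the only approach) is a pair of DAX measures surfaced in cards near the report title:

    -- Most recent data point actually present in the model.
    Data Current Through =
    "Data through " & FORMAT ( MAX ( Sales[OrderDate] ), "mmm d, yyyy" )

    -- Approximate last-refresh timestamp: calculated tables are re-evaluated
    -- when the model refreshes, so NOW() here captures the refresh time.
    Refresh Log = ROW ( "Last Refresh", NOW () )

    Last Refreshed =
    "Refreshed " & FORMAT ( MAX ( 'Refresh Log'[Last Refresh] ), "mmm d, yyyy h:nn AM/PM" )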

Caleb Ochs: 14:30 

Yeah, having those little elements really helps drive adoption and maintain trust in the report. They're very easy to do, and they go a long way. 

Brick Thompson: 14:41 

Exactly. All right. Last one on the list, what is it? 

Caleb Ochs: 14:46 

So this one, this is a meaty one. But it's validation. We also think of QA and UAT, all the things that are kind of the final checks of your report before it goes out the door. Validation, I think, is super important. It can also be very, very challenging: making sure that the numbers you have in your report match what you expect to see. QA, this one's kind of interesting. So I think, for this discussion, we'll just say QA is where, let's say you have a revenue number in a card at the top of the report, and then you have a table that shows revenue by customer. The sum of revenue by customer had better match your top card. If it doesn't, then you've got something wrong. So you've got to go through your report and make sure those things tie out. 
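
One small, hypothetical example of that kind of internal-consistency check in DAX: a measure that isolates revenue whose rows don't match any customer in the dimension table, since unmatched keys are a common reason a by-customer breakdown fails to tie out to the headline card.

    -- Revenue sitting on the blank (unmatched) customer; anything other than
    -- blank or zero here means the by-customer table won't tie to the total card.
    Revenue Missing A Customer =
    CALCULATE ( [Total Revenue], 'Dim Customer'[Customer Name] = BLANK () )

Parking a measure like this on a hidden QA page makes the check repeatable every time the data refreshes.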

Brick Thompson: 15:37 

Yeah. So not just that it matches up with the source data, but that you have internal consistency as well. 

Caleb Ochs: 15:45 

Right, exactly. 

Brick Thompson: 15:46 

Yeah, this one's so important. Again, I've made this mistake: build a quick report, check a few data points, everything looks good, and I'm running with it, then find out that I didn't think about the edge cases enough. Obviously, when we're doing production reporting, we have a very strict and structured approach to QA and validation, so that we check the edge cases and make sure we've hit everything we can think of. But even when you're doing a casual report, take the time to do that. Because you never know how a report is going to get used, and you'd hate to give it to someone and have bad data in it. 

Caleb Ochs: 16:23 

Right, yup. Well said. 

Brick Thompson: 16:25 

All right. Well, we made a list of 10, "Top 10". We can probably come up with more now that I'm thinking about it. But that's our 10 items for How Not to Build a Dashboard. Any closing thoughts? 

Caleb Ochs: 16:40 

Don't do those. (laughing) 

Brick Thompson: 16:43 

Thanks, Caleb. Talk to you soon. 


Written by Jon Thompson and Suzanne Rains

Jon Thompson is co-founder and Chief Strategy Officer at Blue Margin Inc. An author and speaker, Jon sheds light on how businesses can take advantage of a revolution in business intelligence to become data-driven and accelerate their success. Suzanne Rains is a communications specialist at Blue Margin Inc. With an MA in Human Resources and BAs in Marketing and Management, Suzanne unites an understanding of human nature and a keen interest in industry research to author thought leadership articles for today’s business leaders.