How to Build Data That's Useful

Analytics and Stroller Pushing

One of the best analytical lessons I ever learned was nowhere near my computer. My wife and I were gearing up to have our first child, and we were shopping for a baby stroller. If you have done this, you know the choices are paralyzing. There are at least 20 options, each rated on multiple qualities. After hours of debating what should have been a painless choice, we stopped ourselves and asked, “What is the most important feature here?” After thinking about it, my wife said, “I want to be able to reach down with one hand (because the other will be holding the baby) and pick it up so it collapses, then toss it in the back of the RAV4 in one motion.” Suddenly, 20 options went down to 2 or 3, and we made a decision a minute after that.


Good data insight development follows this approach. It is not an attempt to build the Encyclopedia Britannica; it’s an agreement on what piece of currently unavailable information would make the most difference to the people who actually run the business. Here is a fun little video of me talking about this.

Back in 2011 I took a leap of faith. I left the stability of PepsiCo to lead an analytics group at a much smaller energy company. At that time, I was introduced to a new piece of software called Tableau. It seemed pretty cool, and it was easy to learn if you were a strong Excel user. So off I went with my team to build reports from the database of company information we had put together.

One of the first and certainly most notorious reports we developed was for a “very eager” and attention-challenged marketing manager. The good news is that he loved data and believed in not making decisions without it. The bad news is that there was no end to the data that he felt he needed to look at.

My team went on to develop the report exactly the way he wanted it, with all the different possible views and filters he could think of. With this one report, he would be able to see everything and answer every question his directors could pose.

This is an example of what it looked like. My team gave it a name: “Filters Gone Wild.” No one else in the company could stand to use this report for more than two minutes without needing a glass of scotch.


So why do people do this? Isn’t it a noble intention, after all, to want to see more data? The reason is that complexity creates its own burden. As it turns out, consuming data is a lot like purchasing jam: more isn’t always better. Not only is there a point of diminishing returns in how satisfied we are, but our ability to act is reduced significantly as well.

That was a really interesting role for me, and I’m glad I took it. Not only did I learn a lot of new, useful skills, but more importantly I got to see the gamut of “clients” and how they wanted data. The better ones understood this concept of simplification.

Around the same time, MIT released an article that put some science to what I was learning. They surveyed a few thousand people at multiple companies and determined that top performers were five times more likely to use analytics than lower performers. No surprise there, but what was more interesting was how the top companies approached data. It wasn’t about budgets or sophistication of software; the lower performers cited development process and managerial issues as a major contributor to blocking progress. What? People are getting in the way?!

A recent client experience motivated me to write this blog. The team had purchased all the software it needed to bang out good reporting. They had a small army of internal folks and contractors who could wrangle and structure the data as well as anyone. But when the six-month check-in on a nine-month project came, they discovered that only rudimentary reporting had been developed, and that the internal clients were disappointed to the point of considering pulling the funding for the expensive software they had purchased.

Why? Because the IT developers who were in charge of it had treated it as a requirements fulfillment exercise.

One of the key points of the MIT article was a concept they called “start in the middle.” In their findings, they saw a trend among effective teams: they would simplify the issue to discover the most relevant information, the piece that would move the needle the most, and then iterate on it until they had honed it into a useful state.

It’s a conversation between business people that happens to use technology as a tool to bring it to life. There are no requirements to gather, because what’s needed is never fully known until discovery begins. It’s not a conversation with executives; it’s with the frontline managers and directors who make the business happen. Once they start becoming successful, peers take notice, and the path to a data-driven culture grows organically.