
Why Ad Hoc Reporting Won’t Work (part 2): We Need to Massage the Data First

Back in March I posted “Five Reasons Why Ad Hoc Reporting Won’t Work In Some Organizations.”  In that post, I listed some of the major concerns that I hear from potential clients when the subject of ad hoc reporting is broached.  As promised, I will be addressing each of those concerns and showing how they can be overcome.  We’ve already addressed “Our people aren’t smart enough to create their own reports.”  Now let’s turn our attention to data, and look at “We need to massage the data first.”

Why Reinvent the Wheel Each Month?

A few years ago we worked with a client who was brand new to the world of Business Intelligence.  They had a very small IT staff, and as such their IT Director was the one responsible for reporting information back to the Executive Team.

I asked the IT Director to take me through his reporting process so I could understand the workflow a bit better.  He started by opening a few Excel spreadsheets, showed me how he copies and pastes data from their various mainframe systems into Excel, and then how he massages the data.  All told, this process took him about 14 hours each week — just to get this information to the rest of the Executive Team.

Absent any real data integration platform, this process seemed to be a good alternative for where they were as a company at the time.  But as he dashed through Excel renaming dimensions here, altering data there…it was clear that the manipulation aspect was consuming most of his time.  I asked him why they didn’t fix the data at the source system, and he commented that it was just easier to fix the data here each month rather than making changes to the source system, retraining the people entering the data, etc…  Now that might be alright for a one-time analysis, but why spend countless hours each month changing the same data in the same way?

Touch and Cleanse the Data Once

If this story sounds familiar, it should.  Virtually every client we meet has some degree of data quality issues.  Without a plan and routine maintenance, companies may find themselves having to deal with bad data on a continual basis, which can derail the success of any business initiative.  (I won’t even begin to go into the compliance issues that can arise if data is constantly being manipulated in Excel.)

Depending upon the operational system, you may want to implement transactional data quality and controls, where the data is validated as it’s being entered into the system.  For other systems, it may make more sense to scrub the data in batch as part of your nightly processing.  Or, more likely, you’ll implement a hybrid of both approaches.
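To make the batch-scrub idea concrete, here is a minimal Python sketch.  The field names, alias table, and validation rules are all hypothetical examples, not anything from a specific client system — the point is simply that the same standardization and validation the IT Director was doing by hand in Excel can run automatically every night, separating clean rows from rejects that need review.

```python
# Hypothetical nightly batch-scrub step: standardize dimension values
# and validate measures once, so downstream reports never need manual
# fixes in Excel.  Field names and rules are illustrative assumptions.

REGION_ALIASES = {"N.E.": "Northeast", "ne": "Northeast"}

def cleanse_record(record):
    """Standardize one raw record; return (clean_record, errors)."""
    errors = []
    clean = dict(record)

    # Rule 1: standardize a dimension value (e.g. region names).
    region = str(clean.get("region", "")).strip()
    clean["region"] = REGION_ALIASES.get(region, region.title())

    # Rule 2: amounts must be non-negative numbers.
    try:
        amount = float(clean.get("amount", ""))
        if amount < 0:
            errors.append("negative amount")
        clean["amount"] = amount
    except ValueError:
        errors.append("non-numeric amount")

    return clean, errors

def batch_scrub(records):
    """Split a batch into cleansed rows and rejects needing review."""
    good, rejects = [], []
    for record in records:
        clean, errors = cleanse_record(record)
        if errors:
            rejects.append((clean, errors))
        else:
            good.append(clean)
    return good, rejects
```

Once a routine like this runs as part of nightly processing, the rejects file gives you a short, actionable list of genuine data problems — and, over time, a case for fixing them at the source system rather than downstream.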

Whatever approach is taken, the key is to implement a process that allows you to touch and cleanse that data once.  Then it can be confidently made available to all your downstream systems without having to involve Excel in the process.  By ensuring that your organization has a solid data quality foundation, you can be confident that all your efforts are based on accurate and complete data.

Question:  What data quality challenges do you have in your organization?  What have you done to address them, and what was the outcome?

Free Web Intelligence Best Practices Guide

We put together a Best Practices Guide for Web Intelligence Development that includes over 20 pages of tips and techniques for developing business-ready reports. Plus we'll show you the common functions that have a negative impact on performance, how to create interactive reports, the best ways to standardize look-and-feel, and much more!





  1. What’s the boundary for ad hoc reporting?

    My client needs ad hoc reporting, and we are asked to create a Universe that supports it.  The underlying data model is transactional.  The client is in the healthcare business.

    The client has no idea what kind of reports they’re going to generate, so we are being asked to provide a solution for a boundary-less problem statement.  I’m confused about what to do.