Ad Hoc Reporting

In the Business Intelligence (BI) world, there’s a great deal of focus on long-term report development and relatively little on ad hoc reporting as a distinct development process. Now that tools such as Power BI, PowerPivot, and tabular models allow for significantly improved self-service BI experiences, some of the need traditionally served by ad hoc reporting has faded. However, ad hoc analysis is still very much needed for situations such as exploring new data. Luckily, most of the principles of report development apply equally well to ad hoc work, but there are a few differences that I find helpful to keep in mind when approaching ad hoc problems.

 

Managing Scope Creep and Streamlining Your Efforts

We’ve all encountered scope creep in report development, and with ad hoc reporting it can be an even larger concern. Since ad hoc reports generally see limited use, the effort to add extra “nice to have” metrics is often difficult to justify; if the data is noncritical, four hours of development time can be an unwise investment for a single “nice to have”.

Another point where ad hoc work requires a different mindset is repeatability. Because ad hoc reporting is inherently exploratory, repeatability is often unnecessary; in fact, if making an ad hoc report repeatable involves a significant time investment, it often makes sense to skip that effort, at least initially. With PowerPivot, Power BI, Excel, and other tools offering extensive, easy-to-use data connection options, making a report repeatable frequently adds little or no time. However, if the data needs significant transformation, it can be more sensible simply to handle those changes manually, as in the sketch below, rather than building an ETL package or scheduled job to prep the data or developing a full PowerPivot model.
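To make that concrete, here’s a minimal sketch of what a manual, throwaway transformation might look like in Python with pandas. The file and column names are hypothetical placeholders rather than a prescribed approach; the point is simply that a few lines of exploratory code can stand in for an ETL package when the question is one-off.

```python
# One-off, throwaway transformation for an ad hoc request.
# File and column names below are hypothetical placeholders.
import pandas as pd

# Pull the raw extract once; no scheduled job, no staging tables.
orders = pd.read_csv("orders_extract.csv", parse_dates=["OrderDate"])

# Apply the quick cleanup steps by hand instead of building an ETL package.
orders["Region"] = orders["Region"].str.strip().str.upper()

# Summarize sales by month and region for the requester.
monthly = (
    orders
    .assign(Month=orders["OrderDate"].dt.to_period("M").astype(str))
    .groupby(["Month", "Region"], as_index=False)["SalesAmount"]
    .sum()
)

# Hand the result straight back in Excel.
monthly.to_excel("adhoc_monthly_sales.xlsx", index=False)
```

If the request later turns into a recurring need, this logic can be promoted into a proper ETL process or PowerPivot model; until then, the manual pass is usually the better time investment.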

 

Evaluating Effort and Making Relational Connections

Next, keep in mind that ad hoc work often focuses on new, unexplored questions. As a result, it can be harder to predict the work effort involved, and validation can become more difficult. The data may be inconveniently arranged or unfamiliar, or non-intuitive logic may need to be applied. For example, a recent data source had a field [ESDelFlag], in which entries with a “1” were “deleted” records that had to be excluded from any analysis. Also, there’s often no existing reporting for these measures, so manually checking a sample of independent records against the source system may be the only way to validate (see the sketch below). Be sure to allow for this additional development time when providing work estimates.
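As a rough illustration of that kind of non-intuitive rule, here’s a minimal sketch, again in Python with pandas, of applying the [ESDelFlag] exclusion and pulling a small sample to check by hand against the source system. Everything apart from the flag itself is a hypothetical placeholder.

```python
# Minimal sketch: exclude "deleted" records flagged by ESDelFlag = 1,
# then pull a small sample to verify by hand against the source system.
# File and column names other than ESDelFlag are hypothetical.
import pandas as pd

raw = pd.read_csv("source_extract.csv")

# Non-intuitive business rule: ESDelFlag = 1 marks a "deleted" record
# that must be excluded from any analysis.
active = raw[raw["ESDelFlag"] != 1]
print(f"Excluded {len(raw) - len(active)} deleted records of {len(raw)} total")

# With no existing reporting to reconcile against, spot-check a handful
# of records manually in the source system.
sample = active.sample(n=10, random_state=42)
print(sample)
```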

Finally, if you do expose something interesting, there will often be follow-up requests for additional exploration. As you work through the initial request, keep your eyes open for connections to other data sets: take notes about relationships that might come into play, metrics that could shed light on the trends your report will show, and so on. The data is presumably new and unexplored in one way or another, so you’ll already be looking closely, and taking notes now is a comparatively small time investment. If you wait until the follow-ups start coming in before asking these questions, you may need to repeat that exploration.

 

Validating Results and Ensuring Quality

There are several points in ad hoc reporting where it’s tempting to act differently because of the expectation of a quick turnaround, but it’s important to resist that temptation. Ad hoc reporting frequently runs on an abbreviated schedule, which drives many of the concerns above, and with limited time it’s tempting to cut back on validation. However, limiting validation is probably one of the most common ad hoc reporting mistakes, and it can lead to mistrust of your work. Yes, you must be expedient, but don’t allow that to compromise your work quality. Analysts often assume that ad hoc work comes with lowered expectations, but in my experience, report consumers don’t necessarily share that assumption.
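Even on a tight schedule, a few cheap sanity checks usually fit. Below is a rough sketch of the kind of lightweight validation pass I have in mind, continuing the earlier hypothetical example; the file name, grain columns, and control total are all placeholders, with the control total assumed to have been confirmed by hand against the source system.

```python
# Lightweight sanity checks before sending an ad hoc result out the door.
# File, column names, and control total are hypothetical placeholders.
import pandas as pd

result = pd.read_excel("adhoc_monthly_sales.xlsx")

# 1. The result isn't empty (catches a bad filter or broken join).
assert len(result) > 0, "Result came back empty"

# 2. No duplicate grain: one row per Month/Region combination.
assert not result.duplicated(subset=["Month", "Region"]).any(), "Duplicate grain rows"

# 3. The grand total roughly matches a figure checked by hand in the source system.
control_total = 1_234_567.89  # hypothetical value confirmed manually against the source
assert abs(result["SalesAmount"].sum() - control_total) < 1.0, "Totals do not reconcile"
```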

Finally, I spoke earlier about limiting the time spent making reports repeatable, but as a counterpoint, ad hoc reporting often does lead to full report development efforts. With that in mind, while you shouldn’t focus on making ad hoc reports themselves repeatable, you should be thinking about what that transition will require later and, where possible, make sure your methods don’t complicate the process.

From a distance, ad hoc reporting is not fundamentally different from longer-term report development, but there are some minor differences that can lead to major issues. If you can embrace the necessary mindset, these pitfalls are easily avoided.
