Blog: William McKnight

William McKnight

Hello and welcome to my blog!

I will periodically be sharing my thoughts and observations on information management here in the blog. I am passionate about the effective creation, management and distribution of information for the benefit of company goals, and I'm thrilled to be a part of my clients' growth plans and to connect what the industry provides to those goals. I have played many roles, but the perspective I come from is benefit to the end client. I hope these entries can be of some modest benefit toward that goal. Please share your thoughts and input on the topics.

About the author

William is the president of McKnight Consulting Group, a firm focused on delivering business value and solving business challenges utilizing proven, streamlined approaches in data warehousing, master data management and business intelligence, all with a focus on data quality and scalable architectures. William functions as strategist, information architect and program manager for complex, high-volume, full life-cycle implementations worldwide. William is a Southwest Entrepreneur of the Year finalist, a frequent best-practices judge, has authored hundreds of articles and white papers, and given hundreds of international keynotes and public seminars. His team's implementations from both IT and consultant positions have won Best Practices awards. He is a former IT Vice President of a Fortune company, a former software engineer, and holds an MBA. William is author of the book 90 Days to Success in Consulting. Contact William at wmcknight@mcknightcg.com.

Editor's Note: More articles and resources are available in William's BeyeNETWORK Expert Channel. Be sure to visit today!

In business intelligence, we all know and espouse the fact that data integration is the most time-consuming part of the build process. This is undeniably true. However, looking at the long term (I am not a full-time analyst, but I have observed the implementations I've been part of through their full lifecycles over the past few years), I believe most long-term costs clearly fall in the data access layer. This is where the reports, dashboards, alerts, etc. are built.


This is true for a variety of reasons, not the least of which is a short-cutting of the data modeling process, which, when done well, minimizes the gap between design and usage. This aspect of BI is receiving only modest recognition. The focus instead is on a new breed of disruptive data access tools that are architecturally doing end runs around the legacy tools in how they use memory and advanced visualization. Specifically, these tools are Tableau, QlikTech, and Spotfire. They attack a very important component of the long-term cost of BI - the cost of IT having to continue to do everything post-production.


There are a few areas where these tools are getting recognition:


  1. They perform faster - this allows a user, in the 30 minutes he has to do an analysis, to get to a deeper level of root cause analysis
  2. They are seen as more intuitive - this empowers end users so they can do more themselves, versus getting IT involved, which stalls a thought stream and introduces delay that can obliterate the relevancy
  3. They visualize data differently - I won't expound on it here, and I don't think it's necessarily due to the tool architecture, but many claim it's better

So why do I bring this up in opposition to outsourced business intelligence? Because to truly set up business intelligence to work in a self-service capacity, you would put heavy weight on working closely with users in the build process, which is a lever that gets deemphasized in outsourced BI. You would see business intelligence less as a technical exercise and more as an empowerment exercise. You would keep the build closer to home, where the support would be. And you would not gear up an offshore group to handle a laborious process of maintaining the data layer over the years in the way users desire. You would invest in users - culture, education, information use - instead of outsourced groups. And this is just what many are doing now.
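
To make that division of labor concrete, here is a minimal sketch: IT models, loads and maintains the data layer, and the ad hoc analysis on top belongs to the user. The schema and names are hypothetical, and Python's built-in sqlite3 module stands in for whatever relational data layer and self-service tool an organization actually runs.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # IT-owned portion: base tables the ETL process would load, plus the
    # modeled view that IT builds with the business users and maintains.
    cur.executescript("""
    CREATE TABLE sales   (sale_date TEXT, product_id INTEGER, amount REAL);
    CREATE TABLE product (product_id INTEGER, product_name TEXT, category TEXT);

    INSERT INTO sales VALUES
      ('2011-08-01', 1, 120.0),
      ('2011-08-01', 2,  75.5),
      ('2011-08-02', 1,  60.0);
    INSERT INTO product VALUES
      (1, 'Widget', 'Hardware'),
      (2, 'Gadget', 'Software');

    CREATE VIEW sales_by_category AS
    SELECT p.category, s.sale_date, SUM(s.amount) AS total_amount
    FROM sales s
    JOIN product p ON p.product_id = s.product_id
    GROUP BY p.category, s.sale_date;
    """)

    # User-owned portion: ad hoc, self-service analysis against the view.
    # A desktop tool pointed at the same view plays this role in practice;
    # no IT request is needed for each new question.
    for category, sale_date, total in cur.execute(
            "SELECT category, sale_date, total_amount "
            "FROM sales_by_category ORDER BY sale_date, category"):
        print(category, sale_date, total)

    conn.close()

The point is not the particular tool; it is that the modeled layer stays in-house and close to its users, while the questions asked of it can change freely without another development cycle.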


Posted August 14, 2011 10:52 AM

2 Comments

What about using these tools in conjunction with an internally built data platform? We model out the data layer in conjunction with the business users, build out and support the ETL solution such that the data and data management are maintained internally, and then just let these new tools access the data layer built and maintained by the IT department.

Have you seen this work effectively?

Good idea, Doug. I was addressing outsourcing as in moving the development wholly to an outside group, like a consultancy, instead of to in-company IT. I agree the ETL/acquisition component would need specialized skills, but I like the user empowerment for data access, and sometimes it seems to take different tools to facilitate that.


    
