Blog: William McKnight


Hello and welcome to my blog!

I will periodically share my thoughts and observations on information management here in the blog. I am passionate about the effective creation, management and distribution of information in service of company goals, and I'm thrilled to be a part of my clients' growth plans and to connect what the industry provides to those goals. I have played many roles, but the perspective I come from is the benefit to the end client. I hope these entries can be of some modest benefit toward that goal. Please share your thoughts and input on the topics.

About the author

William is the president of McKnight Consulting Group, a firm focused on delivering business value and solving business challenges utilizing proven, streamlined approaches in data warehousing, master data management and business intelligence, all with a focus on data quality and scalable architectures. William functions as strategist, information architect and program manager for complex, high-volume, full life-cycle implementations worldwide. William is a Southwest Entrepreneur of the Year finalist and a frequent best-practices judge; he has authored hundreds of articles and white papers and given hundreds of international keynotes and public seminars. His team's implementations, from both IT and consultant positions, have won Best Practices awards. He is a former IT Vice President of a Fortune company, a former software engineer, and holds an MBA. William is the author of the book 90 Days to Success in Consulting. Contact William at wmcknight@mcknightcg.com.

Editor's Note: More articles and resources are available in William's BeyeNETWORK Expert Channel. Be sure to visit today!

August 2011 Archives

In business intelligence, we all know and espouse the fact that data integration is the most time-consuming part of the build process. This is undeniably true. However, looking at the long term (I am not a full-time analyst, but I have observed the implementations I've been part of through their full lifecycles over the past few years), I believe most long-term costs clearly fall into the data access layer. This is where the reports, dashboards, alerts, etc. are built.


This is true for a variety of reasons, not the least of which is a short-cutting of the data modeling process, which, when done well, minimizes the gap between design and usage. This aspect of BI is receiving only modest recognition. The focus instead is on a new breed of disruptive data access tools that do architectural end-runs around the legacy tools in how they use memory and advanced visualization. Specifically, these tools are Tableau, QlikTech and Spotfire. They attack a very important component of the long-term cost of BI - the cost of IT having to continue to do everything post-production.


There are a few areas where these tools are getting recognition:


  1. They perform faster - this allows a user, in the 30 minutes he has to do an analysis, to get to a deeper level of root-cause analysis
  2. They are seen as more intuitive - this empowers end users to do more themselves, versus getting IT involved, which stalls a thought stream and introduces delay that can obliterate the relevancy
  3. They visualize data differently - I won't expound on it here, and I don't think it's necessarily due to the tool architecture, but many claim it's better

So why do I bring these tools up in opposition to outsourced business intelligence? Because to truly set up business intelligence to work in a self-service capacity, you would overweight working closely with users in the build process, which is a lever that gets deemphasized in outsourced BI. You would see business intelligence less as a technical exercise and more as an empowerment exercise. You would keep the build closer to home, where the support would be. And you would not gear up an offshore group to handle a laborious process of maintaining the data layer over the years in the way users desire. You would invest in users - culture, education, information use - instead of outsourced groups. And this is just what many are doing now.


Posted August 14, 2011 10:52 AM

I was at Teradata Influencer's Days this week, an annual three-day, invitation-only event where Teradata catches us up on the latest offerings and company strategy. We were in Las Vegas this year, and we had a fascinating visit to the Switch data center where eBay houses its Teradata EDW, Hadoop clusters and another large system, and where thousands of jobs run daily to keep eBay on top of its game.

Teradata is undoubtedly a long-standing leader in information management. They have been preparing for the heterogeneous future (or is it the heterogeneous present?) and diversifying their offerings for several years. Teradata's moves should have everyone reconsidering any notion of Teradata as a high-hurdle company that wants you to put everything online in a single data warehouse. And it seems to be working. Teradata released earnings Wednesday showing revenue growth of 24 percent in 2Q11. The diversification spans several fronts:

Aster Data - A "big data" acquisition for the management of the multi-structured data with patented SQL/MapReduce

Active Data Warehousing - Abilities built into the Teradata 5000 EDW series that support and promote fast, active, intra-day loading of the data warehouse as opposed to a batch-loaded warehouse

Aprimo - Marketing applications that put the information to work, and a software-as-a-service model on which to build some of their future

Master Data Management - The "system of record" for subject areas that need governance and need to be integrated operationally, in real time

Hot-Cold Data Placement - Less-used data placed into lower-cost storage, with accompanying degraded performance

Appliance Family - Pre-loaded machines of varying specifications, matched to workload, that can get your data access up and running quickly; some customers are using an appliance as their data warehouse

I noted that something could still be done where many analytics are going - to the operational world. Something in complex event processing would further the information ecosystem.

We discussed Teradata 14, which will continue this theme of providing the range of platform options necessary today.

Now that some of these acquisitions are assimilated, we are seeing them reflected in the marketing. With "Teradata Everywhere" as the imperative, the reference architecture is now the "Analytic Ecosystem," an environment that includes, but is not consumed entirely by, the Enterprise Data Warehouse. Consider the sizes of the markets Teradata is going after, as shared by Teradata: Data Warehousing ($27B), Business Applications ($15B) and Big Data Analytics ($2B). Teradata is embracing the heterogeneous future as a focused leader in information management.


Posted August 6, 2011 8:45 AM

