Blog: William McKnight

William McKnight

Hello and welcome to my blog!

I will periodically share my thoughts and observations on information management here in the blog. I am passionate about the effective creation, management and distribution of information in service of company goals, and I'm thrilled to be a part of my clients' growth plans and to connect what the industry provides to those goals. I have played many roles, but the perspective I come from is benefit to the end client, and I hope these entries can be of some modest benefit toward that goal. Please share your thoughts and input on the topics.

About the author

William is the president of McKnight Consulting Group, a firm focused on delivering business value and solving business challenges utilizing proven, streamlined approaches in data warehousing, master data management and business intelligence, all with a focus on data quality and scalable architectures. William functions as strategist, information architect and program manager for complex, high-volume, full life-cycle implementations worldwide. William is a Southwest Entrepreneur of the Year finalist, a frequent best-practices judge, has authored hundreds of articles and white papers, and given hundreds of international keynotes and public seminars. His team's implementations from both IT and consultant positions have won Best Practices awards. He is a former IT Vice President of a Fortune company, a former software engineer, and holds an MBA. William is author of the book 90 Days to Success in Consulting. Contact William at wmcknight@mcknightcg.com.

Editor's Note: More articles and resources are available in William's BeyeNETWORK Expert Channel. Be sure to visit today!

October 2006 Archives

Perhaps the killer app for data quality improvement will come from the state.

Consider the numerous outside consumers of corporate information these days. It began a few years ago when data warehouses started being used to share information with suppliers, customers, prospects, vendors and the like. Why? Because they could. The data was there, and enough data quality work was being applied to make it “good enough” for those outside consumers. Though these consumers were not the data warehouse's first targets, it evolved to meet their needs. Coming on the heels of that is another potential consumer of not just data warehouse data, but potentially significant operational data – our government.

Now consider the last year of government solicitation of corporate data, much of it consumer-oriented and to the chagrin of civil libertarians and corporations alike. There are the public requests, like internet search records, and there are the company-by-company requests that information professionals are working to fulfill daily.

One day, if the government begins to treat corporate information as a state asset and grows frustrated with the imperfect data quality of corporate submissions, we in information management could find ourselves living under a SOX-like environment.


Posted October 29, 2006 9:12 PM
Permalink | 1 Comment |

Some of the consequences of making an inappropriate DBMS selection for DW/BI include:

• Long development cycles
• Large numbers of support staff required
• Cost expansion
• “Throwing hardware at problems” as the default solution
• Users reverting to old, unfriendly means of data access
• A technology-focused rather than user-focused culture in IT
• Complex vendor relationships
• Difficulty incorporating legacy systems and unstructured data
• Inability to keep pace with growing data volumes and user demands
• Inability to show profitability from data warehouse efforts, leading to slow program demise


Posted October 18, 2006 11:50 AM
Permalink | 2 Comments |

Based on the realities of data warehousing today, a selected technical architecture for data warehousing should be:

• Manageable – Minimal support tasks requiring DBA/system administrator intervention, with a single point of control to simplify system administration. You should be able to create and implement new tables and indexes at will.
• Complete and Integrated – The toolset should be comprehensive across the spectrum of eventual requirements for data and its access.
• Interoperable – Integrated access to the web, Microsoft Office, internal networks, and corporate mainframes.
• Scalable – The ability to handle increasing data sizes and workloads with simple additions to the architecture, rather than increases that require a rearchitecture.
• Affordable – The proposed solution (hardware, software, services, required customer support) should provide a low total cost of ownership (TCO) over a multi-year period.
• Proven and Supported – You don’t want to risk a critical decision regarding a fundamental underpinning of the data warehouse environment on an unproven solution.
• Flexible – Provides optimal performance across the full range of models with large numbers of tables. Look for proven ability to support multiple applications from different business units, leveraging data that is integrated across business functions and subject areas.
• User Accessible – Compatibility and interoperability with data access tools that provide a wide range of user-friendly access options.
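The “Affordable” criterion lends itself to simple arithmetic. Here is a minimal sketch, with entirely hypothetical figures (not vendor pricing), of how a multi-year TCO view can reorder two proposals that look different on sticker price alone:

```python
# Hypothetical multi-year TCO comparison for two candidate DW platforms.
# All figures are illustrative assumptions, not vendor quotes.

def total_cost_of_ownership(upfront, annual_costs):
    """Sum one-time costs and per-year recurring costs."""
    return upfront + sum(annual_costs)

# Proposal A: cheaper up front, heavier ongoing support staffing.
proposal_a = total_cost_of_ownership(
    upfront=250_000,             # hardware + software license
    annual_costs=[180_000] * 3,  # support staff + maintenance, 3 years
)

# Proposal B: pricier up front, lower administration burden.
proposal_b = total_cost_of_ownership(
    upfront=400_000,
    annual_costs=[90_000] * 3,
)

print(proposal_a)  # 790000
print(proposal_b)  # 670000
```

Over a three-year horizon, the proposal with the higher purchase price comes out cheaper once the ongoing support burden is counted, which is exactly why the criterion specifies TCO rather than license cost.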


Posted October 12, 2006 8:47 AM
Permalink | No Comments |

I was just thinking about what the unique realities of data warehousing today are. As I see it, the top realities are:

• Multiple, complex applications serving a variety of users
• Exploding data sizes that will keep growing with RFID, point-of-sale (POS), call detail record (CDR), and all manner of transactional data extending back years into history
• Data latency becoming intolerable as needs demand real-time data
• A varied set of data access tools, serving a variety of purposes, for each data warehouse
• Multiple workloads streaming into the data warehouse from varied corners of the company as well as from outside the company
• A progression towards more frequent, even continuous, loading
• Data types running the gamut beyond traditional alphanumeric types
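The progression toward more frequent, even continuous, loading can be sketched with a toy micro-batch loader. The in-memory “warehouse” and batch size here are illustrative assumptions, not a real loading tool:

```python
# Toy illustration of micro-batch loading: instead of one nightly bulk load,
# incoming transactions are flushed to the target table in small batches.
# The in-memory "warehouse" list is a stand-in for a real fact table.

class MicroBatchLoader:
    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.buffer = []
        self.warehouse = []  # stands in for the target fact table

    def ingest(self, row):
        """Buffer a row; flush once the micro-batch fills up."""
        self.buffer.append(row)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        """Load the buffered rows and clear the buffer."""
        self.warehouse.extend(self.buffer)
        self.buffer = []

loader = MicroBatchLoader(batch_size=3)
for txn in range(7):
    loader.ingest({"txn_id": txn})
loader.flush()  # drain the remaining partial batch
print(len(loader.warehouse))  # 7
```

Shrinking the batch size moves this design along the spectrum from nightly batch toward continuous loading, trading per-row overhead for lower data latency.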


Posted October 10, 2006 9:05 PM
Permalink | 2 Comments |

The midmarket especially is still having a hard time with business intelligence. The appetite simply isn't there for "metadata", "data stewardship", "data quality", "XML", "ODS", "MPP" and the like. Simple reports - yes. Quick implementation - now you're talking. Stepping into this fray is venture-backed LucidEra - Hosted Business Intelligence on Demand.

While I have been bemoaning packaged approaches to BI for years, saying there are efficiencies but few shortcuts, I have to say the LucidEra approach has merit for its target customer base - the midmarket.

Highlights:

• Expert applications that answer the most commonly asked questions
• User-definable configurations, not done by the vendor
• Focus on business processes
• Platform not exposed to the customer

LucidEra is only in beta, and it's one application, Forecast-to-Billing, but just as in an end-user environment, once the data act is in order, applications can be spun off relatively quickly, and I anticipate LucidEra will do this.

Could LucidEra be a (long-term) custom OLAP killer? Maybe. Time will tell, with the next few quarters being critical to that direction. Custom Microsoft solutions are also pretty easy to deploy and will be the choice of many in the midmarket, although Microsoft is eyeing the Fortune accounts. Regardless, all signs point to improved penetration of the midmarket by business intelligence.

As with anything "packaged," buyer beware. Be ready and understand it fully. Hire a real DW/BI consultant to help you organize the processes, roles and responsibilities for maintaining and iterating on it.


Posted October 9, 2006 7:25 AM
Permalink | 2 Comments |

