
Blog: William McKnight

William McKnight

Hello and welcome to my blog!

I will periodically be sharing my thoughts and observations on information management here in the blog. I am passionate about the effective creation, management and distribution of information for the benefit of company goals, and I'm thrilled to be a part of my clients' growth plans and connect what the industry provides to those goals. I have played many roles, but the perspective I come from is benefit to the end client. I hope the entries can be of some modest benefit to that goal. Please share your thoughts and input to the topics.

About the author

William is the president of McKnight Consulting Group, a firm focused on delivering business value and solving business challenges utilizing proven, streamlined approaches in data warehousing, master data management and business intelligence, all with a focus on data quality and scalable architectures. William functions as strategist, information architect and program manager for complex, high-volume, full life-cycle implementations worldwide. William is a Southwest Entrepreneur of the Year finalist, a frequent best-practices judge, has authored hundreds of articles and white papers, and given hundreds of international keynotes and public seminars. His team's implementations from both IT and consultant positions have won Best Practices awards. He is a former IT Vice President of a Fortune company, a former software engineer, and holds an MBA. William is author of the book 90 Days to Success in Consulting. Contact William at wmcknight@mcknightcg.com.

Editor's Note: More articles and resources are available in William's BeyeNETWORK Expert Channel. Be sure to visit today!

July 2007 Archives

By now, I’m sure you’ve all heard of Oscar the Cat, the nursing home cat who, as the story goes, can predict when residents are going to die. Here is a link to the story. I have had some awkward stare-downs with cats this week as a result of learning about Oscar.

Anyway, all I have to say is that the first thing I thought of when learning about this feline harbinger of demise is that he has his counterparts in the business intelligence world: specifically, those consultants or employees whose mere presence on a project can signal its ruin. Sometimes I have to point out the Oscars, and occasionally, as a practice manager, my screening misfires and I end up with an Oscar on my team, though, fortunately, it’s never been too late for the project.

The people on a project are extremely important to its success and generally not easy to find. I have been fortunate to surround myself with the best in the business.

We all know Oscars. Be careful not to be one or hire one!

Technorati tags: consultant, employee, project

Posted July 31, 2007 9:58 AM

Many systems have struggled to add maximum value to their organizations as an indirect result of lacking classical OSS architectural standards, such as:
• Reasonably-scaled and separate test, development, and QA platforms
• Change management
• Business continuance and disaster recovery planning
• Robust backup management
• Physical isolation or thoughtful co-location based on shared characteristics
• User care measures such as support, training and built-in descriptive information
• Proactive performance planning
• Business roles and responsibilities
• Ongoing business tuning of program direction

They therefore may not seem stable enough for business user dependence and continued development.

However, there are several aspects of technology that enable the progression of these standards:

• The continued reduction in hardware costs, especially in disk technology, makes it economically possible to have separate development, QA, and recovery systems.
• Improvements in database technology that allow massive amounts of data to be inserted while the system is being accessed, enabling (near) real-time data feeds
• Sophisticated ETL tools and “hub and spoke” systems and standards facilitate the establishment and operation of systems that move data from source systems to targets, and vice versa.

The main inhibitor, however, will probably continue to be the perception of such systems as strictly technical in nature, when in reality the fit into the business situation is just as important as the technical aspects.

Posted July 27, 2007 8:37 AM

In the Data Warehousing industry, we are continuing to see the maturation of the value proposition and the management of risk. In the early days, the technology was experimental. Data Warehouse projects consumed millions of dollars on nothing more than the promise of “if we build it, I’m sure it will pay for itself. After all, XYZ company found out something that caused their warehouse project to pay for itself in only six months!” Vendors were great at sending the message that “all of your competitors are building these systems in secret, because they consider it to be a competitive advantage. We would share more information, but we are under non-disclosure.”

The promise of striking gold in them thar hills of data was the subject of serious boardroom conversations. And those that failed to achieve the promise, either because the system was never built, or because it was delivered late and way over budget, or because they didn’t find the nuggets of gold they had hoped for, kept quiet. They didn’t want their colleagues or competitors to know.

Now it is generally known that Data Warehouse projects can fail, and have failed, and as a result, fewer of them actually do fail. We understand the risks and how to manage them.

Here are several of the factors that have contributed to our ever-increasing success:

• Adoption of an iterative deliverable methodology, where large projects are divided into 90-day deliverables and the projects with the greatest ROI and highest probability of success are done first. Scalable technology has contributed significantly to minimizing the risk in up-front capital investments.
• An understanding that data quality is a major issue and must be evaluated up front, often as part of an assessment. You can’t make a gourmet dinner out of garbage.
• An understanding that organizations must cooperate in order to integrate data, and that project teams must be organized and executive sponsors identified accordingly.
• The technology to build, maintain, manage, and mine the systems is much better, and there are many more experienced technologists available.

Posted July 25, 2007 3:38 PM

I've been busy, even to the point of looking at www.ilovealpacas.com once or twice.

I have a speaking engagement with SetFocus in Parsippany, NJ, also available via a live stream, on Friday, August 10. The title is "Modernizing and Advancing Information Management across the Enterprise" and the abstract is below.

Next week, I'll be at the "Bilderberg of business intelligence" - the Pacific Northwest BI Summit. Link to last year's event. It's hosted by Scott Humphrey, whom I strongly recommend for any vendor in this space seeking the highest quality public relations. I will speak on RFID business intelligence and also take the opportunity to torture some members of the BI community. I know - I already do that frequently with my blog postings. However, this torture will come in the form of leading yoga sessions.

Abstract for SetFocus event:

Information is the frontier of modern business competitive pressures. Information mastered at an enterprise level greatly supports efficient corporate projects. Enterprise Information Management provides a consistent view and utilization of information throughout the enterprise.

For 10-20 years, data warehousing has been the center of the universe for Information Management. While enterprise data warehousing provides a tremendous amount of value to an organization, modern advances in Information Management technology allow for a more nuanced, yet effective, strategy for a broader Information Management. Central to this approach is Master Data Management. Data warehousing professionals' future role is as Information Management professionals, and this overview will assist in bridging invaluable tools of the trade into the IT organization’s capabilities.

In this seminar, we will discuss the following:

• Multiple, complex applications serving a variety of users

• Data warehousing is evolving, with new demands and needs placed on robust data warehouses

• Data sizes that will continue to explode, with data types running the gamut beyond traditional alphanumeric types

• Master data requirements in the operational environment as well as the data warehouse

• The role of operational Business Intelligence within the information management architecture

Posted July 23, 2007 8:06 AM

One of the most difficult things to do in data warehousing is to engage a new source system. Learning about the fields the system has to offer the data warehouse, when they are populated, how “clean” the fields are and when you can get after them with your extract job can be daunting. Then, after going through the process of attaching the extract jobs, scheduling and beginning the cycles, you would hope to be set for a while.

Not so fast. Usually one day to two weeks after putting a data warehouse – any iteration – into production (or prototype), users who previously communicated requirements in abstract terms are now seeing the results and requesting changes. New fields and new transformations are not unheard of at this point.

Although data warehousing is very dynamic, it is possible for a practitioner to think beyond initial, spoken requirements and “prime the pump” by bringing additional fields into the ETL process. This concept, known as “triage,” works very well if you have a staging area where initial loading from source is “dropped” prior to the majority of the transformations.

With triage and a staging area, the staging area can contain many more fields than are moved forward to the actual data warehouse. Then, if a new field is needed in the warehouse, there is no effect on the source extracts (and no accompanying disruption of source operation and negotiation with the source system team).
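The triage pattern above can be sketched in a few lines of Python. This is an illustrative toy, not a specific ETL tool's API; all field and function names here are hypothetical.

```python
# Extract: pull EVERY candidate field from the source into staging,
# including fields the warehouse does not (yet) require ("triage").
source_rows = [
    {"cust_id": 1, "name": "Acme", "region": "US", "fax": "555-0100"},
    {"cust_id": 2, "name": "Bolt", "region": "EU", "fax": "555-0199"},
]
staging = [dict(row) for row in source_rows]  # staging keeps all fields

# Transform/load: only the fields the warehouse currently needs move forward.
warehouse_fields = ["cust_id", "name"]

def load_warehouse(staging_rows, fields):
    """Carry only the required fields forward from staging to the warehouse."""
    return [{f: row[f] for f in fields} for row in staging_rows]

warehouse = load_warehouse(staging, warehouse_fields)

# Post-production change request: users now want "region". No change to the
# source extract is needed -- the field was already triaged into staging.
warehouse_fields.append("region")
warehouse = load_warehouse(staging, warehouse_fields)
```

The key point is the last two lines: promoting a field becomes a change to the forward-moving transformation only, with no renegotiation with the source system team.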

But wait, you say. "What about the historical data that usually accompanies such new data sourcing?"

The concept of the persistent staging area is to keep all data in the staging area, from both a “triaged” perspective (see yesterday’s tip) and a historical one. That way, when requirements change post-production (again, see yesterday’s tip), you not only have the ETL “primed,” you also have the historical data primed and ready to be moved forward to the warehouse – in the persistent staging area.

Persistent staging areas almost always require a separate DBMS instance from the data warehouse DBMS due to the volume that will accumulate in them.

Since historical data is also kept in the warehouse, the distinguishing feature of the persistent staging area is its capture of triaged data, ready for historical loading of required data post-implementation. It will be bigger than the warehouse itself.
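A minimal sketch of the persistent staging idea, under the assumption that each load cycle is retained with its load date (names are hypothetical, not from any particular tool):

```python
from datetime import date

# Persistent staging: every load cycle is retained, so history for a newly
# required field can be backfilled without returning to the source system.
# In practice this grows without bound, hence the separate DBMS instance.
persistent_staging = []

def stage_cycle(load_date, rows):
    """Append one load cycle to the persistent staging area, stamped with its date."""
    for row in rows:
        persistent_staging.append({"load_date": load_date, **row})

stage_cycle(date(2007, 7, 1), [{"cust_id": 1, "name": "Acme", "region": "US"}])
stage_cycle(date(2007, 7, 2), [{"cust_id": 1, "name": "Acme", "region": "EU"}])

# Later, "region" becomes a warehouse requirement: its full history is
# backfilled from staging -- no new source extract required.
def backfill(field):
    return [(r["load_date"], r["cust_id"], r[field]) for r in persistent_staging]

history = backfill("region")
```

Because every cycle was kept, the backfill recovers the field's full change history (US, then EU), which a non-persistent staging area would have discarded.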

Although I usually do not use this technique in my data warehouses, if there were a high likelihood that requirements would be very dynamic after production and disk cost were not an issue, it would be very applicable.

Posted July 4, 2007 10:49 AM
