By Christensen's definition, disruptive change occurs when a new technology offers some feature that an existing market does not value and delivers worse performance than the incumbent technology in that market, yet is capable of improving to meet that market's needs in time. What happens is that the new technology debuts in another, often related, market and then moves back into the original market, frequently displacing the existing suppliers there. Christensen's key example traces the development of the disk drive market from the '70s to the '90s and the failure of many of the incumbent 14-, 8- and 5.25-inch drive manufacturers over that period.
What struck me at TDWI was the explosion in novel and even radical approaches to the database and storage side of data warehousing that were on view. While most of the technologies are not new, the combinations and price-points are certainly innovative and maybe disruptive. For many years, the DW database market was very quiet, but the last couple of years have seen an explosion in new entrants. What the newcomers have in common, from the more established ones like Netezza to the more recent entrants like ParAccel, is a focus on query performance and large data volumes in specific analytical applications that might traditionally be called data marts.
As these vendors' technologies and techniques are proven in largely stand-alone environments, they are beginning to raise questions in the traditional enterprise data warehouse arena. We've already seen the incumbents (Teradata, IBM, Oracle and Microsoft) introduce appliance-like solutions. But the real question I see relates to the underlying architecture of the data warehouse itself. After more than 20 years, are we about to see a fundamental change in the way we design business intelligence environments?
I'll be exploring this question over the coming months, but I'd love to hear your views at this stage!
Posted March 3, 2009 7:10 AM