

Blog: Dan E. Linstedt

Dan Linstedt

Bill Inmon has given me this wonderful opportunity to blog on his behalf. I like to cover everything from DW2.0 to integration to data modeling, including ETL/ELT, SOA, Master Data Management, Unstructured Data, DW and BI. Currently I am working on ways to create dynamic data warehouses, push-button architectures, and automated generation of common data models. You can find me at Denver University, where I participate on an academic advisory board for master's students in I.T. I can't wait to hear from you in the comments of my blog entries. Thank you, and all the best; Dan Linstedt http://www.COBICC.com, danL@danLinstedt.com

About the author

Cofounder of Genesee Academy, RapidACE, and BetterDataModel.com, Daniel Linstedt is an internationally known expert in data warehousing, business intelligence, analytics, very large data warehousing (VLDW), OLTP, and performance and tuning. He has been the lead technical architect on enterprise-wide data warehouse projects and refinements for many Fortune 500 companies. Linstedt is an instructor for The Data Warehousing Institute and a featured speaker at industry events. He is a Certified DW2.0 Architect. He has worked with companies including IBM, Informatica, Ipedo, X-Aware, Netezza, Microsoft, Oracle, Silver Creek Systems, and Teradata. He is trained in SEI/CMMI Level 5, and is the inventor of the Matrix Methodology and the Data Vault data modeling architecture. He has built expert training courses, trained hundreds of industry professionals, and is the voice of Bill Inmon's blog at http://www.b-eye-network.com/blogs/linstedt/.

DDW (Dynamic Data Warehousing) is a couple of years off (at least 3 to 5), but I still like to look into the future to find out what kinds of things we might create, and what the value of DDW might be. This is an exploratory entry, meant for discussion purposes, so please comment on what you think about DDW: its feasibility, the timelines, and anything else that comes to mind.

In my mind, to get to DDW we must take baby steps, unless of course some engineer has a tremendous breakthrough and can create the necessary components quickly and easily. I think it may be possible to build a DDW today, at least for experimental purposes.

If I look at the raw components available today, I would suggest starting with a DW appliance like Netezza or DATAllegro. Then, I would negotiate a deal with a data mining company to place their software on a hardware card that can plug and play with the appliance. Next, we would engage the data mining during load streams, as well as scheduling the neural net to mine the data in near-real time for associations and meanings. The twist would be the neural net's focus: it would be directed to work on structure and relationships rather than on the data itself, using the data to measure confidence levels for structure.
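To make the idea a little more concrete, here is a minimal sketch (in Python, with class and function names I invented for illustration; no real appliance or mining API is implied) of how a load stream might be observed to score the confidence of a suspected structural relationship:

```python
# Hypothetical sketch only: names, fields, and scoring are invented to
# illustrate "measuring confidence levels for structure" during a load.

from dataclasses import dataclass


@dataclass
class CandidateRelationship:
    parent_table: str
    child_table: str
    join_key: str
    matches: int = 0   # rows whose key resolved to a known parent
    misses: int = 0    # rows whose key did not resolve

    @property
    def confidence(self) -> float:
        total = self.matches + self.misses
        return self.matches / total if total else 0.0


def observe_load_stream(rows, candidate: CandidateRelationship, parent_keys: set) -> float:
    """Score a suspected parent/child relationship as rows stream in."""
    for row in rows:
        if row.get(candidate.join_key) in parent_keys:
            candidate.matches += 1
        else:
            candidate.misses += 1
    return candidate.confidence
```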

It would be as though we had a Structural Quality Engine built into firmware and hooked to super-fast I/O and high-speed parallel systems, resulting in a semi-smart device. I would recommend starting off with simple rules and a 1, 2, 3 rating (as discussed in my previous posting), where 1 means manual intervention is required before a structure change is made, 2 means the change is made in place but with a warning, and 3 means a notification that a change has occurred. Based on confidence level thresholds, we would see this system stratify changes over time.
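Purely as an illustration of that 1, 2, 3 rating, here is a hypothetical mapping from a structural confidence score to an action; the threshold values are placeholders, not recommendations:

```python
# Illustrative only: the thresholds are invented placeholders.

MANUAL_REVIEW, WARN_AND_APPLY, NOTIFY_ONLY = 1, 2, 3


def classify_structure_change(confidence: float,
                              warn_threshold: float = 0.80,
                              auto_threshold: float = 0.95) -> int:
    """Map a structural confidence score to the 1/2/3 rating described above."""
    if confidence >= auto_threshold:
        return NOTIFY_ONLY      # 3: change applied, someone is simply notified
    if confidence >= warn_threshold:
        return WARN_AND_APPLY   # 2: change applied in place, with a warning
    return MANUAL_REVIEW        # 1: hold the change until a person approves it
```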

So what does DDW bring to the table?
An incredible ability to adapt the structure of the data warehouse or historical integration store on the fly; that's the technical side of it, anyhow. From a business perspective, it can translate to much lower maintenance costs, much shorter IT response times when adjusting to changes, and much lower “new project” costs. “What if” questions could be asked of the production architecture without actually making changes. Rates of structure change could be monitored, projected, and gauged based on the business partners who are interacting with the SOA interfaces.
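As a rough, hypothetical sketch of how rates of structure change might be tracked and gauged (again, the names and numbers here are invented for illustration):

```python
# Hypothetical sketch: a trivial change log and rate estimate, not a product feature.

from datetime import datetime, timedelta

change_log = []  # timestamps of applied structure changes


def record_structure_change(when=None):
    """Log the moment a structure change was applied."""
    change_log.append(when or datetime.utcnow())


def changes_per_week(lookback_days=90):
    """Crude rate estimate over a recent window, for gauging structural volatility."""
    cutoff = datetime.utcnow() - timedelta(days=lookback_days)
    recent = [t for t in change_log if t >= cutoff]
    return len(recent) / (lookback_days / 7.0)
```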

It would mean a better way to estimate “changes” to the data warehousing system, with hard-and-fast numbers and confidence levels to back them up. On the other hand, we’d have to pay for the engineering somehow, so up-front costs would rise in order to lower TCO over time.

It means easier access to changes and more dynamic, flexible changes in the hands of the business users; and if the DDW is hooked to a business rules engine or an SOA business process workflow, it can be “dialed in” to the changes that are coming down the pike (expected at the data level).
I’d love to hear your thoughts on this topic; all comments are welcome.


Posted June 30, 2005 12:44 PM