
Blog: Richard Hackathorn

Richard Hackathorn

Welcome to my blog stream. I am focusing on the business value of low latency data, real-time business intelligence (BI), data warehouse (DW) appliances, use of virtual world technology, ethics of business intelligence and globalization of business intelligence. However, my blog entries may range widely depending on current industry events and personal life changes. So, readers beware!

Please comment on my blogs and share your opinions with the BI/DW community.

About the author

Dr. Richard Hackathorn is founder and president of Bolder Technology, Inc. He has more than thirty years of experience in the information technology industry as a well-known industry analyst, technology innovator and international educator. He has pioneered many innovations in database management, decision support, client-server computing, database connectivity, associative link analysis, data warehousing, and web farming. Focus areas are: business value of timely data, real-time business intelligence (BI), data warehouse appliances, ethics of business intelligence and globalization of BI.

Richard has published numerous articles in trade and academic publications, presented regularly at leading industry conferences, and conducted professional seminars in eighteen countries. He writes regularly for BeyeNETWORK.com, where he has a channel for his blog, articles and research studies. He has been a member of the IBM Gold Consultants group since its inception, and is a member of the Boulder BI Brain Trust and the Independent Analyst Platform.

Dr. Hackathorn has written three professional texts, entitled Enterprise Database Connectivity, Using the Data Warehouse (with William H. Inmon), and Web Farming for the Data Warehouse.

Editor's Note: More articles and resources are available in Richard's BeyeNETWORK Expert Channel.

Recently in 20081006-MSBI Category

NOTE: Part of a blog stream, with related items here.

Quentin Clark provided an overview of the SQL Server roadmap, emphasizing the data warehouse features. He remarked that the largest area of new or improved features was data warehousing. The DW features are categorized into three areas:

- Build Faster: SAP/Teradata/Oracle adapters, MERGE SQL stmt, Change Data Capture...
- Manage Easily: partition table parallelism, resource governor...
- Deliver Insights: Report Builder 2.0, viz, rendering of Excel/Word...
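For readers new to the MERGE statement mentioned above, its value is combining insert, update, and delete logic against a target table in a single pass over a source. Here is a minimal Python sketch of that upsert-plus-delete semantics (the data and function are invented for illustration; this is not T-SQL syntax):

```python
# Sketch of MERGE semantics: synchronize a target table with a source.
# Matched rows are updated, unmatched source rows are inserted, and
# target rows absent from the source are deleted.

def merge(target, source):
    """target, source: dicts keyed by primary key -> row value."""
    merged = {}
    for key, row in source.items():
        # WHEN MATCHED ... UPDATE / WHEN NOT MATCHED ... INSERT
        merged[key] = row
    # keys only in target are dropped: WHEN NOT MATCHED BY SOURCE ... DELETE
    return merged

target = {1: "old", 2: "stale"}
source = {1: "new", 3: "added"}
print(merge(target, source))  # {1: 'new', 3: 'added'}
```

In T-SQL the same three outcomes are expressed declaratively with WHEN MATCHED, WHEN NOT MATCHED, and WHEN NOT MATCHED BY SOURCE clauses, rather than procedurally as here.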

The major introduction in the roadmap is the integration of DATAllegro with SQL Server, code-named Madison. Quentin remarked, "We have customers banging on our door" to extend SQL Server into larger data warehouses. The goal of the Madison project is a Microsoft-ized version of DATAllegro, implying the migration from Ingres to SQL Server, from Linux to Windows Server, and from Java code to .NET code. The first two migrations were demonstrated at the conference in an alpha version.

As described in a previous blog, Microsoft will partner with four major hardware vendors (HP, Dell, Hitachi, Groupe Bull) who will deliver an appliance to some degree. The reference architecture will be based on both database size and number of concurrent users.

The schedule flows as follows:

- First half of 2010: Version 1 of Madison with SQL Server Kilimanjaro (a partial release)
- 24-36 months after the August 2008 release: Version 2 aligned with the next normal version

My take on this... Microsoft is 18 months from having an MPP solution for scaling data warehouses into the 20-100 TB range. I have no doubt that Microsoft will eventually do it right, which may take longer than 18 months. Meanwhile, there will be lots of opportunities for small Microsoft solution providers and for larger DW competitors.

Posted October 7, 2008 4:22 PM
Permalink | No Comments |

NOTE: Part of a blog stream, with related items here.

The Gemini project is "a set of new, easy-to-use analysis tools for managed self-service, that will enable information workers to slice and dice data and create their own BI applications and assets to share and collaborate on from within the familiar, everyday Microsoft Office productivity tools they already use." (quote from Microsoft press release)

If you cut through the marketing jargon, the key phrase is Gemini = 'managed self-service analytics'. Extending an earlier blog, the twins of Gemini are like two sides of a coin. One side represents power users who create and share business analytics, and the other side represents IT professionals who manage the user-developed analytics from the enterprise perspective.

Amir Netz gave a demo that was more in-depth than the one in the initial keynote sessions. It consisted of taking a mix of data sources and intelligently integrating the data through internal analysis of values (format, cardinality) and naming. The result was an internal ER diagram with 6-7 entities, produced with minimal human intervention. This data structure then drove the interaction with the business user to paint an Excel spreadsheet with various data visualizations.
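Out of curiosity about how such automatic integration might work, here is a minimal Python sketch of one plausible heuristic: match columns across tables by name, then treat a pair as a join candidate when one side looks like a unique key and the other side's values fall within it. The tables, columns, and heuristic are my own invention for illustration, not Microsoft's actual algorithm:

```python
# Sketch of inferring relationships between tables from column naming
# and value cardinality, roughly the kind of analysis the Gemini demo
# implied. All table and column names here are hypothetical.

def propose_joins(tables):
    """tables: {table_name: {column_name: list_of_values}}
    Returns candidate joins as (foreign_key, primary_key) pairs."""
    joins = []
    for t1, cols1 in tables.items():
        for t2, cols2 in tables.items():
            if t1 == t2:
                continue
            for col in set(cols1) & set(cols2):  # same-named columns
                fk_vals, pk_vals = cols1[col], cols2[col]
                # t2.col looks like a key if its values are unique, and
                # t1.col looks like a foreign key if its values are a subset
                if len(set(pk_vals)) == len(pk_vals) and set(fk_vals) <= set(pk_vals):
                    joins.append((f"{t1}.{col}", f"{t2}.{col}"))
    return joins

tables = {
    "sales":    {"product_id": [1, 1, 2, 3]},
    "products": {"product_id": [1, 2, 3], "name": ["cog", "gear", "bolt"]},
}
print(propose_joins(tables))  # [('sales.product_id', 'products.product_id')]
```

A production tool would also compare formats and fuzzy-match column names, but even this crude subset-plus-uniqueness test recovers the obvious fact-to-dimension link.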

My take on this... I am still struggling to see what is new in this Microsoft initiative. The quest for pervasive BI is pervasive across the industry. I got a brief glimpse of insight when Amir said, "others deliver by adding their features to Excel; Microsoft delivers Excel." In other words, Excel will evolve into the analytic tool for everyone, and users will perform analytics in Excel just as a typical Excel user would expect. Does this make sense to you?

The other side of the Gemini coin is managing the otherwise unmanageable growth of user-developed applications, and doing so in a graceful and effective manner. I heard a two-pronged approach: building user communities that allow popular applications to percolate upwards, and exposing the resource impacts of applications on the infrastructure. Microsoft is not there yet, but their direction does feel promising.

Posted October 7, 2008 3:17 PM
Permalink | No Comments |

NOTE: Part of a blog stream, with related items here.

Microsoft recently completed its acquisition of DATAllegro, a data warehouse appliance vendor.

I interviewed Stuart Frost, founder and CEO, and Quentin Clark about their plans for integrating DATAllegro's technologies into Microsoft's product line. Several months ago, I had blogged about the acquisition and mused that Microsoft should be renamed to 'Microhard' since they were going to ship some 'real iron'. Well, my musings were a bit off. But, my prediction of the significance of the acquisition was correct.

The style of delivering a Microsoft data warehouse appliance is through partnership with four major hardware vendors (yet to be named but obvious to most). In early 2010, customers would purchase the MPP version of SQL Server through these vendors, following a strict 'reference architecture' specification. The pre-assembled rack would contain all components required to plug into power and networking. Minimal configuration would be needed by the customer. That is the plan.

An interesting aspect is the potential that DATAllegro's Hub-and-Spoke architecture could have for Microsoft's current SQL Server customer base. Many customers are pushing the capacity limits of the current SMP systems, and the MPP offering could be a comfortable migration path. Since SQL Server is typically used to support data marts, the MPP version could replace each of these data marts as the spokes. The company could then concentrate resources on the hub as its emerging enterprise data warehouse, while retaining its past investments. If Microsoft refines the tools and best practices for this 'data mart integration' strategy, it could succeed in defending its DW customer base against the other big DW players.

Posted October 7, 2008 8:54 AM
Permalink | No Comments |

NOTE: Beginning of a blog stream, with related items here.

This week I am attending the Microsoft BI Conference in Seattle. With a single focus on business intelligence, it is a surprisingly popular conference, attracting over 3,000 attendees from a broad spectrum. I sense that the majority are BI newbies who are just getting into data warehousing. However, it is a high-energy, motivated, smart crowd, as at most Microsoft events.

I am particularly looking forward to an update on BI advances by Microsoft. I have phased out my coverage of Microsoft over the past 3-4 years, focusing more on enterprise-level data warehousing. With their acquisition of DATAllegro, Microsoft is filling in their enterprise-level DW offerings.

The morning keynotes are hosted by an old colleague, Guy Weismantel, who transitioned from Business Objects to Microsoft over the past year. The theme is to think bigger about BI. In other words, take all the things that we think about BI today and then extend BI to embedding smart information everywhere throughout the enterprise. Guy even mentioned embedding BI in the Xbox: "If you play Halo 3, then you are using BI." I need to ponder a bit on that one...

The first keynote is by Stephen Elop, president of the business division, representing the office and productivity products. He reinforced the bigger vision for BI, which he summarized as providing the democratization of information across the enterprise. The goal is to have comprehensive solutions, familiar experiences (as in Excel), and high value. He cited four aspects of his inside perspective on Microsoft: the strength of their ecosystem, intellectual integrity (high self-criticism with a willingness to fix problems), tenacity (investing for the long term), and opportunity for impact (making a difference).

Ted Kummert, VP data/storage platform division, set his mission to support ALL types of data and to provide enriched services for people-ready BI and SQL Server for data warehousing. The four pillars are: enterprise data, beyond relational (from facts and figures to sights and sounds), dynamic development (minimizing time to solution), and pervasive insight (brief mention of visualization).

Ted gave an overview of the Madison project, which is the integration of DATAllegro technologies with SQL Server. The goal is to scale SQL Server into the hundreds of terabytes. Product delivery is scheduled for the first quarter of 2010. Jesse Fountain gave a demo of the preliminary integration of DATAllegro and SQL Server. Jesse showed the MPP architecture with 24 instances of SQL Server 2008 supporting a 150 TB database. He proudly pointed out that the fact table of sales transactions had over one trillion rows. Three queries were run against the database, and the load across the various processors and I/O buses was shown graphically. Cute!
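To make the shared-nothing MPP pattern behind that demo concrete, here is a minimal Python sketch: fact rows are hash-distributed across nodes, each node computes a partial aggregate in isolation, and a control node combines the partials. The node count, row format, and query are invented; the actual Madison internals were not disclosed:

```python
# Sketch of the MPP scatter-gather pattern: hash-distribute a fact
# table across nodes, then answer an aggregate query by combining
# per-node partial results. Data and node count are hypothetical.

NUM_NODES = 4

def node_for(key):
    """Hash distribution: each row lands on exactly one node."""
    return hash(key) % NUM_NODES

def distribute(rows):
    """rows: list of (key, amount) tuples -> one shard per node."""
    shards = [[] for _ in range(NUM_NODES)]
    for key, amount in rows:
        shards[node_for(key)].append((key, amount))
    return shards

def total_sales(shards):
    # each node computes a partial SUM; the control node adds them up
    partials = [sum(amount for _, amount in shard) for shard in shards]
    return sum(partials)

rows = [(i, 10) for i in range(1000)]       # 1,000 fact rows of 10 each
shards = distribute(rows)
print(total_sales(shards))                  # 10000
```

The point of the pattern is that both the distribution step and the partial aggregation parallelize across nodes with no shared state, which is what lets a trillion-row fact table be scanned by 24 independent SQL Server instances.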

Ted then introduced the Gemini project for managed self-service analytics. The two stars of Gemini represent power users (who create and share business analytics) and IT professionals (who manage the analytics from the enterprise perspective). Donald Farmer gave a demo of Gemini alpha code as an Excel add-in: sorting and filtering 20 million rows, smashing them into nice pivot tables, and then using smart slicers to draw insights from the data. Impressive!
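For intuition about what the pivot-and-slicer interaction does under the hood, here is a minimal Python sketch: group rows by a dimension, sum a measure, and apply an optional 'slicer' predicate that filters the rows first. The column names and data are invented for illustration:

```python
# Sketch of a pivot table with a slicer: a slicer is just a filter
# applied before grouping and aggregating. Columns and rows are
# hypothetical examples, not the Gemini demo's actual data.
from collections import defaultdict

def pivot(rows, dim, measure, slicer=None):
    """rows: list of dicts; dim: grouping column; measure: column to sum;
    slicer: optional predicate that keeps only matching rows."""
    totals = defaultdict(float)
    for row in rows:
        if slicer is None or slicer(row):
            totals[row[dim]] += row[measure]
    return dict(totals)

rows = [
    {"region": "West", "year": 2008, "sales": 100.0},
    {"region": "East", "year": 2008, "sales": 80.0},
    {"region": "West", "year": 2007, "sales": 50.0},
]
print(pivot(rows, "region", "sales"))
# {'West': 150.0, 'East': 80.0}
print(pivot(rows, "region", "sales", slicer=lambda r: r["year"] == 2008))
# {'West': 100.0, 'East': 80.0}
```

What made the demo impressive was not this logic, which is elementary, but doing it interactively over 20 million rows inside Excel.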

Microsoft has released SQL Server 2008 and plans normal versions to be released every 24-36 months. During the first part of 2010, they will release products from the Madison and Gemini projects. I am not sure how the Kilimanjaro project (next version of SQL Server) fits with the Gemini project.

My take so far... I do not understand how Microsoft is providing (or will provide) people-ready and IT-friendly solutions that are fundamentally different from those of the other major BI/DW vendors. Is this the same as BI for the masses, which we have been discussing for several years? It seems to me that most of the marketplace is saying the same thing. The Gemini project is very intriguing and has promise; however, it faces a huge challenge.

I have another day and a half to go, with several interviews and podcasts. I am curious to see how my feelings toward Microsoft BI will change.

Posted October 6, 2008 1:00 PM
Permalink | No Comments |