

Richard Hackathorn

Welcome to my blog stream. I am focusing on the business value of low-latency data, real-time business intelligence (BI), data warehouse (DW) appliances, the use of virtual world technology, the ethics of business intelligence, and the globalization of business intelligence. However, my blog entries may range widely depending on current industry events and personal life changes. So, readers beware!

Please comment on my blogs and share your opinions with the BI/DW community.

About the author

Dr. Richard Hackathorn is founder and president of Bolder Technology, Inc. He has more than thirty years of experience in the information technology industry as a well-known industry analyst, technology innovator and international educator. He has pioneered many innovations in database management, decision support, client-server computing, database connectivity, associative link analysis, data warehousing, and web farming. His focus areas are the business value of timely data, real-time business intelligence (BI), data warehouse appliances, the ethics of business intelligence, and the globalization of BI.

Richard has published numerous articles in trade and academic publications, presented regularly at leading industry conferences, and conducted professional seminars in eighteen countries. He writes regularly for BeyeNETWORK.com, where he has a channel for his blog, articles and research studies. He has been a member of the IBM Gold Consultants group since its inception, and belongs to the Boulder BI Brain Trust and the Independent Analyst Platform.

Dr. Hackathorn has written three professional texts, entitled Enterprise Database Connectivity, Using the Data Warehouse (with William H. Inmon), and Web Farming for the Data Warehouse.


August 2007 Archives

Sometimes you start reading a book with low expectations about its significance. But, the book surprises you and delivers a message of great significance. That has happened with a new book entitled The Black Swan: The Impact of the Highly Improbable by Nassim Nicholas Taleb. He is a professor of the Sciences of Uncertainty (an odd title) at the University of Massachusetts. See his Wikipedia entry and a PBS podcast.

Let me start with the bottom line. I strongly recommend this book for all professionals in Business Intelligence (BI) who care about the effects of our profession's methods and results on our clients.

I have this naive belief that more information is better, assuming that the information is relevant to the business, properly cleansed, structured cross-functionally, analyzed appropriately, distributed to the right people, and so on. This book totally negated that belief, instilling a humble attitude toward how much we cannot know and shocking me with how much our current BI practices damage our clients.

And... I have just read the first few chapters. I am starting to be aware of the problems in general, confused about their implications for BI, and wondering whether there are any solutions. This is a book that will take several months to consume (because you read a few sentences, think "what?" and then reread them several more times).

Let me give a small taste of Taleb's argument. Before Australia was discovered, everyone knew that all swans were white, because all swans that had ever been observed were white. Therefore, the rule of nature was that all swans are white. Then someone discovered a black swan in Australia. That one swan negated a belief held for a thousand years by all of mankind. Afterward, people concocted explanations as to why such a rare animal was perfectly normal and should have been expected. Taleb then extends this analogy to explain the events and aftermath of September 11, along with many other pivotal events in human history.

That is the Black Swan. It is a totally unexpected, but significant, rare event that seems plausible...afterwards. In Taleb's words, the Black Swan is an event with three attributes: "First, it is an outlier as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. Second, it carries an extreme impact [changing our basic paradigms that explain the world]. Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable."

I submit that we are unprepared to handle the Black Swan with current BI technology and practices. In fact, current BI does more harm than good, giving us a false sense of reliability in what we think we know while leaving us clueless about what we do not know.

Help me with my struggle to understand the practical importance of the Black Swan. I would like to get a discussion established on Black Swan issues within the BI profession, along with joint publications with some of you. Is there anyone interested in this pilgrimage?

Posted August 31, 2007 11:18 AM

Prof. Behnam Tabrizi of Stanford University has published a book whose title intrigued me. He has an impressive resume, having studied over 100 companies worldwide with McKinsey and written many publications, one of which received a scholarly award from Administrative Science Quarterly. Since I have researched the business value of low-latency data, I had high expectations that his book would give me insights into how real-time generates that value.

Prof. Tabrizi defines the Real-Time Enterprise (RTE) as one based upon getting the right data about the right processes to the right people at the right cost and at the right time to create and sustain competitive advantage. His thesis is that RTE failures arise in three areas:

- Strategy: creating competitive advantage, reducing uncertainty and complexity
- Planning: achieving desired ROI, resolving critical discontinuities and latencies
- Implementation: capturing, monitoring, analyzing, interpreting data

He then divides the RTE into five modules: ERP, SCM, CRM, ERM (Employee Relationship Management) and PLM (Product Lifecycle Management). He illustrates his points with dozens of case sketches (mini-studies) from notable companies in a variety of industries.

I was disappointed with the book overall because I did not find those expected insights. In particular, I was concerned about:

First, the use of the term 'right' five times in the RTE definition begs for more detail. What is 'right' is a subjective judgment that is left as an exercise for the reader.

Second, the term 'competitive advantage' is overused, giving the impression that an RTE is an enterprise constantly looking over its shoulder at what competitors are doing. This looking backward, rather than forward, sends the wrong message that innovation comes only from competitive threats.

Third, the five modules for RTE systems seem passé, oversimplifying and even fragmenting the enterprise into siloed applications.

One glimmer of insight was contained in Table 2-1, which expanded on the factors for reducing complexity and uncertainty.

In summary, the case sketches are worth the price of the book, but lower your expectations for receiving insights into the mechanisms of real-time to deliver business value.

Posted August 27, 2007 9:50 AM

The opening keynote was delivered by Philip Rosedale, CEO and president of Linden Labs, the creator of Second Life (SL). They affectionately referred to him as El Presidente, reinforcing my impression of him as a visionary who is friendly and persuasive.

To his credit, he started humbly by admitting the poor reliability of Second Life. He wore a white t-shirt with big black block lettering saying "Missing Image", which occurs when SL has insufficient computing resources to properly construct the image of your avatar. This sent a strong message to his audience that he is fully aware of their concern for reliability. SL has just over 90% uptime, including planned update outages. He quipped, "That's one nine, and it's better to have one nine than not any nines at all." I take this as a positive statement and hope that at next year's conference he can claim two or three nines.
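Rosedale's "one nine" quip has a concrete meaning. As a minimal sketch (assuming a 365-day year), here is how much annual downtime each level of availability permits:

```python
# Annual downtime allowed for a given number of "nines" of availability.
# One nine = 90% uptime, two nines = 99%, three nines = 99.9%, and so on.
def annual_downtime_hours(nines: int) -> float:
    downtime_fraction = 10 ** (-nines)      # one nine -> 10% downtime
    return downtime_fraction * 365 * 24     # hours in a (non-leap) year

for n in (1, 2, 3):
    print(f"{n} nine(s): {annual_downtime_hours(n):.1f} hours down per year")
```

One nine allows roughly 876 hours (over five weeks) of downtime per year; two nines cut that to about 88 hours, and three nines to under 9.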

Because SL is so complex, it requires constant innovation in grid computing. The next enhancement will allow different versions of the server to operate together. Eventually this innovation will avoid shutting down the entire grid for a version change several times per month.

Linden Labs received only $20 million in venture capital to reach their current level of sustainable revenue. Rosedale predicted that, if the company had followed traditional development strategies, they would never have been able to build SL to its level today. However, they have been driving SL toward better reliability. Over the last month, they introduced SL Voice, a major new capability of voice-to-voice chat sessions like Skype. At peak usage, as many as 13,000 people have talked at the same time. All this was accomplished without disruptions in SL operations, as Rosedale proudly noted.

Over the past year, international participation in SL has increased tremendously, to the point where US residents are only 25% of the total. SL is becoming a major influence on flattening the world and sharing cultures from one person to another.

Rosedale ended by predicting that SL will be bigger than the Web once the technological problems are eventually solved. He remarked, "We do not appreciate how big this thing [SL] will get."

He is obviously a visionary, some of whom only blow hot air and some of whom change the world. So far, Linden Labs has accomplished a lot. But the task ahead is a hundred or a thousand times larger. Follow this one closely!

Posted August 25, 2007 11:29 AM

A year ago, I attended the Second Life Community Convention (SLCC) on a whim. It coincided with a business trip and seemed like a fun thing to do. I came away with a collage of fragmented thoughts. The event was a blend of a Star Trek convention and a school reunion. There were hugs everywhere among folks with alternative styles of dress and speech.

This year I returned to SLCC with a focus. Could this virtual world technology have an impact upon our profession in Business Intelligence (BI)? This seems like a ridiculous question, but I believe that it is not. I had the same feeling 15 years ago when the Web was beginning to have an impact on businesses. Remember that the Internet and the early World Wide Web were initially dominated by universities and research institutes. Posting a commercial advertisement was against community rules. Sending an unsolicited email was the depth of rudeness. Oh, how times have changed!

I wrote an article on Serious Games in Virtual Worlds where I argued that BI will evolve into four levels of serious games. The highest level has a close coupling of the real world of an enterprise with an abstract version in the virtual world. By analyzing, experimenting and planning in the virtual world, appropriate actions could be implemented in the real world toward a goal, such as servicing a customer. IBM has announced that a 24x7 customer center is operational in Second Life, where a real qualified employee will answer questions and discuss problems one-on-one with a customer.

Will virtual world technology, like Second Life, enhance the current two-dimensional Web with increased functionality? Or will this technology open new opportunities, currently undreamt of, for conducting the business of the future?

Posted August 24, 2007 11:00 AM

As I was preparing for a briefing with Kim Stanick, VP of Marketing, I kept wondering what was different about ParAccel. Started in 2005, this new start-up makes some bold claims about speed, scalability and simplicity. I am not against being bold. But where is the meat?

Their DBMS engine uses compressed in-memory columnar processing with shared-nothing massively parallel processing (MPP). Column-oriented storage goes back several decades. Compression is not new. In-memory databases also go back several decades. And MPP is becoming common. So, what is new and different? It might lie in their strategy.
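As a minimal illustration of why these old ideas combine so well: values within a single column are homogeneous and often repetitive, so even a trivial run-length encoding compresses them dramatically. This sketch is my own, not ParAccel's actual compression scheme:

```python
from itertools import groupby

# Run-length encode one column: store (value, run_length) pairs instead of
# repeating each value. Columns that are sorted or clustered have long runs
# of identical values, so the encoded form is much smaller.
def rle(column):
    return [(value, sum(1 for _ in run)) for value, run in groupby(column)]

status_column = ["open"] * 4 + ["closed"] * 3 + ["open"] * 2
print(rle(status_column))  # [('open', 4), ('closed', 3), ('open', 2)]
```

Nine stored values collapse to three pairs; in a row-oriented layout the same values would be interleaved with other columns and runs like this would rarely form.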

ParAccel is targeting medium-sized companies (and above) that are maturing their BI systems and experiencing pain with performance and scalability. The sweet spot is between a half terabyte and ten terabytes of data. Avoiding a rip-and-replace strategy, their approach is to augment the existing system through an Amigo arrangement: based on analytical complexity, incoming queries are routed either to the database of record or to the ParAccel database. ParAccel is also offering their DBMS as an appliance through reseller arrangements with major hardware vendors.
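The routing idea can be pictured as a simple rule: classify each incoming query by some crude complexity measure and dispatch it to one engine or the other. The following is a hypothetical illustration only; the predicate, thresholds, and names are my own, not ParAccel's implementation:

```python
# Hypothetical complexity-based query router: analytically complex SQL goes
# to the columnar accelerator, everything else to the database of record.
ANALYTIC_HINTS = ("GROUP BY", "ROLLUP", "CUBE", "OVER (")

def route(sql: str) -> str:
    text = " ".join(sql.upper().split())    # normalize case and whitespace
    join_count = text.count(" JOIN ")
    is_analytic = any(hint in text for hint in ANALYTIC_HINTS)
    # Crude heuristic: many joins or analytic operators imply complexity
    if join_count >= 3 or is_analytic:
        return "accelerator"                # e.g., the columnar MPP engine
    return "database_of_record"

print(route("SELECT name FROM customer WHERE id = 42"))             # database_of_record
print(route("SELECT region, SUM(amount) FROM sales GROUP BY region"))  # accelerator
```

A real router would presumably use optimizer cost estimates rather than string matching, but the division of labor is the same: short operational lookups stay on the system of record, heavy aggregations go to the accelerator.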

ParAccel is a company to watch. I am still searching for an answer to my question, which should emerge over the coming months as they roll out their product and secure satisfied customers.

Posted August 23, 2007 7:31 PM