Blog: Wayne Eckerson

Wayne Eckerson

Welcome to Wayne's World, my blog that illuminates the latest thinking about how to deliver insights from business data and celebrates out-of-the-box thinkers and doers in the business intelligence (BI), performance management and data warehousing (DW) fields. Tune in here if you want to keep abreast of the latest trends, techniques, and technologies in this dynamic industry.

About the author

Wayne has been a thought leader in the business intelligence field since the early 1990s. He has conducted numerous research studies and is a noted speaker, blogger, and consultant. He is the author of two widely read books: Performance Dashboards: Measuring, Monitoring, and Managing Your Business (2005, 2010) and The Secrets of Analytical Leaders: Insights from Information Insiders (2012).

Wayne is currently director of BI Leadership Research, an education and research service run by TechTarget that provides objective, vendor neutral content to business intelligence (BI) professionals worldwide. Wayne’s consulting company, BI Leader Consulting, provides strategic planning, architectural reviews, internal workshops, and long-term mentoring to both user and vendor organizations. For many years, Wayne served as director of education and research at The Data Warehousing Institute (TDWI) where he oversaw the company’s content and training programs and chaired its BI Executive Summit. He can be reached by email at weckerson@techtarget.com.

May 2011 Archives


I just finished writing the first draft of my upcoming report titled, "Creating an Enterprise Data Strategy: Managing Data as a Corporate Asset." This is a broad topic these days, even broader than just business intelligence (BI) and data warehousing. It's really about how organizations can better manage an enterprise asset--data--that most business people don't value until it's too late.

After spending more than a week reviewing notes from many interviews and trying to formulate a concise, coherent, and pragmatic analysis without creating a book, I can distill my findings into a set of bullet points. And since I am still collecting feedback from sponsors and others, I welcome your input as well!

  • Learn the Hard Way. Most business executives don't perceive data as a vital corporate asset until they've been badly burned by poor-quality data. Perhaps their well-publicized merger didn't deliver the promised synergies due to a larger-than-anticipated overlap in customers, or customer churn is rising but they have no idea who is churning or why.
  • The Value of Data. Certainly, there are cost savings from consolidating legacy reporting systems, independent data marts, and spreadmarts. But the only way to really calculate the value of data is to understand the risks that poor-quality data poses to strategic projects, goals, partnerships, and decisions. Because risk is virtually invisible until something bad happens, selling a data strategy is hard to do.
  • Project Alignment. Even with a catastrophic data-induced failure, the only way to cultivate data fastidiousness is one project at a time. Data governance for data governance's sake does not work. Business people must have tangible, self-evident reasons to spend time on infrastructure and service issues rather than immediate business outcomes on which they're being measured.
  • Business Driven. This goes without saying: data strategy and governance are not IT projects or programs. Any attempt by executives to put IT in charge of this asset is doomed to fail. The business must assign top executives, subject matter experts, and business stewards to define the rules, policies, and procedures required to maintain the accuracy, completeness, and timeliness of critical data elements.
  • Sustainable Processes. The ultimate objective for managing any shared service is to embed its care and tending in business processes that are part of the corporate culture. At this point, managing data becomes everyone's business and no one questions why it's done. If you try to change the process, people will say, "This is the way we've always done it." That is a sustainable process.
  • Data Defaults. In the absence of strong data governance, data always defaults to the lowest common denominator, which is, first and foremost, an analyst armed with a spreadsheet and, secondly, a department head with his own IT staff and data management systems. This is rather like the law of entropy: it takes a lot of energy to maintain order and symmetry but very little for it to devolve into randomness.
  • Reconciling Extremes. The key to managing data (or any shared services or strategy) is to balance extremes by maintaining a free interplay between polar opposites. A company in which data is a free-for-all needs to impose standard processes to bring order to chaos. On the other hand, a company with a huge backlog of data projects needs to license certain people and groups to bend or break the rules for the benefit of the business.
  • A Touch of Chaos. Instead of trying to beat back data chaos, BI managers should embrace it. Spreadmarts are instantiations of business requirements, so use them (and the people who create them) to flesh out the enterprise BI and DW environment. "I don't think it's healthy to think that your central BI solution can do it all. The ratio I'm going for is 80% corporate, 20% niche," says Mike Masciandaro, BI Director at Dow, talking about the newest incarnation of spreadmarts: in-memory visualization tools.
  • Safety Valves. Another approach to managing chaos is to co-opt it. If users threaten to create independent data marts while they wait for the EDW to meet their needs, create a SWAT team to build a temporary application that meets their needs. Even if they complain about the quick-and-dirty solution (and you don't want to make it too appealing), they know a better solution is in the offing.
  • Data Tools. There has been far more innovation in technology than in processes. So, today, organizations should strive to arm their data management teams with the proper tool for every task. And with data volumes growing and data types multiplying, IT professionals need every tool they can get.

So what did I miss? If you send me some tantalizing insights, I just might have to quote you in the report!


Posted May 19, 2011 9:25 AM


I've been reading the book, "The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos," by Brian Greene, a professor of physics and mathematics at Columbia University and well-known superstring theorist. I was startled in chapter 9 to read that the next dominant theme in physics is "information." In fact, he posits that the reality we experience in three dimensions is actually a hologram driven by information on the "boundary surface" of space.

What he's talking about (I think) is something BI professionals know well. We call it metadata: the information that describes the data, or facts of reality, that we capture in our computer systems. Metadata drive applications (at least in the best-designed systems), which execute tasks such as capturing orders, issuing queries, and controlling inventory. In essence, metadata are the brains of applications that make action possible.
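To make the idea concrete, here is a minimal sketch of metadata-driven behavior in Python: a schema dictionary (the metadata) describes what an order record should look like, and a generic validator reads that schema to check the data. The field names and rules are hypothetical, purely for illustration.

```python
# The metadata: a schema describing order records.
# Change the schema, and the application's behavior changes with it.
order_schema = {
    "order_id": {"type": int, "required": True},
    "customer": {"type": str, "required": True},
    "quantity": {"type": int, "required": True},
    "notes":    {"type": str, "required": False},
}

def validate(record, schema):
    """Return a list of problems found; an empty list means the record is valid."""
    problems = []
    for field, rules in schema.items():
        if field not in record:
            if rules["required"]:
                problems.append(f"missing required field: {field}")
            continue
        if not isinstance(record[field], rules["type"]):
            problems.append(f"wrong type for {field}")
    return problems

good = {"order_id": 1, "customer": "Acme", "quantity": 5}
bad  = {"customer": "Acme", "quantity": "five"}

print(validate(good, order_schema))  # []
print(validate(bad, order_schema))
```

The point of the sketch is that the validator knows nothing about orders specifically; everything it enforces comes from the metadata, which is exactly the "brains of the application" role described above.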

This seems to be what Greene is getting at. But I'll let him explain:

"During a lunch we had at Princeton in 1998, I asked [John Wheeler, one of the twentieth -century physics' most celebrated thinkers] what he thought the dominant theme in physics would be in the decades going forward. As he had already done frequently that day, he put his head down, as if his aging frame had grown weary of supporting such a massive intellect. But now the length of his silence left me wondering, briefly, whether he didn't want to answer or whether, perhaps, he had forgotten the question. He then slowly looked up and said a single word: 'information'...."

"Traditionally, physics focuses on things--planets, rocks, atoms, particles, fields--and investigates the forces that affect their behavior and govern their interactions. Wheeler was suggesting that things--matter and radiation--should be viewed as secondary, as carriers of a more abstract and fundamental entity: information. It's not that matter and radiation were somehow illusory; rather, he argued that they should be viewed as the material manifestations of something more basic. He believed that information--where a particle is, whether it is spinning one way or another, whether its charge is positive or negative, and so on--forms an irreducible kernel at the heart of reality. That such information is instantiated in real particles, occupying real positions, having definite spins and charges, is something like an architect's drawings being realized as a skyscraper. The fundamental information is in the blueprints. The skyscraper is but a physical realization of the information contained in the architect's design."

"From this perspective, the universe can be thought of as an information processor. It takes information regarding how things are now and produces information delineating how things will be at the next now, and the now after that. Our senses become aware of such processing by detecting how the physical environment changes over time. But the physical environment itself is emergent; it arises from the fundamental ingredient, information, and evolves according to the fundamental rules, the laws of physics."

This is astonishing. Greene is basically saying that the fundamental building block of the universe is not some particle, but information. And that information (as he explains later) exists outside our "reality"--on the boundary surface of our universe, as informed by research on black holes done by Stephen Hawking.

If this is true, then perhaps BI professionals have an intuitive feel for the inner workings of the universe, since we model reality in metadata to run applications and guide behavior. According to Greene, the Universe does likewise: it uses metadata to guide Earthly reality. Interesting!


Posted May 4, 2011 6:43 PM