Blog: Wayne Eckerson

Welcome to Wayne's World, my blog that illuminates the latest thinking about how to deliver insights from business data and celebrates out-of-the-box thinkers and doers in the business intelligence (BI), performance management and data warehousing (DW) fields. Tune in here if you want to keep abreast of the latest trends, techniques, and technologies in this dynamic industry.

About the author

Wayne has been a thought leader in the business intelligence field since the early 1990s. He has conducted numerous research studies and is a noted speaker, blogger, and consultant. He is the author of two widely read books: Performance Dashboards: Measuring, Monitoring, and Managing Your Business (2005, 2010) and The Secrets of Analytical Leaders: Insights from Information Insiders (2012).

Wayne is founder and principal consultant at Eckerson Group, a research and consulting company focused on business intelligence, analytics, and big data.

A balanced scorecard is a powerful tool for aligning an organization. It displays the metrics that represent the key drivers of long-term performance. In many ways, it's a visual representation of an organization's strategy, tailored to every department and individual.

Unfortunately, most organizations are operational in nature, not strategic. They focus on the day-to-day tasks required to ship products on time and keep customers happy. While most organizations want to take a long-term view of the business, they are too busy fighting fires to focus on the big picture. And their corporate culture and funding processes undermine scorecard initiatives before the first metrics are even published.

To ensure the success of a balanced scorecard, organizations need to excel at managing change, or rather, at getting an organization (and the individuals that comprise it) to change its habits for addressing and solving problems. Rather than addressing the symptoms of issues, a scorecard requires organizations to identify the core drivers of change that lead to new levels of performance.

Whether you are creating a new scorecard or reviving an existing one, it's imperative that you build change management into your scorecard project. Otherwise, the scorecard won't gain traction or overcome operational inertia. There are three keys to ensuring your balanced scorecard gets adopted and delivers lasting business value: 1) a committed CEO, 2) a robust governance program, and 3) a key performance indicator (KPI).

1. A Committed CEO

Since scorecards embody the strategy of an organization (or business unit or department), the primary sponsor is the top executive, or CEO for an entire organization. The CEO must desire organizational alignment and view the scorecard as a critical tool for achieving that goal. The CEO must be fully committed to the project and the scorecard methodology. And that commitment can't waver over time since it often takes months or years for the scorecard initiative to bear fruit and deliver performance improvements.

One indicator of an executive's commitment is his or her willingness to devote time to the project. Although most CEOs won't participate on a scorecard design team, they need to provide ample input upfront and feedback every step of the way. The CEO needs to ensure that the design team creates objectives and measures that align with his or her vision of the company. The CEO must also sell other senior executives on the need for the project and the validity of the methodology, convince them to spend time participating in the project, and, most importantly, get them to assign trusted lieutenants to serve on the scorecard design team.

Finally, the CEO must be willing to spend money on initiatives and resources to effect organizational change. Many CEOs proudly display newly minted strategy maps but never fully fund the activities required to change organizational behavior. Once the scorecard fails to register performance gains, they often lose faith in the measurement system, believing it doesn't accurately represent the uniqueness of their business and processes. In contrast, successful CEOs never waver in their commitment to fact-based measurement and decision-making. They continually modify scorecard initiatives and metrics to maintain relevancy as the business changes or until the scorecard reflects the organizational performance they desire.

2. Scorecard Governance

Secondly, a scorecard needs to gain organizational altitude before it can fly on its own. In operationally focused organizations (and which aren't?), this requires several new governance structures and processes:


  1. Stakeholders. A scorecard project has to be a group effort since it will ultimately affect many people. The project must have a charter that explains its purpose, its benefits, and whom it will impact and how. It needs an executive steering committee that evangelizes and funds the project and runs political interference. It needs a working committee of people who define the objectives and metrics at the heart and soul of the scorecard. And most importantly, it needs to identify stakeholders, including the executive committee, whose support and input are required. It's important to get honest feedback from stakeholders about proposed objectives, metrics, and initiatives. Stakeholders, especially those on the front lines whose performance will be measured by the new metrics, are the only people who really understand the feasibility of proposed metrics.

  2. Strategy Management Office. Once the scorecard is designed, a strategy management office (SMO) drives the process of embedding it into the fabric of the company. An SMO consists of one or more full- or part-time people who make sure the scorecard is populated with data, used to make executive decisions, and updated to reflect changes in the business. The SMO also helps shepherd the creation of additional, cascaded scorecards, so strategy management propagates through the entire organization. Most importantly, the SMO evangelizes the scorecard methodology and facilitates the other governance tasks below.

  3. Theme Teams. In balanced scorecard parlance, "themes" are the primary strategic objectives of the organization. Typically, there are three to five themes that represent an organization's three- to five-year strategy. A theme team is a cross-functional group of five to eight people who are experts in the theme or vested in the topic. The theme team evaluates relevant scorecard data, interprets the results for the executive team, and makes recommendations for adding, changing, or deleting metrics and objectives from the scorecard. Theme teams ensure that subject matter experts, not just executives, are vested in the scorecard and committed to making sure it delivers relevant results.

  4. Cascaded Scorecards. An executive scorecard is just the beginning of a scorecard initiative. If leaders know and monitor business strategy but no one else does, then the scorecard can't help overcome organizational inertia. The SMO, with backing from the CEO, works with each member of the executive team to propagate scorecards in their functional areas. Each department designs a scorecard that represents its own strategic objectives and drives the performance of its parent organization. By cascading scorecards, organizations propagate strategy to every nook and cranny of the organization so every person knows how they contribute to the performance of the whole. (A brief illustration of this cascading structure appears after this list.)

  5. Strategic Expenditures. Scorecards need their own funding, not only to support the SMO and strategic planning exercises, but also to support initiatives that drive key areas of performance measured by the scorecard. Most companies have a bevy of initiatives already and most can be mapped to scorecard objectives and metrics. But inevitably, the organization must undertake new initiatives to drive required change. While these initiatives must pass formal review to receive funding, there are often smaller initiatives or strategic exercises that are best funded from a discretionary scorecard budget.
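To make the cascading idea concrete, here is a minimal, purely illustrative Python sketch of how a departmental objective might link back to the corporate objective it supports. The class and the sample objectives and metrics are hypothetical; they don't come from any scorecard product or methodology guide.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Objective:
        """One objective on a scorecard, with the metrics that track it."""
        name: str
        metrics: List[str]
        supports: Optional["Objective"] = None  # parent objective it cascades from

    # Hypothetical corporate objective on the executive scorecard.
    corp = Objective("Grow customer loyalty", ["net promoter score"])

    # A departmental scorecard cascades from it: the support desk's local
    # objective and metrics roll up to the corporate objective above.
    dept = Objective(
        "Resolve issues on first contact",
        ["first-contact resolution rate", "average handle time"],
        supports=corp,
    )

    # Walking the chain shows how a frontline metric ties back to strategy.
    obj = dept
    while obj is not None:
        print(f"{obj.name}: {', '.join(obj.metrics)}")
        obj = obj.supports

The point of the structure is simply that every local metric can be traced, link by link, to a corporate objective, which is what lets frontline employees see how they contribute to the whole.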

3. Key Performance Indicator

Even with a committed CEO and strong governance structures, a scorecard won't take root unless it achieves a quick win. A quick win bridges the disparate worlds of operational and strategy management and testifies to the power of a scorecard to effect needed change.

The best way to achieve a quick win is to identify one scorecard metric above all others to serve as the focal point for the organization. The metric should be operational in nature and touch multiple functional areas and processes. Improving the performance of this metric forces employees to share information, collaborate across departmental lines, and brainstorm new processes and ways of doing business. In short, a well-designed KPI creates a ripple effect across an organization, generating widespread performance gains.

The CEO is the fuel for a KPI. The CEO must publicly evangelize the importance of the KPI, monitor its performance religiously, and call accountable executives and managers immediately when performance dips below specified levels. Since no one wants to receive potentially career-limiting calls from the CEO, a KPI forces a whirlwind of change.

John King closely monitored on-time arrivals and departures when he turned around British Airways in the early 1980s. Paul O'Neill turned Alcoa from an industry laggard into a high-flier by focusing on worker safety measures. Cisco CEO John Chambers has created a unified corporate culture from 125+ acquisitions by focusing on customer satisfaction metrics.

It should be noted that a KPI doesn't give a CEO or organization license to ignore the other metrics in a scorecard. Rather, a KPI gives credibility to the remaining metrics on the scorecard and teaches the organization how to think and act strategically. Indeed, once an organization optimizes the performance of one KPI, it should elevate the next most important metric on the scorecard to KPI status. Meanwhile, the former KPI takes its rightful place in the scorecard or is revised to highlight new areas for improvement.

Summary. A balanced scorecard is a powerful agent of organizational change. But since organizations (and the individuals that comprise them) resist change at all costs, it's imperative that a scorecard project incorporate a vigorous change management strategy of its own. The key elements of a scorecard change management strategy are a committed executive, a robust governance program, and a key performance indicator wielded by a change-hungry CEO.


Wayne Eckerson is principal consultant of Eckerson Group, a business-technology consultancy that helps organizations turn data into insight and action. He regularly helps organizations design or revitalize dashboard and scorecard projects.


Posted May 13, 2014 12:24 PM

Hadoop advocates know they've struck gold. They've got new technology that promises to transform the way organizations capture, access, analyze, and act on information. Market watchers estimate the potential revenue from big data software and systems to be in the tens of billions of dollars. So, it's not surprising that Hadoop advocates are eager to discard the old to make way for the new.

But in their haste, some Hadoop advocates have peddled a lot of misinformation about so-called "traditional" systems, especially the data warehouse. They seem to think that by bashing the data warehouse, they'll accelerate the pace at which people adopt Hadoop and the "data lake". (See "Big Data Part I: Beware of the Alligators in the Data Lake".) This is a counterproductive strategy for a couple of reasons.

Evolution, Not Revolution. First, the data warehouse will be an integral part of the analytical ecosystem for many years to come. It will take many years (decades?) for a majority of companies to convert their data and analytics architecture to a data lake powered by Hadoop, if they do at all. Organizations simply have too much time, money, resources, and skills tied up with existing systems and applications to throw them away and start anew. The mantra of big data is evolution, not revolution. (To learn about these countervailing strategies, see "The Battle for the Future of Hadoop.")

Slippery Slope. Second, Hadoop is at the beginning of its journey, and while things look bright and rosy now, this new architecture will inevitably encounter dark times and failures, just like all new technologies. Thus, it's unwise for Hadoop advocates to take potshots at a mature technology, like the data warehouse, which has been refined in the crucible of thousands of real-world implementations. Just because there are data warehousing failures doesn't mean the technology is bankrupt or that a majority of organizations are eager to cast their data processing destiny to a new, untested platform whose deficiencies have yet to emerge.

Too Much to Bear. Many data warehousing deficiencies stem from the fact that the data warehouse has been asked to shoulder a bigger load than it was designed to handle. A data warehouse is best used to deliver answers to known questions: it allows users to monitor performance along predefined metrics and drill down and across related dimensions to gain additional context about a situation. It isn't optimized to support unfettered exploration and discovery or to store and provide access to non-relational data.

But, since the data warehouse has been the only analytical game in town for the past 20 years, organizations have tried to shoehorn into it many workloads that it's not suited to handle. These failures aren't a blemish against the data warehouse as much as evidence of a lack of imagination about how best to solve various types of data processing problems. Fortunately, we now have other ways to capture, store, access, and analyze data. So, we can finally offload some of these workloads from our overburdened data warehouses and give them space to do what they do best--populate reports and dashboards with clean, integrated, and certified data.

A Process, Not a Technology. A final reason that Hadoop proponents shouldn't disparage the data warehouse is because the data warehouse is ultimately a process, not a technology. A data warehouse reunites an organization in electronic form (i.e. data) so that it can function as a single entity, not a conglomeration of loosely coupled fiefdoms. In this sense, the data warehouse will never go away.

The truth is that companies can implement a data warehouse with a variety of technologies and tools, including a data lake. Some are better than others, and none is sufficient in and of itself. But that is not the point: a data warehouse is really an abstraction, a logical representation of clean, vetted data that executives can use to make decisions. Without a data warehouse, executives run blind, making critical decisions with inaccurate data or no data at all.

So, despite what some critics say, the data warehouse is here to stay. It will remain a prominent fixture in analytical environments for many years to come.


Posted April 10, 2014 2:41 PM

Say you have a ton of data in Hadoop and you want to explore it. But you don't want to move it into another system. (After all, it's big data, so why move it?) And you don't want to go through the hassle and expense of creating table schemas in Hadoop to support fast queries. (After all, this is not supposed to be a data warehouse.) So what do you do?

You Hunk it. That is, you search it using Splunk software that creates virtual indexes in Hadoop. With Hunk, you don't have to move the data out of Hadoop and into an outboard analytical engine (including Splunk Enterprise). And you don't need to create table schemas in advance or at run time to guide (and limit) queries along predefined pathways. With Hunk, you point and go. It's search for Hadoop, but more scalable and manageable than open source search engines, such as Solr, according to Splunk officials.

Hunk generates MapReduce under the covers, so it's not an interactive query system. However, it does stream results immediately once the job starts, so an analyst can see whether his search criteria generate the desired results. If not, he can stop the search, change the criteria, and start again. So, it's as interactive as batch can get.

Also, since Hunk is a Hadoop search engine, you cannot do some basic things you can do with SQL, such as joining tables, easily summing columns, or storing data in a more compressed format. But it does let you search and explore data without specifying a schema or performing other advanced setup.

And unlike Splunk Enterprise, which runs only against log and sensor data, Splunk Hunk (gotta love that product name) can run against any data because it processes data using MapReduce. For instance, Hunk can search for videos with lots of red in them by invoking a MapReduce function that identifies color patterns in videos. You can also run queries that span indexes created in Splunk Enterprise and Hunk, making Hunk a federated query tool. And like Splunk Enterprise, Hunk supports 100+ analytical functions, making it more than just a Hadoop search tool.
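For readers who want a feel for what this looks like in code, below is a minimal sketch using Splunk's Python SDK (splunklib) to run a search and poll its streaming preview results. The host, credentials, virtual index name (hadoop_vix), and search string are placeholders I've assumed for illustration; check Splunk's documentation for the specifics of your environment.

    import time
    import splunklib.client as client
    import splunklib.results as results

    # Connect to the Hunk/Splunk search head (host and credentials are placeholders).
    service = client.connect(host="localhost", port=8089,
                             username="admin", password="changeme")

    # Start a search over a Hadoop-backed virtual index; "hadoop_vix" is a
    # hypothetical index name, and no table schema is declared up front.
    job = service.jobs.create("search index=hadoop_vix | stats count by status")

    # Hunk streams results while the underlying MapReduce job runs, so we can
    # peek at partial output and cancel early if the criteria look wrong.
    while not job.is_done():
        for event in results.ResultsReader(job.preview(count=10)):
            print(event)  # early, partial results
        time.sleep(2)
        job.refresh()

    # Final results once the job completes.
    for event in results.ResultsReader(job.results()):
        print(event)

If the preview shows the wrong records, calling job.cancel() and revising the search string mirrors the stop-change-restart workflow described above.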

So, if you're in the market for a bona fide exploration tool for Hadoop, try Hunk.

For more information, see www.splunk.com.


Posted March 17, 2014 7:22 PM

As silver bullets go, the "data lake" is a good one. Pitched by big data advocates, the data lake promises to speed the delivery of information and insights to the business community without the hassles imposed by IT-centric data warehousing processes. It almost seems too good to be true.

With a data lake, you simply dump all your data, both structured and unstructured, into the lake (i.e. Hadoop) and then let business people "distill" their own parochial views within it using whatever technology is best suited to the task (i.e. SQL or NoSQL, disk-based or in-memory databases, MPP or SMP). And you create enterprise views by compiling and aggregating data from multiple local views. The mantra of the data lake is think global, act local. Not bad!

Data Lake Benefits. Assuming this approach works, there are many benefits. First, the data lake gives business users immediate access to all data. They don't have to wait for the data warehousing (DW) team to model the data or give them access. Rather, they shape the data however they want to meet local requirements. The data lake speeds delivery and offers unparalleled flexibility since nothing and no one stands between business users and the data.

Second, data in the lake is not limited to relational or transactional data--the traditional fare served by data warehouses. The data lake can contain any type of data: clickstream, machine-generated, social media, and external data, and even audio, video, and text. It's a proverbial cornucopia of data delights for the data digerati.

Third, with a data lake, you never need to move the data. And that's important in the era of big data. The data streams into the lake and stays there. You process it in place using whatever technology you want and serve it up however users want. But the data never leaves the lake. It's one big body of water with many different fishing spots, one for every type of sportsman.

So, there is a lot to like about the data lake: it empowers business users, liberating them from the bonds of IT domination; it speeds delivery, enabling business units to stand up applications quickly; and it ushers in new types of data and technology that lower the costs of data processing while improving performance. So what's the problem?

Alligators in the Swamp

Uncharted territory. Although big data advocates are quite adept at promoting their stuff (and even better at bashing the data warehouse), they never tell you about the alligators in the swamp. Since very few companies have actually implemented a data lake, perhaps no one has seen the creatures yet. But they are there. In fact, the first razor-toothed reptile that should cause your adrenalin to surge is the fact that the data lake is uncharted water. This stuff is so new that only real risk-takers are willing to swim in the swamp.

Expensive. The risk, however, presents a great sales opportunity. Product and services vendors are more than willing to help you reap the benefits of the data lake, while minimizing the risk. Don't have Hadoop, MPP, in-memory engines, or SQL-on-Hadoop tools or any experience managing them? No problem, we can sell and implement those technologies for you. Don't know how to distill local and enterprise views from the lake? No worries, our consultants can help you architect, design, and even manage the lake for you. All you need to take a swim is money, and lots of it! That's the second danger: the threat to your budget.

Data governance. The biggest peril, however, is the subtle message that it's easy to create any view you want in the data lake. Proponents make it seem like the data lake's water has magical properties that automatically build local and enterprise views. But when you dive into the details, you discover that the data lake depends on a comprehensive master data management (MDM) program. Before you can build views, you need to define and manage core metrics and dimensions, ideally in a consistent way across the enterprise. You then link together virtual tables using these entities to create the local or enterprise views you want.
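To make that concrete, here is a minimal PySpark sketch of the kind of view-building just described: two business units' local datasets in the lake are reconciled through an MDM-managed master customer dimension to produce an enterprise view. The paths, table names, and columns are hypothetical, assumed purely for illustration.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("enterprise_view").getOrCreate()

    # Local views built by two business units (hypothetical paths and schemas).
    orders = spark.read.parquet("/lake/sales/orders")      # order_id, cust_key, amount
    tickets = spark.read.parquet("/lake/support/tickets")  # ticket_id, cust_key, status

    # The MDM-managed master dimension supplies the shared, agreed-upon
    # definition of "customer" that both local views link through.
    customers = spark.read.parquet("/lake/mdm/customer_master")  # cust_key, customer_name

    # Enterprise view: revenue and open tickets per master customer.
    enterprise_view = (
        customers
        .join(orders.groupBy("cust_key").sum("amount"), "cust_key", "left")
        .join(tickets.filter(tickets.status == "open")
                     .groupBy("cust_key").count(), "cust_key", "left")
    )
    enterprise_view.createOrReplaceTempView("enterprise_customer_view")

Without the agreed-upon cust_key supplied by the MDM program, the two local datasets have nothing reliable to join on, which is exactly the hard part described next.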

The problem with this approach is that MDM is hard. It's hard for the same reason data warehousing is hard. Defining core entities is a business task that is fraught with politics. No one agrees how to define basic terms like "customer" or "sale". Therefore, the temptation is to simply build local solutions with local definitions. This meets the immediate need but does nothing to create a common view of the enterprise that executives need to run the business. An organization with lots of local views but no corporate view is like a chicken with its head cut off: it dashes madly here and there until it suddenly drops dead.

Of course, if your organization has invested in MDM, then building enterprise views in a data lake is easy. But the same is true of a data warehouse. When an MDM solution assumes the burden of reconciling business entities, then building a data warehouse is a swim in the lake, so to speak.

Courting Chaos

Let's be honest: the data lake is geared to power users who want and need immediate access to all data as well as business units that want to build their own data-driven solutions quickly without corporate IT involvement. These are real needs and the data lake offers a wonderful way to address them.

But please don't believe that a data lake is going to easily give you enterprise views of your organization populated with clean, consistent, integrated data. Unless you have a full-fledged MDM environment and astute data architects, the data lake isn't going to digitally unify your organization. And that will disappoint the CEO and CFO. Making the data lake work for everyone requires a comprehensive data governance program, something that few organizations have implemented and even fewer have deployed successfully.

Ultimately, the data lake is a response to and antidote for the repressive data culture that exists in many companies. We've given too much power to the control freaks (i.e. IT architects) who feel the need to normalize, model, and secure every piece of data that comes into the organization. Even data warehousing professionals feel this way; that's why they have developed and evangelized more agile, flexible approaches to delivering information to the masses.

Frankly, the data lake courts chaos. And that's fine. We need a measure of data chaos to keep the data nazis in check. The real problem with the data lake is that there are no warning signs to caution unsuspecting business people about the dangers lurking in its waters.

In subsequent posts, I'll debunk the criticisms of the data warehouse by the data lakers and present a new reference architecture (actually an ecosystem) that shows how to blend the data lake and more traditional approaches into a happy, harmonious whole.


Posted March 12, 2014 4:49 PM

The cloud eliminates the need to buy, install, and manage hardware and software, significantly reducing the cost of implementing BI solutions while speeding delivery times.

One new company hoping to cash in on the movement to run BI in the cloud is RedRock BI, which offers a complete BI stack in the cloud starting at $2,500 a month for up to 2TB of data. The service runs on Amazon EC2, leverages Amazon Redshift, and comes with a single-premise cloud upload utility, 120 hours of Syncsort's ETL service, a five-user license to the Yellowfin BI tools, and five hours of RedRock BI support.

This makes RedRock BI an order of magnitude cheaper than any other full-stack BI solution on the market, according to Doug Slemmer, who runs RedRock BI. And customers can expand their implementations inexpensively, he says. An additional 2TB of data costs $650 a month, 120 hours of Syncsort ETL costs $750 a month, and additional Yellowfin users go for $70 a month each.
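Using the prices quoted above, a quick back-of-the-envelope calculation shows how a bill might scale; the configuration below (6TB, 240 ETL hours, ten users) is a hypothetical example, not a RedRock BI quote.

    import math

    # Prices quoted above (per month).
    BASE = 2500       # includes up to 2TB, 120 ETL hours, 5 Yellowfin users
    EXTRA_2TB = 650   # each additional 2TB of data
    EXTRA_ETL = 750   # each additional 120 hours of Syncsort ETL
    EXTRA_USER = 70   # each additional Yellowfin user

    def monthly_cost(tb=2, etl_hours=120, users=5):
        """Estimate the monthly bill for a hypothetical configuration."""
        cost = BASE
        cost += max(0, math.ceil((tb - 2) / 2)) * EXTRA_2TB
        cost += max(0, math.ceil((etl_hours - 120) / 120)) * EXTRA_ETL
        cost += max(0, users - 5) * EXTRA_USER
        return cost

    print(monthly_cost(tb=6, etl_hours=240, users=10))
    # 2500 + 2*650 + 750 + 5*70 = 4900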

Slemmer doesn't expect RedRock BI's current pricing advantage to continue indefinitely. He expects other firms will soon combine off-the-shelf BI services and tools to create affordable cloud-based BI packages for the mid-market and departments at larger companies. As a result, Slemmer said he hopes to capitalize on RedRock BI's first-mover advantage by aggressively promoting its services.

RedRock BI released its cloud service for general availability on February 28. It has one paying customer, Dickey's Barbecue Pit, and five more prospects are conducting proofs of concept. For more information, go to www.redrockbi.com.


Posted March 4, 2014 5:21 PM

