Blog: Wayne Eckerson

Wayne Eckerson

Welcome to Wayne's World, my blog that illuminates the latest thinking about how to deliver insights from business data and celebrates out-of-the-box thinkers and doers in the business intelligence (BI), performance management and data warehousing (DW) fields. Tune in here if you want to keep abreast of the latest trends, techniques, and technologies in this dynamic industry.

About the author

Wayne has been a thought leader in the business intelligence field since the early 1990s. He has conducted numerous research studies and is a noted speaker, blogger, and consultant. He is the author of two widely read books: Performance Dashboards: Measuring, Monitoring, and Managing Your Business (2005, 2010) and The Secrets of Analytical Leaders: Insights from Information Insiders (2012).

Wayne is currently director of BI Leadership Research, an education and research service run by TechTarget that provides objective, vendor neutral content to business intelligence (BI) professionals worldwide. Wayne’s consulting company, BI Leader Consulting, provides strategic planning, architectural reviews, internal workshops, and long-term mentoring to both user and vendor organizations. For many years, Wayne served as director of education and research at The Data Warehousing Institute (TDWI) where he oversaw the company’s content and training programs and chaired its BI Executive Summit. He can be reached by email at weckerson@techtarget.com.

Recently in Visualization Category


More than one executive has approved the purchase of a new dashboard, BI tool, or mobile display by virtue of the visual interface alone. Just as sex sells magazines, pretty visual displays sell BI tools.

Certainly, a sleek, Flash-based interface is a breath of fresh air in the business intelligence (BI) arena, where text-heavy, static reports or spreadsheets still rule the day. And what's not to like? New Web 2.0 interfaces invite users to interact with charts and tables and offer superior performance (once fully downloaded) compared to HTML-only applications.

But just as beauty is sometimes only skin deep, so too are visual interfaces. Users can quickly tire of a visual interface that makes them work harder to view relevant data or doesn't offer additional value compared to viewing textual data alone. Although sexy graphical interfaces may sell BI tools, it's the data that delivers true fulfillment.

Here are a few pointers to harness visualization to sustain user adoption and interest over the long haul.

1. Data first. Make the visualization subservient to the data, not the reverse. Ultimately, users want data, not visual flash and sizzle. Avoid gratuitous decoration.

2. Highlight the message. Every data set has a message to communicate. Identify that message and use visual techniques to highlight it and deemphasize the rest.

3. Use visuals to compare. It is hard to see trends, compare items, or spot outliers and clusters in a table of numbers. This is where visualization shines. Use line graphs to show trends over time, bar charts or heat maps to compare items, and scatterplots to identify outliers and clusters (see the sketch after this list).

4. Use visual techniques to show more data. A skillful designer steeped in the art of visualizing quantitative information can compress a lot of data into a compact space without cluttering the page or obscuring the data's message. Proper use of fonts, labels, and borders can clearly delineate dense displays of items; compact chart types, such as sparklines and bullet graphs, show more data in a smaller space; and proper use of visual cues (i.e., preattentive processing techniques) can focus viewers' eyes on the key messages in the data.

5. Establish visual standards. Standardize the placement of filters, toolbars, help buttons, breadcrumbs, alerts, and links to additional information so users can navigate your displays almost from memory. Also, associate specific chart types with certain types of data. For example, if you always use a spider chart to display patient satisfaction data, then users already know what data to expect when they see that chart type.

6. Balance sparsity and density. Put fewer items on a visual display when users only have time to glance at the data, not analyze it. For example, salespeople, call center representatives, operational workers, and busy executives want and need sparse displays. Analysts and managers want and need denser displays. Also, when first deploying a dashboard, err on the side of sparsity, not density. Once users become more familiar with the display, then add more items to it. Denser displays require users to click fewer times to get the information they need.

7. Iterate. Remember, no visual display is ever perfect. There are a million ways to tweak a visual design to improve its clarity and better communicate the underlying message in the data. Also, user preferences and needs change so you will need to adapt your visual designs accordingly. Just as a writer knows that the key to writing is rewriting, a visual designer knows that designing quantitative displays is a highly iterative process.
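
To make point 3 concrete, here is a minimal sketch in Python with matplotlib (my choice purely for illustration, not something any particular BI tool requires; the regions, products, and figures are invented) that maps each chart type to the question it answers best.

import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(1)

fig, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(12, 3.5))

# Line graph: trends over time (12 months of made-up revenue for two regions)
months = np.arange(1, 13)
ax1.plot(months, 100 + 5 * months, label="East")
ax1.plot(months, 120 + 2 * months, label="West")
ax1.set_title("Trend: revenue by month")
ax1.legend(frameon=False)

# Bar chart: compare items against one another
ax2.bar(["A", "B", "C", "D"], [230, 180, 310, 90])
ax2.set_title("Comparison: sales by product")

# Scatterplot: spot outliers and clusters
order_size = rng.normal(50, 10, 200)
margin = 0.3 * order_size + rng.normal(0, 3, 200)
ax3.scatter(order_size, margin, s=10, alpha=0.5)
ax3.set_title("Outliers and clusters: margin vs. order size")

fig.tight_layout()
plt.show()

Each panel answers a single question at a glance; the same numbers crammed into one table would make all three questions harder to answer.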

Creating an effective visual design is challenging, but ultimately it's the icing on the cake of a BI application. Follow the above techniques and you should have a delightful time icing your BI cake!


Posted April 1, 2011 8:55 PM

In a recent blog ("What's in a Word: The Evolution of BI Semantics"), I discussed the evolution of BI semantics and end-user approaches to business intelligence. In this blog, I will focus on technology evolution and vendor messaging.

Four Market Segments. The BI market comprises four sub-markets that have experienced rapid change and growth since the 1990s: BI tools, data integration tools, database management systems (DBMS), and hardware platforms. (See bottom half of figure 1.)

Figure 1: BI Market Evolution

Compute Platform. BI technologies in these market segments run on a compute infrastructure (i.e., the diagonal line in figure 1) that has changed dramatically over the years, evolving from mainframes and mini-computers in the 1980s and client/server in the 1990s to the Web and Web services in the early 2000s. Today, we see the advent of mobile devices and cloud-based platforms. Each change in the underlying compute platform has created opportunities for upstarts with new technology to grab market share and forced incumbents to respond in kind or acquire the upstarts. With an endless wave of new companies pushing innovative new technologies, the BI market has been one of the most dynamic in the software industry during the past 20 years.

BI Tools. Prior to 1990, companies built reports using 3GL and 4GL reporting languages, such as Focus and Ramis. In the 1990s, vendors began selling desktop or client/server tools that enabled business users to create their own reports and analyses. The prominent BI tools were Windows-based OLAP, ad hoc query, and ad hoc reporting tools, and, of course, Excel, which still is the most prevalent reporting and analysis tool in the market today.

In the 2000s, BI vendors "rediscovered" reporting, having been enraptured with analysis tools in the 1990s. They learned the hard way that only a fraction of users want to analyze data and that the real market for BI lies in delivering reports and, subsequently, dashboards, which are essentially visual exception reports. Today, vendors have moved to the next wave of BI, which is predictive analytics, while offering support for new channels of delivery (mobile and cloud). In the next five years, I believe BI search will become an integral part of a BI portfolio, since it provides a super easy interface for casual users to submit ad hoc queries and navigate data without boundaries.

BI Vendor Messaging. In the 1990s, vendors competed by bundling together multiple types of BI tools (reporting, OLAP, query) into a single "BI Suite." A few years later, they began touting "BI Platforms" in which the once distinct BI tools in a suite became modules within a unified BI architecture that all use the same query engine, charting engine, user interface, metadata, administration, security model, and application programming interface. In the late 1990s, Microsoft launched the movement towards low-cost BI tools geared to the mid-market when it bundled its BI and ETL tools in SQL Server at no extra charge. Today, a host of low-cost BI offerings, including open source BI tools, cloud BI tools, and in-memory visual analysis tools, have helped bring BI to the mid-market and lower the costs of departmental BI initiatives.

Today, BI tools have become easier to use and tailored to a range of information consumption styles (i.e., viewer, interactor, lightweight author, professional author). Consequently, the watchword is now "self-service BI," in which business users meet their own information requirements rather than relying on BI professionals or power users to build reports on their behalf. Going forward, BI tool vendors will begin talking about "embedded BI," in which analytics (e.g., charts, tables, models) are embedded in operational applications and mission-critical business processes.

Data Integration Tools. In the data integration market, Informatica and Ascential Software (now IBM) led the charge towards the use of extract, transform, and load (ETL) engines to replace hand-coded programs that move data from source systems to a data warehouse. The engine approach proved superior to coding because its graphical interface meant you didn't have to be a hard-core programmer to write ETL code and, more importantly, it captured metadata in a repository instead of burying it in code.

But vendors soon discovered that ETL tools are only one piece of the data integration puzzle and, following the lead of their BI brethren, moved to create data integration "suites" consisting of data quality, data profiling, master data management, and data federation tools. Soon, these suites turned into data integration "platforms" running on a common architecture. Today, the focus is on using data federation tools to "virtualize" data sources behind a common data services interface and cloud-based data integration tools to migrate data from on premises to the cloud and back again. Also, data integration vendors are making their tools easier to use, thanks in large part to cloud-based initiatives, which now has them evangelizing the notion of "self-service data integration" in which business analysts, not IT developers, build data integration scripts.

DBMS Engines and Hardware. Throughout the 1990s and early 2000s, the database and hardware markets were sleepy backwaters in the BI market, despite the fact that they consumed a good portion of BI budgets. True, database vendors had added cubing, aggregate-aware optimizers, and various types of indexes to speed query performance, but that was the extent of the innovation.

But in the early 2000s, as data warehouse data volumes began to exceed the terabyte mark and query complexity grew, many data warehouses hit the proverbial wall. Meanwhile, Moore's law continued to make dramatic strides in the price-performance of processing, storage, and memory, and soon a few database entrepreneurs spotted an opportunity to overhaul the underlying BI compute infrastructure.

Netezza opened the floodgates in 2002 with the first data warehousing appliance (unless you count Teradata back in the 1980s!), which offered orders of magnitude better query performance for a fraction of the cost and soon gained a bevy of imitators. These new systems offer innovative storage-level filtering, column-based compression and storage, massively parallel processing architectures, expanded use of memory-based caches, and, in some cases, solid state disk to bolster performance and availability for analytic workloads. Today, these "analytic platforms" are turbo-charging BI deployments and, in many cases, enabling BI professionals to deliver solutions that weren't possible before.

As proof of the power of these new purpose-built analytical systems, the biggest vendors in high-tech have invaded the market, picking off leading pureplays before they've even fully germinated. In the past nine months, Microsoft, IBM, Hewlett Packard, Teradata, SAP, and EMC purchased analytic platform vendors, while Oracle built its own with hardware from Sun Microsystems, which it acquired in 2009. (See "Jockeying for Position in the Analytic Platform Market.")

Mainstream Market. When viewed as a whole, the BI market has clearly emerged from an early adopter phase to the early mainstream. The watershed mark was 2007 when the biggest software vendors in the world--Oracle, SAP, and IBM--acquired the leading BI vendors--Hyperion, Business Objects, and Cognos respectively. Also, the plethora of advertisements about BI capabilities that appear on television (e.g., IBM's Smarter Planet campaign) and major consumer magazines (e.g. SAP and SAS Institute ads) reinforce the maturity of BI as a mainstream market. BI is now front and center on the radar screen of most CIOs, if not CEOs, who want to better leverage information to make smarter decisions and gain a lasting competitive advantage.

The Future. At this point, some might wonder if there is much headroom left in the BI market. The last 20 years have witnessed a dizzying array of technology innovations, products, and methodologies. It can't continue at this pace, right? Yes and no. The BI market has surprised us in the past. Even in recent years as the BI market consolidated--with big software vendors acquiring nimble innovators--we've seen a tremendous explosion of innovation. BI entrepreneurs see a host of opportunities, from better self-service BI tools that are more visual and intuitive to use to mobile and cloud-based BI offerings that are faster, better, and cheaper than current offerings. Search vendors are making a play for BI, as are platform vendors that promise data center scalability and availability for increasingly mission-critical BI workloads. And we still need better tools and approaches for querying and analyzing unstructured content (e.g., documents, email, clickstream data, Web pages) and for delivering data faster as our businesses increasingly compete on velocity and as our data volumes become too large to fit inside shrinking batch windows.

Next week, Beye Research will publish a report of mine that describes a new BI Delivery Framework for the next ten years. In that report, I describe a future-state BI environment that contains not just one intelligence (i.e., business intelligence) but four intelligences (analytic, continuous, and content intelligence in addition to business intelligence) that BI organizations will need to support or interoperate with in the near future. Stay tuned!


Posted March 18, 2011 2:29 PM

When I talk to audiences about performance dashboards, I'm invariably asked how to get executives and analysts to abandon eye-bending spreadsheets packed with hundreds of data points per page in favor of more visual displays populated with charts and graphics.

This is a complex question because the answer involves sorting out and addressing two intertwined issues: 1) people's inherent resistance to change and 2) the quality of the visual design. In the end, users will adopt a visual display of quantitative information if it adds informational value, not because of its visual appeal.

Change Management

No Surprises. As I argued in a recent blog on change management, change disrupts people's rhythm and makes them less productive in the short run. Some will become anxious and lash out against the change even when it is in their long-term best interests.

Any time you change the way you present data to users, you force them to work harder--at least in the short term. They need to spend additional time learning the new layout, double checking numbers, experimenting with new functions and filters, and so on. Most people resist this short-term hit to their productivity. To avoid a mutiny, you need to prepare users for the change way ahead of time. If they know what's coming, they can budget their time and resources accordingly. No one likes a surprise!

Classes of Users. You also need to understand the classes of business users affected by the change and how each might react. Executives will react differently than managers, who will react differently than analysts and operational workers. You need to develop a change management strategy that addresses the concerns and issues of each group and provides the appropriate levels of training and support.

For example, there will always be a small group of users who will resist change at all costs. These folks need high-touch training and support. That means offering one-on-one training (especially if they are executives). It also means duplicating the old environment inside the new one. If they are used to viewing spreadsheets, you need to show them how to use the new system to export a spreadsheet to their desktop that contains the same data and layouts as the old environment. Gradually, you can wean them off the older views by showing them how the new environment can make them more productive. But this takes a lot of patience and hand-holding.

Prototyping. It's also important to manage expectations and get users to buy into the new environment. The best way to do that is to solicit user feedback on an initial prototype. For example, at 1-800 Contacts, an online provider of contact lenses that I profiled in the second edition of my book, Performance Dashboards: Measuring, Monitoring, and Managing Your Business, the BI team built an executive dashboard to monitor sales and orders every 15 minutes. It discovered that some executives preferred to see data as line charts, while others wanted gauges and others preferred tables. So the team displayed all three graph types on the one-page dashboard to address every executive's visual preferences. This simple dashboard is very effective in helping executives keep their fingers on the pulse of the business.

Be careful during prototyping not to abdicate responsibility for the design of the environment. While it's important to get user feedback, it's critical that you let designers skilled in the visual display of quantitative data establish the look and feel of the dashboard--the fonts, colors, layouts, and navigational cues. Once you've established the framework, then ask users to comment on the value of data and metrics displayed, the navigational flow of the environment, and its ease of use.

Visual Design Techniques

The second major impediment to adoption is that many visual displays of quantitative information are suboptimally designed. The displays make it harder, not easier, for users to glean the meaning of depicted data. Users must spend more time examining the visual display to spot problems, trends, and outliers and make relevant comparisons.

Stephen Few, a renowned visualization expert, cites many visual design faux pas in his books and articles. These range from the overuse of color to 3-D chart types and poorly labeled graphs. His maxim, "Make every pixel count," is a wise one. One key issue that is often overlooked, however, is the need to tailor and evolve the density of visual displays.

Sparsity versus Density. Sparse displays contain fewer information objects than dense visual displays. As a result, sparse displays are easier to use because users can absorb relevant information quickly. However, sparse displays risk containing too little information. If users have to click several times to view information that they could previously view on a single page, they will get frustrated and stop using the tool.

On the other hand, dense displays can overwhelm or intimidate users with too much information at once. Users may feel reluctant to learn the new environment and revert to prior methods of consuming information.

When deciding how many objects to put on a single screen, it's important to remember that each class of users (and individuals within each class) will fall on a different point in the sparsity-to-density spectrum. Tailoring displays to these preferences can significantly aid adoption. It's also important to recognize that user preferences change over time. A simple, sparse screen may suffice when users first begin using a new display, but as they become more familiar with the layout, the data, and functionality of the new environment, they will demand denser displays. As a result, you will need to continuously evolve your visual displays to keep pace with the visual IQ of your users.
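
As a rough sketch of that idea (again in Python with matplotlib, purely for illustration; the roles, metric names, and figures are all invented), the same set of metrics might be rendered as a sparse executive view of headline numbers or a denser analyst view of small multiples:

import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical metrics, each with 12 months of made-up history
metrics = {name: 100 + rng.normal(0, 8, 12).cumsum()
           for name in ["Orders", "Revenue", "Returns", "Margin",
                        "New accounts", "Churn", "Backlog", "On-time %"]}

def render(role):
    """Sparse view (a few headline numbers) for executives;
    denser small-multiple view for analysts."""
    if role == "executive":
        names = list(metrics)[:3]              # only the headline metrics
        fig, axes = plt.subplots(1, 3, figsize=(9, 2))
        for ax, name in zip(axes, names):
            ax.text(0.5, 0.5, f"{metrics[name][-1]:.0f}\n{name}",
                    ha="center", va="center", fontsize=14)
            ax.axis("off")
    else:                                      # analyst: show everything
        fig, axes = plt.subplots(2, 4, figsize=(10, 4))
        for ax, (name, series) in zip(axes.flat, metrics.items()):
            ax.plot(series, linewidth=1)       # sparkline-style trend
            ax.set_title(name, fontsize=8)
            ax.set_xticks([]); ax.set_yticks([])
    fig.tight_layout()
    plt.show()

render("executive")   # start sparse; offer the denser view as users mature

Starting with the sparse view and introducing the denser one later mirrors the advice above: err on the side of sparsity first, then add items as your users' visual IQ grows.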

Summary

The bottom line: users want information and answers, not pretty pictures. They will reject graphical displays that obscure the meaning of the data and make them work harder no matter how visually appealing the interface. Conversely, they will overcome their natural resistance to change and adopt visual tools that make it easier to spot trends, problems, and outliers than existing methods.

If you focus on the data, not the pictures, and manage change wisely, you will succeed in converting your users to new visual methods for viewing quantitative information.


Posted January 20, 2011 1:37 PM