
Blog: Wayne Eckerson

Wayne Eckerson

Welcome to Wayne's World, my blog that illuminates the latest thinking about how to deliver insights from business data and celebrates out-of-the-box thinkers and doers in the business intelligence (BI), performance management and data warehousing (DW) fields. Tune in here if you want to keep abreast of the latest trends, techniques, and technologies in this dynamic industry.

About the author

Wayne has been a thought leader in the business intelligence field since the early 1990s. He has conducted numerous research studies and is a noted speaker, blogger, and consultant. He is the author of two widely read books: Performance Dashboards: Measuring, Monitoring, and Managing Your Business (2005, 2010) and The Secrets of Analytical Leaders: Insights from Information Insiders (2012).

Wayne is founder and principal consultant at Eckerson Group, a research and consulting company focused on business intelligence, analytics, and big data.

Recently in Analytics Category

Editor's note: This is part II in a multi-part series on analytics.

Perhaps your executives have read the articles and books that testify to the transformative power of analytics. Or maybe they have been impressed by IBM's "Smarter Planet" television ads that provide concrete examples of how companies can harness information to make smarter, faster decisions that dramatically improve operations and outcomes. As a result, your executives want to do analytics, and they've asked you to lead the initiative.

Your first question might be: Where do we start? Or, more specifically, where does it make sense to apply analytics in our organization?

To answer this question, the first step is to define analytics. Assuming that an organization already has a data warehouse and reporting and lightweight analysis tools, analytics refers to the use of machine-learning tools to surface trends and patterns in large volumes of data. (See "What is Analytics?") Some call this class of tools data mining, predictive analytics, optimization, or advanced analytics. For the purpose of this article, I will use the term "advanced analytics" to describe these tools and techniques.

The second, and most important, step to answer the question is to recognize that there are three main reasons why organizations implement advanced analytics: 1) big data, 2) big constraints, and 3) big opportunities. Let's address each driver.

Big Data

If you have small amounts of data, you don't need sophisticated machine learning tools and algorithms to identify patterns, associations, trends, and outliers in the data. You can probably eyeball relevant trends by applying simple statistical functions (e.g., min/max, mean, and median) or graphing the data as histograms or simple charts. Taking it one step further, you might want to dimensionalize the data and use OLAP tools or in-memory visual analysis tools to navigate across dimensions and down hierarchies using various grids, graphs, and filters. All these techniques are largely deductive in nature--you first need to know where and how to look before you can find relevant trends and patterns.
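For the curious, the kind of quick statistical eyeballing described above takes only a few lines of code. The numbers below are hypothetical, just to illustrate the idea:

```python
# A minimal sketch of "eyeballing" a small data set with simple statistics.
# The daily_sales figures are invented; the point is that a large gap
# between mean and median immediately flags an outlier worth inspecting.
import statistics

daily_sales = [120, 135, 128, 540, 131, 126, 129]  # one obvious outlier

low, high = min(daily_sales), max(daily_sales)
mean = statistics.mean(daily_sales)
median = statistics.median(daily_sales)

print(f"min={low} max={high} mean={mean:.1f} median={median}")
```

With data this small, no machine learning is needed; the mean (187.0) sitting far above the median (129) tells the analyst where to look.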

But with massive amounts of data, just hunting for patterns using ad hoc analytical tools may prove fruitless or become too unwieldy. Plus, once you detect a pattern, you have no way of modeling it for reuse in other applications. This is where advanced analytical tools shine: you can give them a problem to solve and point them at a large data set; they then discover the patterns and relationships in the data, which they express as mathematical equations. You can use these equations to make strategic decisions or score new records to support just-in-time actions, such as online cross-sell offers, hourly sales forecasts, or event-driven maintenance.
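To make the "model expressed as a mathematical equation" idea concrete, here is a minimal sketch of scoring a new record with a logistic equation. The coefficients and field names are invented for illustration; a real modeling tool would learn them from historical data:

```python
# Hypothetical sketch: a fitted model reduced to an equation (a logistic
# function with made-up coefficients), reused to score new records,
# e.g., for an online cross-sell offer.
import math

# Coefficients a modeling tool might have learned (assumed values).
INTERCEPT = -2.0
WEIGHTS = {"visits_per_month": 0.30, "avg_basket": 0.05}

def score(record):
    """Return the probability (0..1) that this record responds to an offer."""
    z = INTERCEPT + sum(WEIGHTS[k] * record[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic (sigmoid) function

new_record = {"visits_per_month": 8, "avg_basket": 25.0}
print(f"propensity = {score(new_record):.3f}")
```

The equation itself is cheap to evaluate, which is why a scored model can support just-in-time actions at transaction speed even when the model was trained on a massive data set.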

How Big is Big? Although it doesn't make sense to apply advanced analytics to small data sets, it's not the volume of data that ultimately counts; it's the complexity of data.

For example, you probably don't need advanced analytics to analyze a terabyte of data that contains just two fields; all you really need is a simple calculation and a lot of horsepower. In contrast, a much smaller data set with hundreds of fields makes a much better candidate for advanced analytics. The tools' algorithms calculate the relationships among all these fields, which is nearly impossible to do with traditional reporting and analysis tools. These small data sets are often created by merging together data from dozens of different systems into a wide flat table desired by analytical modelers.

Big Constraints

Although advanced analytics helps when examining big or complex data, it's even more valuable as a method for overcoming internal constraints that prevent you from optimizing a business process. Advanced analytics can help fill the gap when you don't have enough time, money, or people to achieve success. When facing such constraints, advanced analytics can optimize or automate data-intensive processes.

For instance, a social services agency wants to decrease the number of clients affected by delinquent child support payments, but it has only two social workers to call 5,000 deadbeat dads. The agency uses advanced analytics to rank the targeted fathers by their propensity to pay if they receive a call from a social worker. Here, advanced analytics overcomes a labor constraint.

Another common constraint is money. For example, a retailer has $1 million to spend on a direct mail campaign, which means it can only send its new catalog to 100,000 of its 500,000 customers. It uses advanced analytics to rank customers by their propensity to purchase an item from the new catalog so it can optimize the uplift of its campaign.
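Once propensity scores exist, the ranking step in both of these examples is conceptually simple. A toy sketch, with made-up customers and scores:

```python
# Budget-constrained targeting: given propensity scores (hypothetical
# values), keep only the top N customers the budget allows.
def pick_mailing_list(scores, budget_slots):
    """scores: {customer_id: propensity}; return the top budget_slots ids."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:budget_slots]

scores = {"cust_a": 0.91, "cust_b": 0.12, "cust_c": 0.57, "cust_d": 0.83}
print(pick_mailing_list(scores, budget_slots=2))  # two highest propensities
```

The analytics work is in producing trustworthy scores; turning scores into an action list is just a sort.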

Time can also be a constraint. For example, a company that leases rail cars must fix them when they break. The longer the company takes to fix the rail cars, the more money it loses and the less satisfied customers become. But deciding which repair shop to send the railcars to requires considering many variables, including distances to various repair shops, the current wait time at each shop, the expertise at each shop, distance to the exit destination, additional problems that should be fixed while the railcar is in the shop, and so on. An online application that embeds advanced analytics can consider all these variables and issue a recommendation to the dispatcher while he is still on the phone with the customer who called in the repair.

Another common constraint is lack of management oversight. For instance, a bank wants to standardize how it evaluates and approves loans across its branches. It uses advanced analytics to evaluate each loan and generate an automated recommendation for loan officers. In the same way, a Web site can use advanced analytics to generate personalized cross-sell recommendations to every customer, based on their past purchases and what other customers like them have purchased.

In short, organizations use advanced analytics to overcome built-in constraints that prevent them from optimizing data-intensive business processes.

Big Opportunity

Finally, it makes sense to apply advanced analytics when the business upside justifies the cost. Analytics requires hiring experts who have a strong working knowledge of statistics and know how to create analytical models. They also must be conversant in the business process that the organization wants to optimize and the data that supports that process. Obviously, these individuals aren't inexpensive. And the tools to support the modeling process and the hardware they run on cost money as well. So, before you undertake an analytics project, make sure that the business value justifies the upfront investment.

Fortunately, the cost of building analytical models is declining. A decade ago, you had to hire a PhD statistician who could also write C code or SQL to create analytical models. Today, that is not necessarily true. A good business analyst with some data mining training can create a majority of the analytical models that organizations might need.

However, you still need PhD statisticians when models must be continuously updated, the degree of model accuracy has a huge impact on costs or profits, or the core business runs on analytical models. For example, PhD statisticians are often used to create analytical models for credit card marketing campaigns, fraud detection, and government intelligence.

Costs? How much does it cost to set up an analytical center of excellence? Assuming you hire a handful of analysts and purchase the requisite software and hardware, it's likely to cost about $1 million a year at a minimum. Many companies start smaller by exploiting open source data mining tools and data mining extensions to BI tools and databases. They also might send a talented analyst to training and give him a one-time project to test the approach. If the project succeeds, the organization makes a bigger, more permanent investment in the people and technology. Or they may hire a consultancy to run the initial project and train internal analysts in the tools and techniques.


It's best to apply analytics to data-intensive business processes that are sub-optimized due to built-in constraints, such as lack of time, people, money, and oversight. Also, advanced analytics only makes sense when the business upside is big enough and the data complex enough to justify the costs.

Posted November 5, 2011 5:04 PM

Editor's note: This is the first part in a multi-part series on analytics.

One of the hottest technology topics today is analytics. The problem with analytics is that few people agree what it is. This often happens with commonly used terms because everyone attaches a slightly different meaning to them based on their needs and perspectives.

I prefer to assign two definitions to analytics to reflect the primary dimensions of the term: its industry context and its technology context. For simplicity's sake, Analytics with a capital "A" is an umbrella term representing our industry, while analytics with a small "a" refers to technology used to analyze data.

Analytics With a Capital "A"

Analytics as an umbrella term refers to the processes, technologies, and techniques that turn data into information and knowledge that drive business decisions. The cool thing about such industry definitions is that you can reuse them every five years or so. For example, I used this same definition to describe "Data Warehousing" in 1995, "Business Intelligence" in 2000, and "Performance Management" in 2005. Our industry perpetually recreates itself under a new moniker with a slightly different emphasis to expand its visibility and reenergize its base. (See "What's in a Word? The Evolution of BI Semantics.")

Today, many people use the term Analytics as a proxy for everything we do in this space, from data warehousing and data integration to reporting and advanced analytics. The most prominent person who defines Analytics this way is Tom Davenport, whose terrific Harvard Business Review articles and books on the subject have prompted many executives to pursue Analytics as a sustainable source of competitive advantage. Davenport is savvy enough to know that if he had called his book "Competing on Business Intelligence" instead of "Competing on Analytics", he would not be the industry rock star that he is today. (I personally still prefer the term "Business Intelligence" because it perfectly describes what we do: use information to make the business run more intelligently.)

Analytics with a Small "a"

This leaves the term analytics with a small "a" to describe various technologies that business people use to analyze data. This is a broad category of tools that spans everything from Excel, OLAP, and visual analysis tools on one hand, to statistical modeling and optimization tools on the other.

One way to segment analytical tools is to show how they've evolved over time, along with reporting tools. Figure 1 shows that we've had four waves of business intelligence tools since the 1980s. Specifically, there have been two waves of reporting followed by two waves of analytics. The first wave of analytics took place in the 1990s when business analysts began using ad hoc query/reporting and OLAP tools to explore historical data in a data mart or data warehouse. The second wave of analytics, which just began, involves modeling historical data to optimize the present and predict the future. Most people who talk about analytical tools today refer to this latter type.

Figure 1. Waves of BI

Interestingly, each wave of analytics follows a wave of reporting. This makes sense if you consider that reporting tools are primarily designed for casual users, who comprise 80% of all BI users, and analytical tools are primarily designed for power users, who constitute the remaining 20%. These are two separate, but inter-related markets, which BI vendors need to address.

Deductive and Inductive Analytics

The first wave of analytics--which addresses the question "Why did it happen?"--is deductive in nature, while the second wave of analytics--which addresses the question "What will happen?"--is primarily inductive.

With deductive analytics, business users use tools like Excel, OLAP, and visual analysis tools to explore a hypothesis. They take an educated guess about what might be the root cause of some anomaly or performance alert and then use analytical tools to explore the data and either verify or negate the hypothesis. If the hypothesis proves false, they come up with a new idea and start looking in that direction.

Inductive analytics is the opposite. Business users don't start with a hypothesis, they start with a business outcome or goal (e.g., "find the top 10% of our customers and prospects who are most likely to respond to this offer") and then gather historical data that will help them discern the answer. They then use analytics to create statistical or machine learning models of the data to answer their question. In other words, they don't start with a hypothesis, they start with the data and let the analytical tools discover the patterns and anomalies for them.

Interestingly, our industry's former umbrella terms now refer to categories of tools: data warehousing refers to analytical databases and ETL tools; business intelligence refers to query and reporting tools; and performance management refers to dashboard, scorecard, and planning tools. In time, analytics will be replaced as an umbrella term by some other industry buzzword, and the term will simply refer to deductive and inductive tools, or perhaps just one or the other.

The Value of Analytics

Now that you know what advanced analytics is, the next question is: why should you invest in it?

Your chief financial officer will be glad to know that analytic applications have a higher return on investment than all other BI applications. A landmark study by IDC in 2003 showed that the median ROI for projects that incorporated predictive technologies was 145% compared to 89% for all other projects. This uplift is gained largely by optimizing business processes, making them more efficient and profitable, according to IDC.

But what kinds of questions does advanced analytics address? There are four major categories:

  1. Analyze the past. Although we mainly use deductive tools to examine past trends, advanced analytical tools model the past. Some seemingly easy questions can be maddeningly difficult to answer because they involve the interaction of so many variables, such as, "Why did sales drop last quarter?"
  2. Optimize the present. Once we model past activity and understand relationships among key variables, we can harness that information to optimize current processes. For instance, a market basket model can help retailers design store layouts to maximize profits.
  3. Predict the future. By applying the model (i.e., mathematical equation) to each new record, we can guess with a reasonable degree of accuracy whether a customer may respond positively to a promotion or a transaction is fraudulent.
  4. Test assumptions. Advanced analytics can also be used to test assumptions about what drives the business. For example, prior to spending millions on a marketing campaign, an online retailer might test an assumption that customers located within one square mile of a big box competitor are more likely to churn than others.

Although advanced analytics can be applied to almost any business function, marketing seems to attract the lion's share of analytical work. Research I conducted in 2007 at The Data Warehousing Institute shows that five of the top seven applications for advanced analytics hail from the marketing department. These include cross sell/upsell (47%), campaign management (46%), customer acquisition (41%), attrition/churn/retention (40%), and promotions (31%). (See figure 2.)

Figure 2. Most Common Applications for Advanced Analytics
From Wayne Eckerson, "Predictive Analytics: Extending the Value of Your Data Warehousing Investment," TDWI, 2007. Based on 166 respondents that had implemented predictive analytics
In addition, each industry has a handful of applications that are traditional candidates for advanced analytics. (See table 1.)

Table 1. Common Analytic Applications in Various Industries
Adapted from "Analytics at Work" by Tom Davenport, Jeanne Harris, and Robert Morison. (Wiley, 2010.)


Analytics is a hot technology these days. But like any hot technology, there are multiple definitions of what it means. Analytics with a capital "A" refers to the entire domain of using information to make smarter decisions, while analytics with a small "a" refers to tools and techniques to do analysis. On the technology front, there are two major categories of tools: deductive and inductive. The latter is getting a lot of attention since it's required to optimize processes and predict future behaviors and activities.
Advanced analytics (which is more inductive in nature) offers significantly more value than other types of BI applications because it helps optimize business processes and answer questions that enable the business to analyze the past, optimize the present, predict the future, and test core assumptions. Today, marketing is the biggest user of advanced analytics technologies, although its uses spread far and wide.

Posted November 5, 2011 2:41 PM


For all the talk about analytics these days, there has been little mention of one of the most powerful techniques for analyzing data: location intelligence.

It's been said that 80% of all transactions embed a location. A sale happens in a store; a call connects people in two places; a deposit happens in a branch; and so on. When we plot objects on a map, including business transactions and metrics, we can see critical patterns with a quick glance. And if we explore relationships among spatial objects imbued with business data, we can analyze data in novel ways that help us make smarter decisions more quickly.

For instance, a location intelligence system might enable a retail analyst working on a marketing campaign to identify the number of high-income families with children who live within a 15-minute drive of a store. An insurance company can assess its risk exposure from policy holders who live in a flood plain or within the path of a projected hurricane. A sales manager can visually track the performance of sales territories by products, channels, and other dimensions.

Geographic Information Systems. Location intelligence is not new. It originated with cartographers and mapmakers in the 19th and 20th centuries and went digital in the 1980s. Companies such as Esri, MapInfo, and Intergraph offer geographic information systems (GIS), which are designed to capture, store, manipulate, analyze, manage, and present all types of geographically referenced data. If this sounds similar to business intelligence, it is.

Unfortunately, GIS have evolved independently from BI systems. Even though both groups analyze and visualize data to help business users make smarter decisions, there has been little cross-pollination between the groups and little, if any, data exchange between systems. This is unfortunate since GIS analysts need business data to provide context to spatial objects they define, and BI users benefit tremendously from spatial views of business data.

Convergence of GIS and BI

However, many people now recognize the value of converging GIS and BI systems. This is partly due to the rise in popularity of Google Maps, Google Earth, global positioning systems, and spatially-aware mobile applications that leverage location as a key enabling feature. These consumer applications are cultivating a new generation of users who expect spatial data to be a key component of any information delivery system. And commercial organizations are jumping on board, led by industries that have been early adopters of GIS, including utilities, public safety, oil and gas, transportation, insurance, government, and retail.

The range of spatially-enabled BI applications is endless and powerful. "When you put location intelligence in front of someone who has never seen it before, it's like a Bic lighter to a caveman," says Steve Trammel, head of corporate alliances and IT marketing at Esri.

Imagine this: an operations manager at an oil refinery will soon be able to walk around a facility and view alerts based on his proximity to under-performing processing units. His mobile device shows a map that depicts the operating performance of all processing units based on his current location. This enables him to view and troubleshoot problems first-hand rather than being tethered to a remote control room. (See figure 1.)

Figure 1. Mobile Location Intelligence.

A spatially-aware mobile BI application configured by Transpara for an oil refinery in Europe. Transpara is a mobile BI vendor that recently announced integration with Google Maps.

GIS Features. Unlike BI systems, GIS specialize in storing and manipulating spatial data, which consists of points, lines, and polygons. A line connects two points, and a polygon is a closed shape defined by three or more points. Each point or object can be imbued with various properties or rules that govern its behavior. For example, a road (i.e., a line) has a surface condition and a speed limit, and the only points that can be located in the middle of the road are traffic lights. In many ways, a GIS is like computer-aided design (CAD) software for spatial applications.

Most spatial data is represented as a series of X/Y coordinates that can be plotted on a map. The most common coordinate system is latitude and longitude, which enables mapmakers to plot objects on geographical maps. But GIS developers can create maps of just about anything, from the inside of a submarine or office building to a geothermal well or cityscape. Spatial engines can then run complex calculations against coordinate data to determine relationships among spatial objects, such as the driving distance between two cities or the shadows that a proposed skyscraper would cast on surrounding buildings.
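As a small illustration of such a spatial calculation, here is the standard haversine formula for the great-circle ("as the crow flies") distance between two latitude/longitude points. The coordinates below are approximate city centers:

```python
# Great-circle distance between two lat/long points via the haversine
# formula, one of the simpler calculations a spatial engine performs.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in kilometers between two latitude/longitude points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Boston to New York City, roughly 306 km in a straight line.
print(f"{haversine_km(42.36, -71.06, 40.71, -74.01):.0f} km")
```

Note that driving distance, which the article mentions, requires a road network on top of this; straight-line distance is just the starting point.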

Approaches for Integrating GIS and BI

There are two general options for integrating GIS and BI systems: 1) integrate business data within GIS systems and 2) integrate GIS functionality within BI systems. GIS administrators already do the former when creating maps but their applications are very specialized. Moreover, most companies only purchase a handful of GIS licenses, which are expensive, and the tools are too complex to use for general business users.

The more promising approach, then, is to integrate GIS functionality into BI tools, which have a broader audience. There are several ways to do this, and they vary greatly by the level of GIS functionality supported.

  • BI Map Templates. Most BI tools come with several standard map images, such as a global view with country boundaries or a North American view with state boundaries. A report designer can place a map in a report, link it to a standard "geography" dimension in the data (e.g. "state" field), and assign a metric to govern the shading of boundaries. For example, a report might contain a color-coded map of the U.S. that shows sales by state. This is the most elementary form of GIS-BI integration since these out-of-the box map templates are not interactive.
  • BI Mashups. An increasingly popular approach is to integrate a BI tool with a GIS Web service, such as those provided by Google or Microsoft Bing. Here the BI tool integrates static and interactive maps via a Web service (e.g., REST API) or a software development toolkit. An AJAX or other Web client renders the map and any geocoded KPIs or objects in the data. (Geocoding assigns a latitude and longitude to a data object.) End users can then pan and zoom on the maps as well as hover over map features to view their properties and click to view underlying data. (See figure 1 above.) This approach requires a developer to write Javascript or other code.
  • GIS Mashups. GIS mashups are similar to BI mashups above but go a step further because they integrate with a full-featured GIS server, either on premise or via a Web service. Here, a BI tool embeds a special GIS connector that integrates with a mapping server and gives the report developer a point-and-click interface for integrating interactive maps with reports and dashboards. In this approach, the end user gains additional functionality, such as the ability to interact with custom maps created by in-house GIS specialists and to "lasso" features on a map and use those selections to query or filter other objects in a report or dashboard. Some vendors, such as Information Builders and MicroStrategy, have built custom interfaces to GIS products, while other vendors, such as IBM Cognos and SAP BusinessObjects, embed third-party software connectors (e.g., SpotOn and APOS, respectively).
  • GIS-enabled Databases. Although GIS function like object-relational databases, they store data in relational format. Thus, there is no reason that companies can't store spatial data in a data warehouse or data mart and make it available to all users and applications that need it. Many relational databases, such as Oracle, IBM DB2, Netezza, and Teradata, support spatial data types and SQL extensions for querying spatial data. Here both BI systems and GIS can access the same spatial data set, providing economies of scale, greater data consistency, and broader adoption of location intelligence functionality. However, you will still need a map server for spatial presentation.
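To illustrate the last option in a simplified way, here is a toy example that stores coordinates as plain columns in SQLite, standing in for a data warehouse. Real spatially-enabled databases offer dedicated geometry types and true distance predicates rather than this crude bounding-box filter; the table and values are invented:

```python
# Toy sketch of "spatial data in a relational store": plain lat/long
# columns queried with a bounding box. A GIS-enabled database would
# provide real spatial types and distance functions instead.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stores (name TEXT, lat REAL, lon REAL)")
conn.executemany(
    "INSERT INTO stores VALUES (?, ?, ?)",
    [("Downtown", 42.355, -71.060), ("Airport", 42.365, -71.010),
     ("Suburb", 42.500, -71.300)],
)

# "Which stores fall inside this box?" -- the simplest spatial query.
rows = conn.execute(
    "SELECT name FROM stores WHERE lat BETWEEN 42.30 AND 42.40 "
    "AND lon BETWEEN -71.10 AND -71.00"
).fetchall()
print([r[0] for r in rows])
```

The payoff of keeping spatial data in the warehouse is exactly what the bullet describes: BI tools and GIS query the same tables, so there is one version of the spatial truth.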


As visual analysis in all shapes and forms begins to permeate the world of BI, it's important to begin thinking about how to augment your reports and dashboards with location intelligence. Here are a few recommendations to get you started:

  1. Identify BI applications where location intelligence could accelerate user consumption of information and enhance their understanding of underlying trends and patterns.
  2. Explore GIS capabilities of your BI and data warehousing vendor to see if they can support the types of spatial applications you have in mind.
  3. Identify GIS applications that already exist in your organization and get to know the people who run them.
  4. Investigate Web-based mapping services from GIS vendors as well as Google and Bing since this obviates the need for an inhouse GIS.
  5. Start simply by using existing geography fields in your data (e.g., state, county, and zip) to shade the respective boundaries in a baseline map based on aggregated metric data.
  6. Combine spatial and business data in a single location, preferably your data warehouse, so you can deliver spatially-enabled insights to all business users.
  7. Geocode business data, including customer records, metrics, and other objects, that you might want to display on a map.

Location intelligence is not new but it should be a key element in any analytics strategy. Adding location intelligence to BI applications not only makes them visually rich, but surfaces patterns and trends not easily discerned in tables and charts.

Posted September 19, 2011 2:51 PM

One of the unwritten jobs of an industry analyst is to define industry terms. This is risky business because no matter what you say, most people will disagree.

Our industry (and most industries) has a semantics problem. The most commonly used terms are always the most abused semantically. Everyone creates definitions that align with their individual perspectives. This is especially true among software vendors which must ensure that definitions harmonize with their product portfolios.

One of the more popular terms in recent years is analytics. The root of the word is "analysis" or "analyze". Technically, to analyze something is to break it into its constituent parts. A less formal definition is to examine something critically to understand its essence or identify causes and key factors.

Who better, then, to define analytics than an industry "analyst"? We presumably spend every day "thinking critically" about software and vendors. (This is also a wonderful way to justify a liberal arts education, whose primary mission is to teach students to think critically.)

To increase my chances of gaining consensus, I'm offering two definitions of analytics. (Yes, this is wishy washy, but bear with me.) We need two definitions because every commonly used industry term has two major dimensions: an industry context and a technology context.

So, given this context, Analytics with a capital "A" is an umbrella term that represents our industry at a macro level, and analytics with a small "a" refers to technology used to analyze data.

Capital Analytics

From a macro perspective, Analytics is the set of processes, technologies, and best practices that turns data into information and knowledge that drives business decisions and actions.

The cool thing about such industry definitions is that you can reuse them every five years or so. (For example, I used the same definition to define "Data Warehousing" in 1995, "Business Intelligence" in 2000, and "Performance Management" in 2005.) Our industry perpetually recreates itself under a new moniker with a slightly different emphasis to expand its visibility and reenergize its base. (See my blog "What's in a Word? The Evolution of BI Semantics.")

Today, many people use the term Analytics as a proxy for everything we do in this space. The most prominent person who defines Analytics this way is Tom Davenport, whose Harvard Business Review articles and books on the subject have prompted many executives to pursue Analytics as a sustainable source of competitive advantage. Davenport is savvy enough to know that if he had called his book "Competing on Business Intelligence" instead of "Competing on Analytics", he would not be the industry rock star that he is today. (I still prefer the term "Business Intelligence" because it perfectly describes what we do: use information to make the business run more intelligently.)

Small Analytics

This leaves the term "analytics" to describe various technologies that business people use to analyze data. This is a broad category of tools that spans everything from Excel, OLAP, and visual analysis tools to statistical modeling and optimization tools. There is a natural divide within these technologies so I'm tempted to create two sub-definitions: deductive analytics and inductive analytics.

(Interestingly, all of our former capitalized terms now refer to a category of tools: data warehousing refers to data modeling and ETL tools; business intelligence refers to query and reporting tools; and performance management refers to dashboard, scorecard, and planning tools.)

Deductive and Inductive Analytics

With deductive analytics, business users use tools like Excel, OLAP, and visual analysis tools to explore a hypothesis. First, they make an educated guess about the root cause of some anomaly or performance alert. Then, they use analytical tools to explore the data and either verify or negate the hypothesis. If the hypothesis proves false, they come up with a new one and start looking in that direction.
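The deductive loop can be sketched in a few lines of Python. The regions, quarters, and sales figures below are invented for illustration; the hypothesis is that one region is driving an overall sales decline:

```python
# Hypothesis: the overall sales drop is driven by the West region.
# All data is invented for illustration.
sales = [
    {"region": "East", "period": "Q1", "amount": 120},
    {"region": "East", "period": "Q2", "amount": 118},
    {"region": "West", "period": "Q1", "amount": 150},
    {"region": "West", "period": "Q2", "amount": 95},
]

def total(region, period):
    """Sum sales for one region in one period."""
    return sum(r["amount"] for r in sales
               if r["region"] == region and r["period"] == period)

# Compare each region's quarter-over-quarter change to test the hypothesis.
for region in ("East", "West"):
    change = total(region, "Q2") - total(region, "Q1")
    print(region, change)
```

Here the West region shows the large decline, so the hypothesis is verified; had it been flat, the analyst would form a new hypothesis and query the data again.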

Inductive analytics is the opposite. Business users don't start with a hypothesis; they start with a business outcome or goal, such as: "Which 10% of our customers and prospects are most likely to respond to this offer?" Then, they gather historical data that they think might correlate with the desired behavior and use analytics to create statistical or machine learning models that they can apply to prioritize their customers. In other words, they don't start with a hypothesis; they start with the data and let the analytical tools discover the patterns and anomalies for them.
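A minimal sketch of that inductive workflow, in pure Python with invented campaign data: "train" a response-rate model from historical behavior, then score and rank a prospect list to find the top 10%. (A real project would use a statistical or machine learning package; the segment names and rates here are assumptions for illustration.)

```python
from collections import defaultdict

# Historical campaign data: customer attributes plus observed response.
# All names and values are invented.
history = [
    {"segment": "retail", "responded": True},
    {"segment": "retail", "responded": True},
    {"segment": "retail", "responded": False},
    {"segment": "wholesale", "responded": True},
    {"segment": "wholesale", "responded": False},
    {"segment": "wholesale", "responded": False},
]

# "Train": estimate a response rate per segment from the historical data.
counts = defaultdict(lambda: [0, 0])          # segment -> [responses, total]
for row in history:
    counts[row["segment"]][0] += row["responded"]
    counts[row["segment"]][1] += 1
rate = {seg: resp / total for seg, (resp, total) in counts.items()}

# "Score": rank prospects by predicted response rate, keep the top 10%.
prospects = [{"id": i, "segment": "retail" if i % 2 else "wholesale"}
             for i in range(100)]
ranked = sorted(prospects, key=lambda p: rate[p["segment"]], reverse=True)
top_decile = ranked[: len(ranked) // 10]
```

The model "discovers" that the retail segment responds at twice the rate of wholesale, so the top decile is drawn entirely from retail prospects; no one had to hypothesize that up front.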


As the saying goes, there are many ways to skin a cat. Although I've offered two definitions of analytics (or Analytics), you are welcome to define it however you want. And you probably already have.

But remember, words are very powerful. They are our primary mode of communication. The more people you can get to share the same meanings for words, the more power you have to communicate and get things done. (So I hope you use my definitions!)

Posted July 8, 2011 10:29 AM
Permalink | 6 Comments |

In a recent blog ("What's in a Word: The Evolution of BI Semantics"), I discussed the evolution of BI semantics and end-user approaches to business intelligence. In this blog, I will focus on technology evolution and vendor messaging.

Four Market Segments. The BI market comprises four sub-markets that have experienced rapid change and growth since the 1990s: BI tools, data integration tools, database management systems (DBMS), and hardware platforms. (See bottom half of figure 1.)

(Figure 1: BI Market Evolution)

Compute Platform. BI technologies in these market segments run on a compute infrastructure (i.e., the diagonal line in figure 1) that has changed dramatically over the years, evolving from mainframes and mini-computers in the 1980s to client/server in the 1990s to the Web and Web services in the early 2000s. Today, we see the advent of mobile devices and cloud-based platforms. Each change in the underlying compute platform has created opportunities for upstarts with new technology to grab market share and forced incumbents to respond in kind or acquire the upstarts. With an endless wave of new companies pushing innovative new technologies, the BI market has been one of the most dynamic in the software industry during the past 20 years.

BI Tools. Prior to 1990, companies built reports using 3GL and 4GL reporting languages, such as Focus and Ramis. In the 1990s, vendors began selling desktop or client/server tools that enabled business users to create their own reports and analyses. The prominent BI tools were Windows-based OLAP, ad hoc query, and ad hoc reporting tools, and, of course, Excel, which still is the most prevalent reporting and analysis tool in the market today.

In the 2000s, BI vendors "rediscovered" reporting, having been enraptured with analysis tools in the 1990s. They learned the hard way that only a fraction of users want to analyze data and that the real market for BI lies in delivering reports and, subsequently, dashboards, which are essentially visual exception reports. Today, vendors have moved to the next wave of BI, which is predictive analytics, while offering support for new channels of delivery (mobile and cloud). In the next five years, I believe BI search will become an integral part of a BI portfolio, since it provides a super easy interface for casual users to submit ad hoc queries and navigate data without boundaries.

BI Vendor Messaging. In the 1990s, vendors competed by bundling multiple types of BI tools (reporting, OLAP, query) into a single "BI suite." A few years later, they began touting "BI platforms," in which the once-distinct BI tools in a suite became modules within a unified BI architecture sharing the same query engine, charting engine, user interface, metadata, administration, security model, and application programming interface. In the late 1990s, Microsoft launched the movement towards low-cost BI tools geared to the mid-market when it bundled its BI and ETL tools into SQL Server at no extra charge. Today, a host of low-cost BI offerings, including open source BI tools, cloud BI tools, and in-memory visual analysis tools, have helped bring BI to the mid-market and lower the costs of departmental BI initiatives.

Today, BI tools have become easier to use and are tailored to a range of information consumption styles (i.e., viewer, interactor, lightweight author, professional author). Consequently, the watchword is now "self-service BI," in which business users meet their own information requirements rather than relying on BI professionals or power users to build reports on their behalf. Going forward, BI tool vendors will begin talking about "embedded BI," in which analytics (e.g., charts, tables, models) are embedded in operational applications and mission-critical business processes.

Data Integration Tools. In the data integration market, Informatica and Ascential Software (now IBM) led the charge towards the use of extract, transform, and load (ETL) engines to replace hand-coded programs that move data from source systems to a data warehouse. The engine approach proved superior to coding because its graphical interface meant you didn't have to be a hard-core programmer to write ETL code and, more importantly, it captured metadata in a repository instead of burying it in code.
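To see what the engines replaced, here is a hand-coded ETL step of the kind described above, sketched in Python with invented file and field names. Note how the field mappings (the metadata) live only in the code, which is exactly the drawback the repository-based engine approach addressed:

```python
import csv
import io

# Extract: read rows from a source extract (an in-memory stand-in for a
# file pulled from a source system; names and values are invented).
source = io.StringIO("cust_id,amount\n1,10.5\n2,20.0\n")

# Transform and load: rename and cast fields, then append to a stand-in
# for the warehouse target table.
warehouse = []
for row in csv.DictReader(source):
    warehouse.append({
        "customer_key": int(row["cust_id"]),   # mapping buried in code
        "amount_usd": float(row["amount"]),
    })
```

Every source-to-target mapping in a real warehouse would need a script like this, with no shared repository to document or reuse the mappings.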

But vendors soon discovered that ETL tools are only one piece of the data integration puzzle and, following the lead of their BI brethren, moved to create data integration "suites" consisting of data quality, data profiling, master data management, and data federation tools. Soon, these suites turned into data integration "platforms" running on a common architecture. Today, the focus is on using data federation tools to "virtualize" data sources behind a common data services interface and on cloud-based data integration tools that migrate data from on premises to the cloud and back again. Data integration vendors are also making their tools easier to use, thanks in large part to cloud-based initiatives, and now evangelize "self-service data integration," in which business analysts, not IT developers, build data integration scripts.

DBMS Engines and Hardware. Throughout the 1990s and early 2000s, the database and hardware markets were sleepy backwaters of the BI market, despite the fact that they consumed a good portion of BI budgets. True, database vendors had added cubing, aggregate-aware optimizers, and various types of indexes to speed query performance, but that was the extent of the innovation.

But in the early 2000s, as data warehouse data volumes began to exceed the terabyte mark and query complexity grew, many data warehouses hit the proverbial wall. Meanwhile, Moore's law continued to make dramatic strides in the price-performance of processing, storage, and memory, and soon a few database entrepreneurs spotted an opportunity to overhaul the underlying BI compute infrastructure.

Netezza opened the floodgates in 2002 with the first data warehousing appliance (unless you count Teradata back in the 1980s!) and soon gained a bevy of imitators offering orders-of-magnitude better query performance for a fraction of the cost. These new systems offer innovative storage-level filtering, column-based compression and storage, massively parallel processing architectures, expanded use of memory-based caches, and, in some cases, solid state disks to bolster performance and availability for analytic workloads. Today, these "analytic platforms" are turbo-charging BI deployments and, in many cases, enabling BI professionals to deliver solutions that weren't possible before.

As proof of the power of these new purpose-built analytical systems, the biggest vendors in high tech have invaded the market, picking off leading pure plays before they've even fully germinated. In the past nine months, Microsoft, IBM, Hewlett-Packard, Teradata, SAP, and EMC purchased analytic platform vendors, while Oracle built its own with hardware from Sun Microsystems, which it acquired in 2009. (See "Jockeying for Position in the Analytic Platform Market.")

Mainstream Market. When viewed as a whole, the BI market has clearly moved from the early adopter phase to the early mainstream. The watershed moment was 2007, when the biggest software vendors in the world--Oracle, SAP, and IBM--acquired the leading BI vendors--Hyperion, Business Objects, and Cognos, respectively. Also, the plethora of advertisements about BI capabilities on television (e.g., IBM's Smarter Planet campaign) and in major consumer magazines (e.g., SAP and SAS Institute ads) reinforces the maturity of BI as a mainstream market. BI is now front and center on the radar screen of most CIOs, if not CEOs, who want to better leverage information to make smarter decisions and gain a lasting competitive advantage.

The Future. At this point, some might wonder if there is much headroom left in the BI market. The last 20 years have witnessed a dizzying array of technology innovations, products, and methodologies. It can't continue at this pace, right? Yes and no. The BI market has surprised us in the past. Even in recent years as the BI market consolidated--with big software vendors acquiring nimble innovators--we've seen a tremendous explosion of innovation. BI entrepreneurs see a host of opportunities, from self-service BI tools that are more visual and intuitive to use to mobile and cloud-based BI offerings that are faster, better, and cheaper than current offerings. Search vendors are making a play for BI, as are platform vendors that promise data center scalability and availability for increasingly mission-critical BI loads. And we still need better tools and approaches for querying and analyzing unstructured content (e.g., documents, email, clickstream data, Web pages) and for delivering data faster as our businesses increasingly compete on velocity and our data volumes become too large to fit inside shrinking batch windows.

Next week, Beye Research will publish a report of mine that describes a new BI Delivery Framework for the next ten years. In that report, I describe a future-state BI environment that contains not just one intelligence (i.e., business intelligence) but four intelligences (analytic, continuous, and content intelligence, in addition to business intelligence) that BI organizations will need to support or interoperate with in the near future. Stay tuned!

Posted March 18, 2011 2:29 PM
Permalink | No Comments |
