
Blog: Wayne Eckerson

Wayne Eckerson

Welcome to Wayne's World, my blog that illuminates the latest thinking about how to deliver insights from business data and celebrates out-of-the-box thinkers and doers in the business intelligence (BI), performance management and data warehousing (DW) fields. Tune in here if you want to keep abreast of the latest trends, techniques, and technologies in this dynamic industry.

About the author

Wayne has been a thought leader in the business intelligence field since the early 1990s. He has conducted numerous research studies and is a noted speaker, blogger, and consultant. He is the author of two widely read books: Performance Dashboards: Measuring, Monitoring, and Managing Your Business (2005, 2010) and The Secrets of Analytical Leaders: Insights from Information Insiders (2012).

Wayne is founder and principal consultant at Eckerson Group, a research and consulting company focused on business intelligence, analytics, and big data.

Recently in Analytics Category

We all know that people, process, and technology are the keys to unlocking the business value of information technology. Although many organizations know how to set up and manage technology projects, they are less adept at setting up and managing their human resources.

Although there are no hard and fast rules about how to implement a BI Center of Excellence, top performing business intelligence programs usually adopt a common structure. After interviewing dozens of BI leaders for my recently published book, Secrets of Analytical Leaders: Insights from Information Insiders, I began to see that most BI Centers of Excellence have a tripartite structure consisting of an executive team, a business team, and a technical team. (See figure 1.)

Figure 1. BI Center of Excellence
BI CoE.jpg


The executive team consists of line-of-business heads who sponsor and fund BI projects for their units. Also known as an executive committee or steering group, the executive team usually meets monthly at first and quarterly once the BI program gets established. Although its ostensible job is to review and approve the BI roadmap, allocate funds, and prioritize development, its primary purpose is to manage the politics that surround any BI program that proves it can deliver business value quickly. An executive team earns its pay by balancing the parochial interests of its individual members with the global interests of the company. By running political interference, the executive team frees the BI team to work free of internecine distractions.


BI director. The business team is run by a director of BI (or analytics) who oversees the entire BI program, ranging from data warehousing to business intelligence to analytics and big data. The BI director sits outside of the IT department and reports to a C-level business executive, usually a COO, CFO, or CIO. This reporting structure is critical to ensure the success of the BI program. Unlike other types of information technologies, BI needs to straddle business and IT to ensure it aligns with ever-changing business requirements.

A BI Center of Excellence requires a strong and capable leader to succeed. My book details the characteristics of such leaders. In essence, they must be "purple people"--not "blue" in the business or "red" in IT, but a perfect blend of the two, hence "purple." They talk the language of both worlds and build bridges that eliminate the "us versus them" mentality that exists in many organizations. For instance, they excel at creating teams of business and technical people who sit side by side and work together to deliver rich BI solutions. In short, these leaders are the glue that binds all the components of a BI Center of Excellence together.

BOBI. Assisting the BI director are several business-oriented BI (BOBI) professionals. The primary purpose of this BOBI team is to develop and evangelize the BI strategy and coordinate its development with the BI technical team (see below). The BOBI team identifies people doing BI work in business units and establishes relationships with them. It often recruits them to serve on a BI working committee that acts as an extension of the BOBI team and helps develop the BI strategy, troubleshoot problems, select tools, and manage the company's report portfolio. In addition, the BOBI team defines and documents BI best practices, oversees data governance programs, and gathers requirements for major BI projects.

Embedded developers and analysts. The business team also consists of report developers (i.e. super users) and business analysts who work inside a business department. These embedded developers and analysts sit with business people, participate in all their meetings, and are considered full-fledged members of the business team. Although these developers and analysts usually report to the line-of-business head, they usually have a dotted line relationship to the BI director and meet regularly with their peers in other business units to share ideas and collaborate on cross-departmental issues or initiatives. These may be the same individuals who serve on the BOBI team's working committee described above.

Statisticians. The business team also consists of statisticians (or data scientists) who develop analytical models that describe patterns in large data sets and predict outcomes. In small organizations, statisticians typically reside in a central group since individual departments usually don't have enough work to keep a statistician busy all the time. In large organizations, statisticians are typically embedded in departments but report directly to a director of analytics. Even more so than business analysts, statisticians need an affiliation with a central group that fosters collaboration, continuing education, and career development.


The technical BI team consists of data and technical architects, ETL and BI developers, data and DW administrators, requirements specialists, quality assurance testers, trainers, and technical writers, among others. These folks are responsible for implementing the strategy established by the BOBI team and its departmental surrogates. In essence, the technical BI team builds and maintains the organization's enterprise data warehouse and associated data marts as well as any complex reports and dashboards that require skilled programmers. It also implements data definitions and rules within BI tools, data models, ETL tools, and data quality tools and works closely with data center specialists, such as database administrators, to ensure the BI environment delivers adequate scalability and performance.

Like the BOBI team, the technical BI team sits outside of the IT department and reports to the director of BI. Occasionally, however, the technical team resides within IT while the BOBI team resides outside of it. To succeed with this type of hybrid structure requires that the director of BI and the director of IT maintain a close working relationship with constant communication.


Although there are infinite ways to organize a BI team, best-in-class organizations develop a tripartite organizational structure consisting of executive, business, and technical teams. The director of BI (or analytics) is the glue that holds these three teams together and must possess strong business and technical skills. Ideally, the business and technical teams reside outside of IT to align more closely with the business. These teams ensure further alignment by embedding report developers and analysts (and sometimes statisticians) within business departments. However, to ensure continuity and cross-departmental coordination, these embedded developers and analysts also maintain a reporting relationship with the director of BI and often serve on a BOBI working committee that supports BI deployment for the entire organization.

Author's Note: If you would like more information about how to organize and motivate BI and analytical professionals, my book contains several chapters on these topics: Secrets of Analytical Leaders: Insights from Information Insiders

Posted November 14, 2012 11:22 AM

(Note: This is the sixth and final article in a series on advanced analytics.)

Model-making is at the heart of advanced analytics. Thankfully, few of us need to create analytical models or learn the statistical techniques upon which they're based. However, any self-respecting business intelligence (BI) professional needs to understand the modeling process so he can better support the data requirements of analytical modelers.

Analytical Models

An analytical model is simply a mathematical equation that describes relationships among variables in a historical data set. The equation either estimates or classifies data values. In essence, a model draws a "line" through a set of data points that can be used to predict outcomes. For example, a linear regression draws a straight line through data points on a scatterplot that shows the impact of advertising spend on sales for various ad campaigns. The model's formula--in this case, "Sales = 17.813 + (0.0897 * advertising spend)"--enables executives to accurately estimate sales if they spend a specific amount on advertising. (See figure 1.)

Figure 1. Estimation Model (Linear Regression)
Linear regression.jpg
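A regression like the one in Figure 1 can be sketched in a few lines of Python. The ad-spend and sales figures below are invented for illustration, so the fitted coefficients will not match the article's formula exactly:

```python
import numpy as np

# Hypothetical ad spend ($K) and sales ($K) for past campaigns
ad_spend = np.array([10, 25, 40, 55, 70, 85, 100], dtype=float)
sales = np.array([19, 20, 21, 23, 24, 25, 27], dtype=float)

# Fit a straight line: sales = intercept + slope * ad_spend
slope, intercept = np.polyfit(ad_spend, sales, 1)

# Estimate sales for a planned $60K ad spend
estimate = intercept + slope * 60
print(f"sales = {intercept:.3f} + ({slope:.4f} * ad_spend)")
print(f"estimated sales at $60K spend: {estimate:.1f}")
```

The fitted line lets an executive plug in a planned spend and read off an estimated sales figure, which is exactly how the figure's formula is meant to be used.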

Algorithms that create analytical models (or equations) come in all shapes and sizes. Classification algorithms, such as neural networks, decision trees, clustering, and logistic regression, use a variety of techniques to create formulas that segregate data values into groups. Online retailers often use these algorithms to create target market segments or determine which products to recommend to buyers based on their past and current purchases. (See figure 2.)

Figure 2. Classification Algorithms

Classification models separate data values into logical groups.
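As a rough illustration of how classification algorithms segregate data values into groups, here is a minimal nearest-centroid classifier in Python. It is a much simpler stand-in for the neural networks and decision trees named above, and the shopper data is invented:

```python
import numpy as np

# Hypothetical shoppers: [annual_visits, avg_basket_size_$]
X = np.array([[2, 15], [3, 20], [4, 18], [30, 120], [25, 95], [40, 150]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])  # 0 = bargain hunter, 1 = loyal high spender

# Summarize each class by its mean point; new shoppers join the
# segment whose centroid is closest
centroids = np.array([X[y == k].mean(axis=0) for k in (0, 1)])

def classify(point):
    return int(np.argmin(np.linalg.norm(centroids - point, axis=1)))

preds = [classify(np.array(p)) for p in ([5, 25], [28, 110])]
print(preds)  # [0, 1]
```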

Trusting Models. Unfortunately, some models are more opaque than others; that is, it's hard to understand the logic the model used to identify relevant patterns and relationships in the data. The problem with these "black box" models is that business people often have a hard time trusting them until they see quantitative results, such as reduced costs or higher revenues. Getting business users to understand and trust the output of analytical models is perhaps the biggest challenge in data mining.

To earn trust, analytical models have to validate a business person's intuitive understanding of how the business operates. In reality, most models don't uncover brand new insights; rather they unearth relationships that people understand as true but aren't looking at or acting upon. The models simply refocus people's attention on what is important and true and dispel assumptions (whether conscious or unconscious) that aren't valid.

Modeling Process

Given the power of analytical models, it's important that analytical modelers take a disciplined approach. Analytical modelers need to adhere to a methodology to work productively and generate accurate models. The modeling process consists of six distinct tasks:

  1. Define the project

  2. Explore the data

  3. Prepare the data

  4. Create the model

  5. Deploy the model

  6. Manage the model

Interestingly, preparing the data is the most time-consuming part of the process, and if not done right, can torpedo the analytical model and project. "[Data preparation] can easily be the difference between success and failure, between usable insights and incomprehensible murk, between worthwhile predictions and useless guesses," writes Dorian Pyle in his book, "Data Preparation for Data Mining."

Figure 3 shows a breakdown of the time required for each of these six steps. Data preparation consumes one-quarter (25%) of an analytical modeler's time, followed by model creation (23%), data exploration (18%), project definition (13%), scoring and deployment (12%), and model management (9%). Thus, almost half of an analytical modelers' time (43%) is spent exploring and preparing data, although this varies based on the condition and availability of data. Analytical modelers are like house painters who must spend lots of time preparing a paint surface to ensure a long-lasting paint finish.

Figure 3. Analytical Modeling Tasks
Modeling Steps.jpg

From Wayne Eckerson, "Predictive Analytics: Extending the Value of Your Data Warehousing Investment," 2007. Based on 166 respondents who have a predictive modeling practice.

Project Definition. Although defining an analytical project doesn't take as long as some of the other steps, it's the most critical task in the process. Modelers that don't know explicitly what they're trying to accomplish won't be able to create useful analytical models. Thus, before they start, good analytical modelers spend a lot of time defining objectives, impact, and scope.

Project objectives consist of the assumptions or hypotheses that a model will evaluate. Often, it helps to brainstorm hypotheses and then prioritize them based on business requirements. Project impact defines the model output (e.g., a report, a chart, or scoring program), how the business will use that output (e.g., embedded in a daily sales report or operational application or used in strategic planning), and the projected return on investment. Project scope defines who, what, where, when, why, and how of the project, including timelines and staff assignments.

For example, a project objective might be: "Reduce the number of false positives when scanning credit card transactions for fraud." The output might be: "A computer model capable of running on a server, scoring 7,000 transactions per minute with probability and confidence, and routing transactions above a certain threshold to an operator for manual intervention."

Data Exploration. Data exploration or data discovery involves sifting through various sources of data to find the data sets that best fit the project. During this phase, the analytical modeler will document each potential data set with the following items:

  • Access methods: Source systems, data interfaces, machine formats (e.g. ASCII or EBCDIC), access rights, and data availability.
  • Data characteristics: Field names, field lengths, content, format, granularity and statistics (e.g. counts, mean, mode, median, and min/max values)
  • Business rules: Referential integrity rules, defaults, other business rules
  • Data pollution: Data entry errors, misused fields, bogus data
  • Data completeness: Empty or missing values, sparsity
  • Data consistency: Labels and definitions

Typically, an analytical modeler will compile all this information into a document and use it to help prioritize which data sets to use for which variables. (See figure 4.) A data warehouse with well documented metadata can greatly accelerate the data exploration phase because it also maintains much of this information. However, analytical modelers often want to explore external data and other data sets that don't exist in the data warehouse and must compile this information manually.

Figure 4. Data Profile Document
Data Profile Document.jpg
A data profile document describes the properties of a potential data set.

Data Preparation. Once analytical modelers document and select their data sets, they must standardize and enrich the data. First, this means correcting any data errors and standardizing the machine format (e.g., ASCII vs. EBCDIC). Then, it involves merging and flattening the data into a single wide table, which may consist of hundreds of variables (i.e., columns). Finally, it means enriching the data with third-party data, such as demographic, psychographic, or behavioral data that can enhance the models.
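The merge-and-flatten step can be sketched with pandas. The tables below are hypothetical stand-ins for real source systems, with a third-party demographics feed joined in for enrichment:

```python
import pandas as pd

# Hypothetical source tables identified during data exploration
customers = pd.DataFrame({"cust_id": [1, 2], "region": ["East", "West"]})
orders = pd.DataFrame({"cust_id": [1, 1, 2], "amount": [50.0, 70.0, 20.0]})
# Third-party enrichment data (invented)
demographics = pd.DataFrame({"cust_id": [1, 2], "income_band": ["high", "low"]})

# Flatten transactions to one row per customer, then merge everything
# into a single wide table suitable for modeling
order_summary = orders.groupby("cust_id")["amount"].agg(["sum", "count"]).reset_index()
wide = customers.merge(order_summary, on="cust_id").merge(demographics, on="cust_id")
print(wide)
```

In practice the resulting table would be far wider, with hundreds of derived and purchased columns per customer.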

From there, analytical modelers transform the data so it's in an optimal form to address project objectives and meet processing requirements for specific machine learning techniques. Common transformations include summarizing data using reverse pivoting (see figure 5), transforming categorical values into numerical values, normalizing numeric values so they range from 0 to 1, consolidating continuous data into a finite set of bins or categories, removing redundant variables, and filling in missing values.

Modelers try to eliminate variables and values that aren't relevant as well as fill in empty fields with estimated or default values. In some cases, modelers may want to increase the bias or skew in a data set by duplicating outliers, giving them more weight in the model output. These are just some of the many data preparation techniques that analytical modelers use.
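A few of these preparation techniques (filling missing values with a default, min-max normalization to the 0-1 range, and binning a continuous variable) can be sketched with pandas on made-up data:

```python
import pandas as pd

# Invented customer records with one missing age
df = pd.DataFrame({"age": [22.0, 35.0, None, 58.0],
                   "income": [30_000, 52_000, 75_000, 120_000]})

# Fill missing values with a default (here, the column median)
df["age"] = df["age"].fillna(df["age"].median())

# Normalize a numeric column to the 0-1 range (min-max scaling)
df["income_norm"] = (df["income"] - df["income"].min()) / (
    df["income"].max() - df["income"].min())

# Consolidate a continuous variable into a finite set of bins
df["age_bin"] = pd.cut(df["age"], bins=[0, 30, 50, 100],
                       labels=["young", "middle", "senior"])
print(df)
```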

Figure 5. Reverse Pivoting
Reverse Pivoting.jpg
To model a banking "customer" rather than bank transactions, analytical modelers use a technique called reverse pivoting, which summarizes banking transactions to show customer activity by period.
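A reverse pivot like the one in Figure 5 can be sketched with pandas' `pivot_table`, turning invented transaction rows into one row per customer with a column per period:

```python
import pandas as pd

# Hypothetical bank transactions: one row per transaction
tx = pd.DataFrame({
    "cust_id": [1, 1, 1, 2, 2],
    "quarter": ["Q1", "Q1", "Q2", "Q1", "Q2"],
    "amount":  [100.0, 50.0, 75.0, 200.0, 25.0],
})

# Reverse pivot: one row per customer, one column per period,
# with transaction amounts summed into each cell
per_customer = tx.pivot_table(index="cust_id", columns="quarter",
                              values="amount", aggfunc="sum", fill_value=0.0)
print(per_customer)
```

The result is the wide, flat, one-row-per-entity shape that most modeling algorithms expect.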

Analytical Modeling. Analytical modeling is as much art as science. Much of the craft involves knowing what data sets and variables to select and how to format and transform the data for specific data models. Often, a modeler will start with 100+ variables and then, through data transformation and experimentation, winnow them down to 12 to 20 variables that are most predictive of the desired outcome.

In addition, an analytical modeler needs to select historical data that has enough of the "answers" built into it with a minimal amount of noise. Noise consists of patterns and relationships that have no business value, such as a person's birth date and age, which are perfectly correlated. A data modeler will eliminate one of those variables to reduce noise. In addition, modelers validate their models by testing them against random subsets of the data that they set aside in advance. If the scores remain consistent across training, testing, and validation data sets, then they know they have a fairly accurate and relevant model.
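The birth-date-versus-age noise example and the hold-out split can be illustrated with NumPy on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
birth_year = rng.integers(1950, 2000, size=n).astype(float)
age = 2011 - birth_year  # derived variable: perfectly (negatively) correlated

# The two variables carry identical information, so drop one of them
corr = np.corrcoef(birth_year, age)[0, 1]
print(f"correlation: {corr:.2f}")

# Set aside a random hold-out subset before any modeling
idx = rng.permutation(n)
train_idx, holdout_idx = idx[:700], idx[700:]
```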

Finally, the modeler must choose the right analytical techniques and algorithms or combinations of techniques to apply to a given hypothesis. This is where modelers' knowledge of business processes, project objectives, corporate data, and analytical techniques come into play. They may need to try many combinations of variables and techniques before they generate a model with sufficient predictive value.

Every analytical technique and algorithm has its strengths and weaknesses, as summarized in the tables below. The goal is to pick the right modeling technique so you have to do as little preparation and transformation as possible, according to Michael Berry and Gordon Linoff in their book, "Data Mining Techniques: For Marketing, Sales, and Customer Support."

Table 1. Analytical Models
Table 1.jpg

Table 2. Analytical Techniques
Table 2.jpg

Deploy the Model. Model deployment takes many forms, as mentioned above. Executives can simply look at the model, absorb its insights, and use them to guide their strategic or operational planning. But models can also be operationalized. The most basic way to operationalize a model is to embed it in an operational report. For example, a daily sales report for a telecommunications company might list each sales representative's customers by their propensity to churn. Or a model might be applied at the point of customer interaction, whether at a branch office or at an online checkout counter.

To apply models, you first have to score all the relevant records in your database. This involves converting the model into SQL or some other program that can run inside the database that holds the records you want to score. Scoring involves running the model against each record and generating a numeric value, usually between 0 and 1, which is then appended to the record as an additional column. A higher score generally means a higher propensity to exhibit the desired or predicted behavior. Scoring is usually a batch process that happens at night or on the weekend, depending on the volume of records that need to be scored. However, scoring can also happen in real time, which is essentially what online retailers do when they make real-time recommendations based on purchases a customer just made.
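A sketch of the batch-scoring pattern with pandas. The scoring function here is an invented stand-in for a real deployed model, and the field names are hypothetical:

```python
import pandas as pd

# Hypothetical nightly batch of records to score
records = pd.DataFrame({"cust_id": [1, 2, 3],
                        "tenure_months": [2, 48, 9],
                        "monthly_spend": [10.0, 80.0, 25.0]})

def churn_score(row):
    """Stand-in for a deployed model: returns a 0-1 propensity score."""
    raw = 0.9 - 0.015 * row["tenure_months"] - 0.002 * row["monthly_spend"]
    return max(0.0, min(1.0, raw))

# Score every record and append the result as a new column
records["churn_score"] = records.apply(churn_score, axis=1)

# Route records above a threshold for follow-up
flagged = records[records["churn_score"] > 0.5]
print(flagged["cust_id"].tolist())  # [1, 3]
```

In production this logic would typically be translated into SQL or a scoring engine running inside the database itself, rather than pulled out into a client-side script.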

Model Management. Once the model is built and deployed, it must be maintained. Models become obsolete over time, as the market or environment in which they operate changes. This is particularly true for volatile environments, such as customer marketing or risk management. Also, complex models that deliver high business value usually require a team of people to create, modify, update, and certify the models.

In such an environment, it's critical to have a model repository that can track versions, audit usage, and manage a model through its lifecycle. Once an organization has more than one operational model, it's imperative to implement model management utilities, which most data mining vendors now support.


Analytical models can be powerful. They can help organizations use information proactively instead of reactively. They can make predictions that streamline business processes, reduce costs, increase revenues, and improve customer satisfaction.

Creating analytical models is as much art as science. A well-trained modeler needs to step through a variety of data-oriented tasks to create accurate models. Much of the heavy lifting involved in creating analytical models involves exploring and preparing the data. A well-designed data warehouse or data mart can accelerate the modeling process by collecting and documenting a large portion of the data that modelers require and transforming that data into wide, flat tables conducive to the modeling process.

Posted November 29, 2011 1:42 PM

The previous two articles in this series covered the organizational and technical factors required to succeed with advanced analytics. But as with most things in life, the hardest part is getting started. This final article shows how to kickstart an analytics practice and rev it into high gear.

The problem with selling an analytics practice is that most business executives who would support and fund the initiative haven't heard of the term. Some will think it's another IT boondoggle in the making and will politely deny or put off your request. You're caught in the chicken-or-egg riddle: it's hard to sell the value of analytics until you've shown tangible results. But you can't deliver tangible results until an executive buys into the program.

Of course, you may be fortunate to have enlightened executives who intuitively understand the value of analytics and are coming to you to build a practice. That's a nice fairy tale. Even with enlightened executives, you still need to prove the value of the technology and, more importantly, your ability to harness it. Even in a best-case scenario, you get one chance to prove yourself.

So, here are ten steps you can take to jumpstart an analytics practice, whether you are working at the grassroots level or at the behest of an eager senior executive.

1. Find an Analyst. This seems too obvious to state, but it's hard to do in practice. Good analysts are hard to come by. They combine a unique knowledge of business process, data, and analytical tools. As people, they are critical thinkers who are inquisitive, doggedly persistent, and passionate about what they do. Many analysts have M.B.A. degrees or trained as social scientists, statisticians, or Six Sigma practitioners. Occasionally, you'll be able to elevate a precocious data analyst or BI report developer into the role.

2. Find an Executive. Good sponsors are almost as rare as good analysts. A good sponsor is someone who is willing to test long-held assumptions using data. For instance, event companies mail their brochures 12 weeks before every conference. Why? No one knows; it's the way it's always been done. But maybe they could get a bigger lift from their marketing investments if they mailed the brochures 11 or 13 weeks out, or shifted some of their marketing spend from direct mail to email and social media channels. A good sponsor is willing to test such assumptions.

3. Focus Your Efforts. If you've piqued an executive's interest, then explain what resources you need, if any, to conduct a test. But don't ask for much, because you don't need much to get going. Ideally, you should be able to make do with people and tools you have inhouse. A good analyst can work miracles with Excel and SQL and there are many open source data mining packages on the market today as well as low cost statistical add-ins to Excel and BI tools. Select a project that is interesting enough to be valuable to the company, but small enough to minimize risk.

4. Talk Profits. It's very important to remember that your business sponsor won't trust your computer model. They will go with their gut instinct rather than rely on a mathematical model to make a major decision. They will only trust the model if it shows either tangible lift (i.e., more revenues or profits), or it validates their own experience and knowledge. For example, the head of marketing for an online retailer will trust a market basket model if he realizes that the model has detected purchasing habits of corporate procurement officers who buy office items for new hires.

5. Act on Results. There is no point creating analytical models if the business doesn't act on them. There are many ways to make models actionable. You can present the results to executives whose go-to-market strategies might be shaped by the findings. Or you can embed the models in a weekly churn report distributed to sales people that indicates which customers are likely to attrite in the near future. (See figure 1.) Or you can embed models in operational applications so they are triggered by new events (e.g., a customer transaction) and automatically spit out recommendations (e.g., cross-sell offers.)

Figure 1. An Actionable Report
Part V - Actionable Report.jpg

6. Make it Useful. The models not only should be actionable, they should be proactive. The worst thing you can do is tell a salesperson something they already know. For instance, if the model says, "This customer is likely to churn because they haven't purchased anything in 90 days", a salesperson is likely to say, "Duh, tell me something I don't already know." A better model would be one that detects patterns not immediately obvious to the salesperson. For example, "This customer makes frequent purchases but their overall monthly expenditures have dropped ten percent since the beginning of the year."
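The "tell me something I don't already know" pattern above can be sketched as a simple pandas check on hypothetical monthly spend: customers who are still buying, but whose spend has quietly dropped since the start of the year.

```python
import pandas as pd

# Invented monthly spend per customer for January and June
spend = pd.DataFrame({
    "cust_id": [1, 2],
    "jan_spend": [1000.0, 500.0],
    "jun_spend": [850.0, 510.0],
})

# Flag the non-obvious pattern: still purchasing, but overall
# spend is down 10% or more since the beginning of the year
spend["pct_change"] = (spend["jun_spend"] - spend["jan_spend"]) / spend["jan_spend"]
at_risk = spend[spend["pct_change"] <= -0.10]
print(at_risk["cust_id"].tolist())  # [1]
```

A real model would of course use more periods and more variables, but the point stands: the flag fires on a trend the salesperson can't see from any single month's numbers.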

7. Consolidate Data. Too often, analysts play the role of IT manager by accessing, moving, and transforming data before they begin to analyze it. Although the DW team will never be able to identify and consolidate all the data that analysts might need, it can always do a better job understanding their requirements and making the right data available at the right level of granularity. This might require purchasing demographic data and creating the specialized wide, flat tables preferred by modelers. It might also mean supporting specialized analytical functions inside the database that let the modelers profile, prepare, and model data.

8. Unlock Your Data. Unfortunately, most IT managers don't provide analysts ready access to corporate data for fear that their SQL queries will grind an operational system or data warehouse to a halt. To balance access and performance, IT managers should create an analytical sandbox that enables modelers to upload their own data and mix it with corporate data in the warehouse. These sandboxes can be virtual table partitions inside the data warehouse or dedicated analytical machines that contain a replica of corporate data or an entirely new data set. In either case, the modelers get free and open access to data and IT managers get to worry less about resource contention.

9. Govern Your Data. Because analysts are so versatile with data, they often get pulled in multiple directions. The lowest value-added activity they perform is creating ad hoc queries for business colleagues. This type of work is better left to super users in each department. But to prevent super users from generating thousands of duplicate or conflicting reports, the BI team needs to establish a report governance committee that evaluates requests for new reports, maps them to the existing inventory, and decides which ones to build or roll into existing report structures. Ideally, the report governance committee is composed of super users who are already creating most of the reports users consume.

10. Centralize Analysts. It's imperative that analysts feel part of a team and not isolated in some departmental silo. An Analytics Center of Excellence can help build camaraderie among analysts, cross train them in different disciplines and business processes, and mentor new analysts. A director of analytics needs to prioritize analytics projects, cultivate an analytics mindset in the corporation, and maintain a close alliance with the data warehousing team. In fact, it's best if the director of analytics also has responsibility for the data warehouse. Ideally, 80% to 90% of analysts are embedded in the departments where they work side by side with business users and the rest reside at corporate headquarters where they focus on cross-departmental initiatives.


Although some of the steps defined above are clearly for novices, even analytics teams that are more advanced still struggle with many of the items. To succeed with analytics ultimately requires a receptive culture, top-notch people (i.e., analysts), comprehensive and clean data, and the proper tools. Success will not come quickly but takes a sustained effort. But the payoff, when it comes, is usually substantial.

Posted November 21, 2011 7:34 AM

The prior article in this series discussed the human side of analytics. It explained how companies need to have the right culture, people, and organization to succeed with analytics. The flip side is the "hard stuff"--the architecture, platforms, tools, and data--that makes analytics possible. Although analytical technology gets the lion's share of attention in the trade press--perhaps more than it deserves for the value it delivers--it nonetheless forms the bedrock of all analytical initiatives. This article examines the architecture, platforms, tools, and data needed to deliver robust analytical solutions.


The term "analytical architecture" is an oxymoron. In most organizations, business analysts are left to their own devices to access, integrate, and analyze data. By necessity, they create their own data sets and reports outside the purview and approval of corporate IT. By definition, there is no analytical architecture in most organizations--just a hodge-podge of analytical silos and spreadmarts, each with conflicting business rules and data definitions.

Analytical sandboxes. Fortunately, with the advent of specialized analytical platforms (discussed below), BI architects have more options for bringing business analysts into the corporate BI fold. They can use these high-powered database platforms to create analytical sandboxes for the explicit use of business analysts. These sandboxes, when designed properly, give analysts the flexibility they need to access corporate data at a granular level, combine it with data that they've sourced themselves, and conduct analyses to answer pressing business questions. With analytical sandboxes, BI teams can transform business analysts from data pariahs to full-fledged members of the BI community.

There are four types of analytical sandboxes:

  • Staging Sandbox. This is a staging area for a data warehouse that contains raw, non-integrated data from multiple source systems. Analysts generally prefer to query a staging area that contains all the raw data rather than query each source system individually. Hadoop is a staging area for large volumes of unstructured data that a growing number of companies are adding to their BI ecosystems.

  • Virtual Sandbox. A virtual sandbox is a set of tables inside a data warehouse assigned to individual analysts. Analysts can upload data into the sandbox and combine it with data from the data warehouse, giving them one place to go to do all their analyses. The BI team needs to carefully allocate compute resources so analysts have enough horsepower to run ad hoc queries without interfering with other workloads running on the data warehouse.

  • Free-standing sandbox. A free-standing sandbox is a separate database server that sits alongside a data warehouse and contains its own data. It's often used to offload complex, ad hoc queries from an enterprise data warehouse and give business analysts their own space to play. In some cases, these sandboxes contain a replica of data in the data warehouse, while in others, they support entirely new data sets that don't fit in a data warehouse or run faster on an analytical platform.

  • In-memory BI sandbox. Some desktop BI tools maintain a local data store, either in memory or on disk, to support interactive dashboards and queries. Analysts love these types of sandboxes because they connect to virtually any data source and enable analysts to model data, apply filters, and visually interact with the data without IT intervention.
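To make the virtual sandbox concrete, here is a minimal sketch that uses Python's built-in sqlite3 module as a stand-in for a warehouse database; the table and column names are illustrative assumptions, not drawn from any particular product. An analyst loads self-sourced data into a private sandbox table and joins it against warehouse data in a single query:

```python
import sqlite3

# In-memory database stands in for the warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Warehouse fact table, managed by the BI team.
cur.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [(1, 120.0), (1, 80.0), (2, 200.0)])

# Analyst's sandbox table: data they sourced themselves.
cur.execute("CREATE TABLE sandbox_segments (customer_id INTEGER, segment TEXT)")
cur.executemany("INSERT INTO sandbox_segments VALUES (?, ?)",
                [(1, "loyal"), (2, "new")])

# One place to combine corporate data with the analyst's own data.
cur.execute("""
    SELECT s.segment, SUM(f.amount) AS revenue
    FROM sales f
    JOIN sandbox_segments s ON f.customer_id = s.customer_id
    GROUP BY s.segment
    ORDER BY s.segment
""")
rows = cur.fetchall()
print(rows)  # [('loyal', 200.0), ('new', 200.0)]
```

In a real deployment the sandbox tables would live in a schema with storage and compute quotas, which is exactly the resource-allocation concern noted above.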

Next-Generation BI Architecture. Figure 1 depicts a BI architecture with the four analytical sandboxes colored in green. The top half of the diagram represents a classic top-down data warehousing architecture that primarily delivers interactive reports and dashboards to casual users (although the streaming/complex event processing (CEP) engine is new). The bottom half depicts a bottom-up analytical architecture with analytical sandboxes and new types of data sources. This next-generation BI architecture better accommodates the needs of business analysts and data scientists, making them full-fledged members of the corporate BI ecosystem.

Figure 1. The New BI Architecture

The next-generation BI architecture is more analytical, giving power users greater options to access and mix corporate data with their own data via various types of analytical sandboxes. It also brings unstructured and semi-structured data fully into the mix using Hadoop and nonrelational databases.

Analytical Platforms

Since the beginning of the data warehousing movement in the early 1990s, organizations have used general-purpose data management systems to implement data warehouses and, occasionally, multidimensional databases (i.e., "cubes") to support subject-specific data marts, especially for financial analytics. General-purpose data management systems were designed for transaction processing (i.e., rapid, secure, synchronized updates against small data sets) and only later modified to handle analytical processing (i.e., complex queries against large data sets). In contrast, analytical platforms focus entirely on analytical processing at the expense of transaction processing.

The analytical platform movement. In 2002, Netezza (now owned by IBM) introduced a specialized analytical appliance: a tightly integrated hardware-software database management system designed explicitly to run ad hoc queries against large volumes of data at blindingly fast speeds. Netezza's success spawned a host of competitors, and there are now more than two dozen players in the market. (See Table 1.)

Table 1. Types of Analytical Platforms

Today, the technology behind analytical platforms is diverse: appliances, columnar databases, in-memory databases, massively parallel processing (MPP) databases, file-based systems, nonrelational databases, and analytical services. What they all have in common, however, is that they provide significant improvements in price-performance, availability, load times, and manageability compared with general-purpose relational database management systems. Every analytical platform customer I've interviewed has cited order-of-magnitude performance gains that most initially didn't believe.

Moreover, many of these analytical platforms contain built-in analytical functions that make life easier for business analysts. These functions range from fuzzy matching algorithms and text analytics to data preparation and data mining functions. With functions in the database, analysts no longer have to craft complex, custom SQL or offload data to analytical workstations, which limits the amount of data they can analyze and model.
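As a rough illustration of pushing functions into the database, the sketch below registers a custom aggregate with Python's sqlite3 module (a stand-in for the richer built-in function libraries of commercial analytical platforms), so the statistic is computed where the data lives rather than after exporting rows. Table names and data are made up for the example:

```python
import sqlite3
import statistics

class StdDev:
    """Custom aggregate: sample standard deviation computed inside the database."""
    def __init__(self):
        self.values = []
    def step(self, value):
        self.values.append(value)
    def finalize(self):
        return statistics.stdev(self.values)

conn = sqlite3.connect(":memory:")
conn.create_aggregate("stdev", 1, StdDev)

conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("east", 10.0), ("east", 14.0), ("west", 9.0), ("west", 3.0)])

# The function travels to the data, not the data to the workstation.
rows = conn.execute(
    "SELECT region, stdev(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)
```

The design point is the same one the paragraph makes: the query returns two small summary rows instead of every order record.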

Companies use analytical platforms to support free-standing sandboxes (described above) or as replacements for data warehouses running on MySQL and SQL Server, and occasionally major OLTP databases from Oracle and IBM. They also improve query performance for ad hoc analytical tools, especially those that connect directly to databases to run queries (versus those that download data to a local cache.)

Analytical Tools

In 2010, after ten years of enhancing reporting and dashboard solutions for casual users, vendors turned their attention to meeting the needs of power users. As a result, the number of analytical tools on the market has exploded.

Analytical tools come in all shapes and sizes. Analysts generally need one of every type of tool. Just as you wouldn't hire a carpenter to build an addition to your house with just one tool, you don't want to restrict an analyst to just one analytical tool. Like a carpenter, an analyst needs a different tool for every type of job they do. For instance, a typical analyst might need the following tools:

  • Excel to extract data from various sources, including local files, create reports, and share them with others via a corporate portal or server (managed Excel).

  • BI search tools to issue ad hoc queries against a BI tool's metadata.

  • Planning tools (including Excel) to create strategic and tactical plans, each containing multiple scenarios.

  • Mashboards and ad hoc reporting tools to create ad hoc dashboards and reports on behalf of departmental colleagues.

  • Visual discovery tools to explore data in one or more sources and create interactive dashboards on behalf of departmental colleagues.

  • Multidimensional OLAP (MOLAP) tools to explore small and medium data sets dimensionally at the speed of thought and run complex dimensional calculations.

  • Relational OLAP tools to explore large data sets dimensionally and run complex calculations.

  • Text analytics tools to parse text data and put it in a relational structure for analysis.

  • Data mining tools to create descriptive and predictive models.

  • Hadoop and MapReduce to process large volumes of unstructured and semi-structured data in a parallel environment.
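The MapReduce style mentioned above can be sketched in plain Python. This is a toy, single-process illustration of the map/shuffle/reduce pattern, not the Hadoop API: the map phase emits key-value pairs, the shuffle groups them by key, and the reduce phase aggregates each group.

```python
from collections import defaultdict

def map_phase(record):
    # Emit (word, 1) for every word in a line of text.
    for word in record.lower().split():
        yield word, 1

def reduce_phase(key, values):
    # Aggregate all counts emitted for one key.
    return key, sum(values)

def map_reduce(records):
    groups = defaultdict(list)
    for record in records:                      # map + shuffle
        for key, value in map_phase(record):
            groups[key].append(value)
    return dict(reduce_phase(k, v) for k, v in groups.items())

lines = ["big data big insights", "big queries"]
counts = map_reduce(lines)
print(counts)  # {'big': 3, 'data': 1, 'insights': 1, 'queries': 1}
```

In Hadoop, the same two functions would run in parallel across many machines, with the framework handling the shuffle and fault tolerance.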

Figure 2. Types of Analytical Tools

Figure 2 plots these tools on a graph where the x axis represents calculation complexity and the y axis represents data volumes. Ad hoc analytical tools for casual users (or more realistically super users) are clustered in the bottom left corner of the graph, while ad hoc tools for power users are clustered slightly above and to the right. Planning and scenario modeling tools cluster further to the right, offering slightly more calculation complexity against small volumes of data. High-powered analytical tools, which generally rely on machine learning algorithms and specialized analytical databases, cluster in the upper right quadrant.


Business analysts function like one-man IT shops. They must access, integrate, clean, and analyze data, and then present it to other users. Figure 3 depicts the typical workflow of a business analyst. If an organization doesn't have a mature data warehouse that contains cross-functional data at a granular level, analysts often spend an inordinate amount of time sourcing, cleaning, and integrating data (steps 1 and 2 in the analyst workflow). They then create a multiplicity of analytical silos (step 5) when they publish data, much to the chagrin of the IT department.

Figure 3. Analyst Workflow

In the absence of a data warehouse that contains all the data they need, business analysts must function as one-man IT shops, spending an inordinate amount of time iterating between collecting, integrating, and analyzing data. They run into trouble when they distribute their hand-crafted data sets broadly.

Data Warehouse. The most important way that organizations can improve the productivity and effectiveness of business analysts is to maintain a robust data warehousing environment that contains most of the data that analysts need to perform their work. This can take many years. In a fast-moving market where the company adds new products and features continuously, the data warehouse may never catch up. But, nonetheless, it's important for organizations to continuously add new subject areas to the data warehouse, otherwise business analysts have to spend hours or days gathering and integrating this data themselves.

Atomic Data. The data warehouse also needs to house atomic data, or data at the lowest level of transactional detail, not summary data. Analysts generally want the raw data because they can repurpose it in many different ways depending on the nature of the business questions they're addressing. This is why highly skilled analysts like to access data directly from source systems or a data warehouse staging area. At the same time, less skilled analysts appreciate the heavy lifting done by the IT group to clean and integrate disparate data sets using common metrics, dimensions, and attributes. This base level of data standardization expedites their work.

Once a BI team integrates a sufficient number of subject areas in a data warehouse at an atomic level of data, business analysts can have a field day. Instead of downloading data to an analytical workstation, which limits the amount of data they can analyze and process, they can now run calculations and models against the entire data warehouse using analytical functions built into the database or that they've created using database development toolkits. This improves the accuracy of their analyses and models and saves them considerable time.


The technical side of analytics is daunting. There are many moving parts that all have to work together. However, the most important part of the technical equation is the data. The old adage holds true: "garbage in, garbage out." Analysts can't deliver accurate insights if they don't have access to good quality data, and it's a waste of their time to spend days preparing data for analysis. A good analytics program is built on a solid data warehousing foundation that embeds analytical sandboxes tailored to the requirements of individual analysts.

Posted November 15, 2011 7:44 AM

Advanced analytics promises to unlock hidden potential in organizational data. If that's the case, why have so few organizations embraced advanced analytics in a serious way? Most organizations have dabbled with advanced analytics, but outside of credit card companies, online retailers, and government intelligence agencies, few have invested sufficient resources to turn analytics into a core competency.

Advanced analytics refers to the use of machine learning algorithms to unearth patterns and relationships in large volumes of complex data. It's best applied to overcome various resource constraints (e.g., time, money, labor) where the output justifies the investment of time and money. (See "What is Analytics and Why Should You Care?" and "Advanced Analytics: Where Do You Start?")

Once an organization decides to invest in advanced analytics, it faces many challenges. To succeed with advanced analytics, organizations must have the right culture, people, organization, architecture, and data. (See figure 1.) This is a tall task. This article examines the "soft stuff" required to implement analytics--the culture, people, and organization--the first three dimensions of the analytical framework in figure 1. A subsequent article examines the "hard stuff"--the architecture, tools, and data.

Figure 1. Framework for Implementing Advanced Analytics

The Right Culture

Culture refers to the rules--both written and unwritten--for how things get done in an organization. These rules emanate from two places: 1) the words and actions of top executives and 2) organizational inertia and behavioral norms of middle management and their subordinates (i.e., "the way we've always done it.") Analytics, like any new information technology, requires executives and middle managers to make conscious choices about how work gets done.

Executives. For advanced analytics to succeed, top executives must first establish a fact-based decision making culture and then adhere to it themselves. Executives must consciously change the way they make decisions. Rather than rely on gut feel alone, executives must make decisions based on facts or intuition validated by data. They must designate authorized data sources for decision making and establish common metrics for measuring performance. They must also hold individuals accountable for outcomes at all levels of the organization.

Executives also need to evangelize the value and importance of fact-based decision making and the need for a performance-driven culture. They need to recruit like-minded executives and continuously reinforce the message that the organization "runs on data." Most importantly, they not only must "talk the talk," they must "walk the walk." They need to hold themselves accountable for performance outcomes and use certifiable information sources, not resort to their trusted analyst to deliver the data view they desire. Executives who don't follow their own rules send a cultural signal that this analytics fad will pass and so it's "business as usual."

Managers and Organizational Inertia. Mid-level managers often pose the biggest obstacle to implementing new information technologies because their authority and influence stem from their ability to control the flow of information, both up and down the organizational ladder. Mid-level managers have to buy into new ways of capturing and using information for the program to succeed. If they don't, they, too, will send the wrong signals to lower-level workers. To overcome organizational inertia, executives need to establish new incentives for mid-level managers and hold them accountable for performance metrics aligned with strategic goals for decision making and the use of information.

The Right People

It's impossible to do advanced analytics without analysts. That's obvious. But hiring the right analysts and creating an environment in which they can thrive is not easy.

Analysts are a rare breed. They are critical thinkers who need to understand a business process inside and out, along with the data that supports it. They also must be computer-literate and know how to use various data access, analysis, and presentation tools to do their jobs. Compared to other employees, they are generally more passionate about what they do, more committed to the success of the organization, more curious about how things work, and more eager to tackle new challenges.

But not all analysts do the same kind of work, and it's important to know the differences. There are four major types of analysts:

  • Super Users. These are tech-savvy business users who gravitate to the reporting and analysis tools deployed by the business intelligence (BI) team. They quickly become the "go to" people in each department for colleagues who need an ad hoc report or dashboard and don't want to wait for the BI team. While super users don't normally do advanced analytics, they play an important role because they offload ad hoc reporting requirements from more skilled analysts.

  • Business Analysts. These are Excel jockeys whom executives and managers call on to create and evaluate plans, crunch numbers, and generally answer any question that can't be addressed by a standard report or dashboard. With training, they can also create analytical models.

  • Analytical Modelers. These analysts have formal training in statistics and a data mining workbench, such as those from IBM (i.e., SPSS) or SAS. They build descriptive and predictive models that are the heart and soul of advanced analytics.

  • Data Scientists. These analysts specialize in analyzing unstructured data, such as Web traffic and social media. They write Java and other programs to run against Hadoop and NoSQL databases and know how to write efficient MapReduce jobs that run in "big data" environments.
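To give a flavor of the models that analytical modelers build, here is the simplest possible case: a one-variable predictive model fit by ordinary least squares in pure Python. The data is invented for illustration, and real modelers would of course use a workbench such as SPSS or SAS rather than hand-rolled code:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed-form solution)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x.
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Hypothetical data: marketing spend vs. revenue, in arbitrary units.
spend   = [1.0, 2.0, 3.0, 4.0]
revenue = [2.1, 3.9, 6.1, 7.9]
a, b = fit_line(spend, revenue)
print(round(a, 2), round(b, 2))  # prints: 0.1 1.96
```

The fitted model (predicted revenue = 0.1 + 1.96 x spend) is "predictive" in exactly the sense used above: it can score inputs the analyst hasn't seen yet.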

Where You Find Them. Most organizations struggle to find skilled analysts. Many super users and business analysts are self-taught Excel jockeys, essentially tech-savvy business people who aren't afraid to learn new software tools to do their jobs. Many business school graduates fill this role, often as a stepping stone to management positions. Conversely, a few business-savvy technologists can grow into this role, including data analysts and report developers who have a proclivity toward business and working with business people.

Analytical modelers and data scientists require more training and skills. These analysts generally have a background in statistics or number crunching. Statisticians with business knowledge or social scientists with computer skills tend to excel in these roles. Given advances in data mining workbenches, it's no longer critical that analytical modelers know how to write SQL or code in C, as in the past. Data scientists aren't so lucky, however. Since Hadoop is an early-stage technology, data scientists need to know the basics of parallel processing and how to write MapReduce jobs in Java and other languages. As such, they are in high demand right now.

The Right Organization

Business analysts play a key role in any advanced analytics initiative. Given the skills required to build predictive models, analysts are not cheap to hire or easy to retain. Thus, building the right analytical organization is key to attracting and retaining skilled analysts.

Today, most analysts are hired by department heads (e.g., finance, marketing, sales, or operations) and labor away in isolation at the departmental level. Unless given enough new challenges and opportunities for advancement, analysts are easy targets for recruiters.

Analytics Center of Excellence. The best way to attract and retain analysts is to create an Analytics Center of Excellence. This is a corporate group that oversees and manages all business analysts in an organization. The Center of Excellence provides a sense of community among analysts and enables them to regularly exchange ideas and knowledge. The Center also provides a career path for analysts so they are less tempted to look elsewhere to advance their careers. Finally, the Center pairs new analysts with veterans who can give them the mentoring and training they need to excel in their new position.

The key with an Analytics Center of Excellence is to balance central management with process expertise. Nearly all analysts should be embedded in departments and work side by side with business people on a daily basis. This enables analysts to learn business processes and data at a granular level while immersing the business in analytical techniques and approaches. At the same time, the analyst needs to work closely with other analysts in the organization to reinforce the notion that they are part of a larger analytical community.

The best way to accommodate these twin needs is to create a matrixed analytical team. Analysts should report directly to department heads and indirectly to a corporate director of analytics, or vice versa. In either case, analysts should physically reside in their assigned departments most or all days of the week, while participating in daily "stand up" meetings with other analysts to share ideas and issues, as well as regular off-site meetings to build camaraderie and develop plans. The corporate director of analytics needs to work closely with department heads to balance local and enterprise analytical requirements.


Advanced analytics is a technical discipline. Yet, some of the keys to its success involve non-technical facets, such as culture, people, and organization. For an analytics initiative to thrive in an organization, executives must create a fact-based decision making culture, hire the right people, and create an analytics center of excellence that attracts, trains, and retains skilled analysts.

Posted November 7, 2011 9:45 AM