Blog: Wayne Eckerson

Wayne Eckerson

Welcome to Wayne's World, my blog that illuminates the latest thinking about how to deliver insights from business data and celebrates out-of-the-box thinkers and doers in the business intelligence (BI), performance management and data warehousing (DW) fields. Tune in here if you want to keep abreast of the latest trends, techniques, and technologies in this dynamic industry.

About the author

Wayne has been a thought leader in the business intelligence field since the early 1990s. He has conducted numerous research studies and is a noted speaker, blogger, and consultant. He is the author of two widely read books: Performance Dashboards: Measuring, Monitoring, and Managing Your Business (2005, 2010) and The Secrets of Analytical Leaders: Insights from Information Insiders (2012).

Wayne is currently director of BI Leadership Research, an education and research service run by TechTarget that provides objective, vendor neutral content to business intelligence (BI) professionals worldwide. Wayne’s consulting company, BI Leader Consulting, provides strategic planning, architectural reviews, internal workshops, and long-term mentoring to both user and vendor organizations. For many years, Wayne served as director of education and research at The Data Warehousing Institute (TDWI) where he oversaw the company’s content and training programs and chaired its BI Executive Summit. He can be reached by email at

Stream processing has idled in the backwaters of the analytic market for years. But with the advent of Hadoop and new open source streaming tools, such as Storm, Spark, and Kafka, many companies are taking a closer look. And many stream processing tools are finally finding a home with the Internet of Things, in which consumer and commercial devices--from smartphones and household appliances to automobiles, utility meters, and medical equipment--emit millions of events per second and require specialized analytical systems to process them in real time.

Stream processing platforms, like SQLstream, provide both the horsepower and smarts to filter, aggregate, group, compare, and analyze large volumes of data in flight as well as visualize the results in real time. Telecommunications companies use SQLstream to monitor network performance, track service usage, and detect fraud in real time; oil and gas producers use it to monitor operations of drilling rigs, digital wells, and intelligent oil fields; and transportation companies use it to monitor traffic congestion, among many other things.

Compared to Storm and Spark, SQLstream is a complete enterprise platform for streaming analytics that can be deployed quickly without a large development effort. While the open source projects are free to download, they require considerable development talent and time to make work, especially in high-volume environments. Moreover, SQLstream, which gets its name from its use of continuous SQL to generate analytics, runs more efficiently, requiring far fewer servers and lower overall expenditure.
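The "continuous SQL" idea is easiest to grasp by analogy: instead of querying data at rest, a streaming engine keeps a window of recent events and updates its aggregates as new data flows past. Here is a minimal Python sketch of that kind of sliding-window aggregation (purely illustrative; this is my own toy example, not SQLstream's API):

```python
from collections import deque
from datetime import datetime, timedelta

def sliding_window_avg(events, window=timedelta(minutes=1)):
    """Yield (timestamp, running average) over a sliding time window.

    events: iterable of (timestamp, numeric value) pairs in time order.
    """
    buf = deque()
    total = 0.0
    for ts, value in events:
        buf.append((ts, value))
        total += value
        # Evict events that have fallen out of the window.
        while buf and buf[0][0] <= ts - window:
            _, old_val = buf.popleft()
            total -= old_val
        yield ts, total / len(buf)
```

A streaming SQL statement expressing the same logic would declare the window once and let the engine maintain it continuously, which is what makes the approach attractive for high-volume event feeds.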

As devices become more intelligent with the addition of sensors, product companies will need to invest in stream processing systems to make sense of the deluge of data. SQLstream is well positioned to capitalize on the emerging Internet of Things.

For more information, see

Posted September 16, 2014 11:22 AM
Permalink | 1 Comment |

Sometimes the best things come in small packages. If you want to create reports and dashboards directly from source data, then you might want to consider Entrinsik, a small business intelligence (BI) vendor with a venerable pedigree and highly satisfied customers.

Founded in 1984, Entrinsik got pulled into BI in 2002 when it built a Web reporting tool for customers using MultiValue databases, such as UniVerse and UniData, which store attributes as strings in the same table as the entity. This denormalized structure makes it difficult to generate reports quickly or easily. But Entrinsik cracked the MultiValue database nut and quickly became the go-to player for MultiValue reporting tools. Since then, Entrinsik has expanded into the SQL market. Its Informer product queries both SQL and MultiValue databases.
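For readers unfamiliar with MultiValue systems: a single field in a record can hold several values separated by "value mark" delimiter characters, which is why reporting tools must first flatten the data into rows. A minimal Python sketch of that flattening step (illustrative only; the delimiter shown is the conventional MultiValue value mark, and the function name is my own):

```python
VM = chr(253)  # value mark delimiter conventionally used by MultiValue databases

def explode_record(record_id, field):
    """Flatten one multivalued field into normalized (id, position, value) rows."""
    return [(record_id, i + 1, v) for i, v in enumerate(field.split(VM))]
```

A reporting tool like Informer has to perform this kind of explosion on the fly for every multivalued attribute it queries, which is what made MultiValue reporting hard before specialized tools arrived.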

As an operational BI product, Entrinsik complements traditional BI tools that primarily query data warehouses or other predefined dimensional or analytical structures. In many cases, Entrinsik becomes a customer's de facto BI tool because it is quick to set up, easy to use, priced affordably, and comes with superior customer service and support. Entrinsik is used heavily by community colleges and mid-size universities, as well as insurance and manufacturing companies.

As with most other BI tools, Entrinsik customers create a metadata layer, which requires about two to three weeks of effort. Once that is built, users can build ad hoc reports and dashboards by dragging and dropping objects onto a canvas and configuring them accordingly. Entrinsik also lets customers, many of whom are value-added resellers, customize the product via its Java plug-in architecture.

For more information, see

Posted September 1, 2014 12:05 PM
Permalink | No Comments |

In the 1980s, before the dawn of data warehousing, companies ran reports and queries directly against operational systems. But this analytical activity undermined the performance of core systems and created a rallying cry for a dedicated reporting and analysis system, known as the data warehouse.

But what if you could have your operational cake and analytics, too? That's the fundamental question that Cyberscience asked back in 1977 when it was founded. And while the industry chose data warehousing as the architectural solution to manage reporting and analytical workloads, Cyberscience went against the grain. It kept optimizing the federated query technology of its Cyberquery product to run against operational applications and databases without degrading performance of either.

Fast forward nearly 40 years and what's old is new again. As data warehouses teeter on the edge of the architectural abyss and hardware performance skyrockets, Cyberscience now finds itself on the cutting edge. By combining operational and analytical workloads, Cyberscience provides a compelling alternative to both data warehouses and operational data stores. With more than 5,000 customers and a top-rated customer satisfaction rating from BARC, a German research firm, Cyberscience is a long-time business intelligence (BI) vendor that has focused more on engineering and customer value than market share or mindshare.

The key to Cyberquery's success is a data dictionary that abstracts back-end resources and integrates with an optimized runtime engine that uses native connectors to speed access to most applications and databases. The dictionary automatically grabs metadata from the source databases and populates fields with plain English labels. Customers further customize the metadata layer to meet business requirements and enhance data security.
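Conceptually, such a data dictionary is just a mapping from cryptic source field names to business-friendly labels and types. A hypothetical sketch in Python (the entries and names here are invented for illustration; this is not Cyberquery's actual dictionary format):

```python
# Hypothetical dictionary entries: each raw source column maps to a friendly
# label and a data type, as a metadata layer might after scanning the source.
data_dictionary = {
    "CUST.NM": {"label": "Customer Name", "type": "string"},
    "ORD.DT":  {"label": "Order Date",    "type": "date"},
}

def friendly_columns(raw_columns):
    """Replace raw column names with plain English labels where known."""
    return [data_dictionary.get(c, {}).get("label", c) for c in raw_columns]
```

The point of the abstraction is that report authors only ever see the friendly labels, while the runtime engine translates back to native column names when it queries the source.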

Driving Cyberquery's query engine is a fourth generation report specification language (much like Information Builders' WebFOCUS language) that makes it easy for customers to build any type of report and customize the look and feel. More than half of Cyberscience's annual $15 million in revenue comes from independent software vendors who embed the product in their own applications or white label it as their own. And despite its vintage, Cyberquery offers modern charting components and a fairly contemporary look and feel.

For more information, see

Posted September 1, 2014 11:51 AM
Permalink | No Comments |

For many years, Logi Analytics carved out a profitable niche selling an easy-to-use report and dashboard development platform to information technology (IT) professionals. But its recent foray into analytics--with Logi Vision, a visual discovery tool that first shipped in January--targets business analysts.

Staying true to its commitment to ease of use, Logi Vision brings a fresh new perspective to analysis by making heavy use of heuristics. The tool makes it easy for business analysts to shape, analyze, and visualize data without IT assistance. For instance, the tool automatically identifies data types, concatenates related fields into a single data object (e.g. city, region, and country into location), rearranges columns based on relevance ratings, and suggests visualizations based on the contours of the data, among other things.

Besides its use of heuristics, Logi Vision supports other features that distinguish it from the growing crowd of visual discovery products. For one, Logi Vision offers unique collaboration features that are tailored to business analysts, including search-based discovery, favorites, and rankings. It also makes it easy for analysts to follow each other's work via customizable shared workspaces and activity streams (think Facebook news feed).

In addition, Logi Vision offers sophisticated binning capabilities that enable analysts to create custom groups from numeric and date values. It also sports a home-grown mapping engine that enables users to view maps while disconnected from the internet. Finally, Logi Vision integrates components from Logi Info, the company's flagship BI development environment.
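Custom binning itself is a simple technique: pick the boundary values and assign each number to the bucket it falls in. A quick Python sketch of the idea (my own illustration, not Logi Vision's implementation):

```python
import bisect

def bin_value(value, edges, labels):
    """Assign a numeric value to a custom bin.

    edges: sorted bin boundaries; labels: one more label than there are edges.
    """
    return labels[bisect.bisect_left(edges, value)]

edges = [100, 500]                      # boundaries between bins
labels = ["Small", "Medium", "Large"]   # one label per resulting bucket
```

Analysts use exactly this kind of grouping to turn raw order amounts or dates into categories that make charts readable.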

Logi Analytics shipped Vision 1.3 this week. For more information, go to

Posted August 26, 2014 10:00 AM
Permalink | No Comments |

If you're a big data trailblazer, one way to monetize your risk is to offer your services to those following in your footsteps. This is exactly what Sears Holdings Corporation has done. In 2012, Sears Holdings formed MetaScale to offer big data consulting and managed services to large companies in any industry, although it has deep expertise in retail.

MetaScale runs a Big Data Center of Excellence with big data experts who help companies install, implement, and manage Hadoop, NoSQL, and other big data tools and systems. Unlike big data vendors that also offer consulting and managed services, MetaScale is vendor neutral. It can piece together components from multiple vendors and Apache projects into an optimal environment based on a client's unique requirements. It also helps customers conduct proofs of concept, customize a solution to their needs, and manage production environments on a 7x24 basis, either at the customer's site or MetaScale's own hosting center.

MetaScale has helped companies build big data solutions to support real-time inventory tracking, customer segmentation, 360-degree customer views, personalized product offers, network analytics, and pricing optimization. One specialty, for which it has several patents pending, is migrating mainframe data and processing to Hadoop.
For more information, see

Posted August 23, 2014 6:24 AM
Permalink | No Comments |
