
Today’s “Analytic Applications” – Misnamed and Mistargeted
Analytic Business Applications Will Rule in 2015

Originally published April 21, 2009

To this observer, “analytic applications” ought to be defined as those that use analysis to deliver business functionality. Today’s so-called analytic applications, by contrast, merely deliver analysis. They do so in a variety of forms: reports, dashboards, visualizations of various kinds. They may be interactive or not. They may be on fixed or mobile platforms. They may be scheduled, ad hoc or delivered by subscription. But the point is this: what they deliver is analysis (or “analytics” in our often confused industry’s parlance).

For “packaged” (i.e., substantially pre-built, “off the rack”) business software, the promise of the future is truly analytic business applications: software packages that execute automated business processes, with or without human intervention, based on policies, rules and real-time analytic results. These applications will source data from a panoply of inputs: from databases and file systems to unstructured content, from human data entry to transaction processing devices like “cash registers” (quaint notion, that), from environmental probes like monitoring devices in oil wells and hospitals to searches of cyberspace. Some of the processes will write to their own persistent stores (databases) and some will bestride ongoing streams of information (events). The applications will use analysis of historical and incoming data within a rich business context that includes history, policy, industry norms and standards, regulatory and compliance issues, and other information. The embedded analysis will shape a variable sequence of processing steps, within defined boundaries, and raise exceptions when those boundaries are broken.
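
To make that pattern concrete, here is a minimal sketch in Python – with entirely hypothetical names, rules and thresholds, drawn from no product described here – of a single embedded-analysis step: analysis of incoming data chooses the next processing action within policy-defined boundaries, and raises an exception when a boundary is broken.

    # A minimal, illustrative sketch: every name, rule and threshold below is
    # hypothetical, invented for this example.
    from dataclasses import dataclass

    @dataclass
    class Policy:
        # Business context: the boundaries within which the process may act alone.
        reorder_point: int   # replenish automatically when stock falls below this
        max_auto_order: int  # largest order the software may place unattended

    def handle_inventory_event(on_hand: int, forecast_demand: int, policy: Policy) -> str:
        """Embedded analysis shapes the next processing step; boundary breaks escalate."""
        shortfall = max(0, forecast_demand - on_hand)
        if on_hand >= policy.reorder_point:
            return "no action"                      # analysis finds the process healthy
        if shortfall <= policy.max_auto_order:
            return f"auto-order {shortfall} units"  # automated step, no human needed
        # Boundary broken: raise an exception for human review.
        return f"escalate: shortfall of {shortfall} exceeds the auto-order limit"

    # Example: demand outruns stock, but within the policy's automated bounds.
    print(handle_inventory_event(on_hand=40, forecast_demand=90,
                                 policy=Policy(reorder_point=100, max_auto_order=75)))

The toy logic matters less than its shape: the policy supplies business context, analysis of the incoming data selects the step, and human intervention is reserved for boundary breaks.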

To understand the marketplace impacts of this transition, which is already underway, we need to look at a few key market vectors:

  • The metamorphoses of business intelligence (BI) firms and their portfolios to add applications;

  • The evolution of packaged applications firms into finer-grained business processes with added context (and acquired BI); and

  • The entry of at least one very competent mega-vendor who has hitherto stayed out of the fray.

We begin with why business intelligence has perpetually fallen short – yet still grows at steady rates, in good times and bad.

Business Intelligence Looks For Users – “There’s No Success like Failure…”

Bob Dylan went on to say “…and failure’s no success at all.” He could have been talking about business intelligence. The BI community has perpetually promised to put tools into the hands of business users who would do their own analysis. And after decades, today’s vendors are still trying to do so. They still succeed (the market grows) … and fail (it’s still the power users) every year. More (usually redundant) product is sold every year into organizations that are trying to get meaningful analysis of their business to help make better decisions. What Wayne Eckerson of The Data Warehousing Institute (TDWI) calls “spreadmarts” have sprung up, often beginning with (frighteningly ungoverned) spreadsheets but growing to encompass data stores and tools designed to provide a context-specific analytic environment for business use.

Think of this another way. Imagine every BI demo you’ve ever seen – the story is always the same. You examine the results of one or more business processes, find something wrong, and watch as a variety of drilldowns, visualizations and displays close in on what’s making that KPI [wrong/red/too low/too high]. At the end, the presenter triumphantly announces that “now we can [update the forecast/call the warehouse and have some widgets shipped/change the terms for that supplier].” Success! We have, once again, perfectly predicted the past.

Three things are wrong here:

BI tools don’t provide business context out of the box. They don’t provide guidance on a good baseline for profit margin in your industry, or what an acceptable (or unacceptable) shipping schedule is, or what constitutes a suspicious pattern of activity in a trading account. Demos never show you what it took to understand context – those business drivers, constraints and goals – because that is “on the other side of the glass,” in the head of the user. And rarely is it stored in some reusable, portable fashion, even implicitly (a sketch of what that might look like follows the third point below).

Even if you understand the internal business, fixed solutions don’t adapt. Early attempts at contextual definitions – “universes,” pre-designed sets of data, measurements, displays, reports and guidelines – froze analysis in place. Truly out-of-the-box thinking still must be translated for specialists who have mastered the tools. The specialists build the new data stores, design the new feeders, scrubbers and translators, build new display metaphors – and buy new tools.

We’re looking for business results, but being sold tools. Home Depot doesn’t display aisles of improved homes – parts of rooms, perhaps. Buyers know that a few pipes and a sink don’t make a bathroom. Without context and training, buyers leave most tools to craftsmen and contractors. Contractors make money because most average Joes have neither the time nor the interest to master the tools. But BI demos imply that a pivot table or a scorecard is a business result, even though none of the tools drive actions based on their analysis.
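
What would storing that context in a reusable, portable fashion look like? Here is a minimal sketch of the first point above – assuming an invented industry, invented baselines and invented field names – with business context captured as data any tool could load, rather than knowledge locked in a user’s head:

    # A hypothetical illustration of business context captured as portable data
    # rather than left "on the other side of the glass." The industry, baselines
    # and field names are all invented for the example.
    import json

    context = {
        "industry": "specialty retail",                     # scope of the baselines
        "gross_margin_range": {"low": 0.32, "high": 0.45},  # acceptable band
        "max_ship_days": 5,                                 # acceptable schedule
    }

    def margin_out_of_bounds(gross_margin: float, ctx: dict) -> bool:
        """Flag a KPI by comparing it against the stored, reusable context."""
        band = ctx["gross_margin_range"]
        return not (band["low"] <= gross_margin <= band["high"])

    print(margin_out_of_bounds(0.28, context))  # True: below the stored baseline
    print(json.dumps(context))                  # serializable, hence portable across tools

Once context lives in a structure like this, the same baselines could travel from a dashboard to a rule engine to a packaged application – which is precisely what today’s tools do not offer out of the box.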

In the face of these challenges, BI tool vendors have innovated at a furious pace. They have added new, separately priced offerings (if you can’t monetize it, it’s hard to fund development) by internal development or acquisition: text analytics, advanced visualization, data mining. More tools for specialists. They have created new revenue streams with these extensions of existing analytical thinking and offered new insights. But it’s not enough.

So, for the last few years, the larger remaining independent BI vendors have begun to build applications (often called “solutions”). For example, IT Market Strategy believes that independent BI market leader SAS, which passed the $2B revenue mark in 2007, now gets 60-65% of its product revenue from things other than “tools.” Few other players can aspire to its reach, but many are pursuing similar approaches in market niches.

Packaged Apps Vendors – “See a World in a Grain of Sand”

William Blake’s Auguries of Innocence was not about business applications – he was never much of a businessman and computers were over a century away. But to stretch his “grain of sand” image a bit, early packaged applications were so “coarse-grained” they were more like beaches, and fit in as much of the world as they could. Consider one of the most successful: SAP ERP. Its ambition was huge: coordinate the resources, information, and activities needed to run virtually the entire business. Within those massive monolithic installations were buried more “fine-grained” processes such as order fulfillment or billing. The intent was to leverage synergies, eliminate redundancies, and solve all problems in one fell swoop. It cost millions – and that was just the start, because everything needed to be customized, and at much greater cost than the software itself.

As time went on, other “large-grained” business processes got their own massive packages: CRM, HR, and others. Sometimes, some of the beach was shared with ERP – but the subdivision (granularity) didn’t seem to reduce the complexity much, or the cost of customization. Finer-grained ones emerged, like “supply chain management.” And still, we needed to get even finer and deeper, as it became evident that, like BI products, business process products need context.

So we got finer still, and then wrapped the grains with pearl-like wisdom about industry issues, resulting in more purpose-built products like, for example, “logistics management for hazardous materials delivery in the chemicals industry.” Each offering got closer to a specific business process and added the dimension of specific industry business process knowledge (presumably cognizant of regulatory and other constraints unique to that industry). This “verticalization” enabled apps vendors to enter more markets with specialized offerings, while reusing the software stacks – “platforms” – they had built, and in the process they began to codify the intellectual property (IP) they developed that provided the industry context.

And still, that was not enough. To get to the new smarter business apps of the future, analytics needed to be embedded. So Oracle and SAP both grew business intelligence within their portfolios over time – but it wasn’t enough. Ultimately, Oracle did something it had done before with its Hyperion acquisition, and SAP broke its typical pattern dramatically when it acquired Business Objects – two leading BI players added to raise their games for the next stage of market development. And now we await their “smarter” applications.

“Putting It Together” – IBM Does Sondheim

For years, IBM’s software business has grown steadily. Recently it has outpaced the hardware revenue stream on which the giant IT vendor built its name. And Senior Vice President and group executive Steve Mills has run that business like painter Georges Seurat in Sondheim’s “Sunday in the Park With George,” filling in a rich, complex scene with tiny dots (and occasionally bigger ones). He has steadily built IBM’s portfolio from databases, middleware, systems management, developer tools and user environments, clients and servers, front office and back office, until IBM had a leading or competitive product across virtually the entire stack needed to compose almost any solution.

Through those years, if asked about its intentions in the packaged applications business, from which it was largely absent, IBM’s response was always the same: “We’re not going there. We make a great deal of money selling products to and with our applications partners like SAP and we won’t compete with them.” It was true, and continues to be – IBM has a partner program with few peers, and invests enormously in supporting partners large and small – even funding their entry into markets.

But another strategic move for IBM came outside of the software and hardware sectors. Along the way, IBM made strategic investments and commitments to the services business – and now it dwarfs most other firms in that space. And it has applied its skills to codifying and reusing the IP it develops in customer engagements, just as it has built reusability and portability into its software componentry. That formidable asset is now at the disposal of IBM’s field army and partners, to be composed together with all the other portfolio assets from the software and hardware teams to create custom-built applications that begin their lives much closer to off-the-rack than bespoke.

One more element was needed. Like its apps competitors – Oracle, after all, also sells databases (among other things), and SAP sells middleware (although not very successfully) – IBM went big and acquired a leader in business intelligence. With Cognos under its belt and being integrated into the portfolio – a skill IBM has been honing under Mills as it has rationalized a huge collection of sometimes overlapping products – new opportunities present themselves.

IBM has assembled the components to compete with the application package vendors with added business intelligence, and with the BI vendors building applications, for the new analytic business applications. And it has been quite clear that it understands the shape of the business applications of the future, and plans to supply them to its customers. The battle will be fierce, fought on many fronts, and lengthy. All the major players bring formidable portfolios, sales and services organizations, and deep pockets.

Who Will Win?

Ultimately, the contest is likely to be fought on economics. Few micro-sectors are large enough to support a complex, globally distributed and supported solution that is highly specific – there simply are not enough customers for each individual solution. But when many of the components are pulled from a sizable inventory and assembled with some (paid) services added, scale (applied effectively) wins.

The platform wars of the past decade have reduced the number of meaningful software stacks to a handful: IBM, Microsoft, Oracle, perhaps SAP. The stack owners are positioning themselves to build on their advantage, adding the business process, industry expertise and services from captive organizations. Their ability to work through partners will also be significant in projecting themselves into emerging markets. Some remaining independents like SAS or Lawson will rise or fall on their ability to reuse, and add value above, these stacks.

There is a wild card: open source software components, cloud infrastructure, and publicly available IP. Social computing and shared, open source development projects could elevate this final scenario dramatically in the next 5 years. Can anyone else rise from the pack to challenge the behemoths? At this point, it appears unlikely; achieving adequate scale will be difficult. Rising stars may appear in sectors (processes and industries) the big players don’t prioritize, and they will be acquired as soon as they prove their value. Still, if history is any guide, someone will surprise us. Fasten your seatbelt; it’s going to be a bumpy ride.

  • Merv Adrian

    Merv, now a Vice President at Gartner, was Principal at IT Market Strategy when this was written. He has spent three decades in the information technology industry. As Senior Vice President at Forrester Research, he was responsible for all of Forrester’s technology research for several years, before returning to his roots as an analyst covering the software industry and launching Forrester’s well-regarded practice in Analyst Relations. Prior to his Forrester role, Merv was Vice President and Research Manager with responsibility for the West Coast staff at Giga Information Group. Merv focused on facilitating collaborative research among analysts, and served as executive editor of the monthly Research Digest and weekly GigaFlash. He chaired the GigaWorld conference (and later Forrester IT Forum) for several years, and led the jam band, a popular part of those events, as a guitarist and singer.

    Prior to becoming a technology analyst, Merv was Senior Director, Strategic Marketing at Sybase, where he also worked as director of marketing for data warehousing and director of analyst relations. Prior to Sybase, Merv served as a marketing manager at Information Builders, where he founded and edited a technical journal and a marketing quarterly, subsequently becoming involved in corporate and product marketing and launching a formal AR role.

    Before entering the IT industry, Merv spent a decade building systems in the securities, banking and transportation industries in New York, including several years as a manager of end user computing at Shearson Lehman Brothers and a stint as a statistical analyst at the Federal Reserve Bank of New York. His early analysis of the micro-to-mainframe market and its impact on decision support, The Workstation Data Link, was published by McGraw-Hill in 1988.

    Merv was a member of the Advisory Board of the International Data Warehouse Association in its formative years, and served as editor of the NY PC User Group Newsletter in the mid-‘80s. He holds a B.S. in business administration (finance) from CUNY’s Baruch College.
