

Blog: Merv Adrian


Welcome to my BeyeNETWORK blog! Please join me often to share your thoughts and observations on new analytic platforms, BI and data management. I maintain a vendor-focused practice that uses primary research, briefings, case studies, events and other activities that stimulate ideas as a source for commentary on strategy and execution in the marketplace. I believe the emergence of a new class of analytic platforms, along with new data management and advanced analytic tools, heralds a next step in the maturity of information technology, and I'm excited to be present for it. I hope my blog entries will stimulate ideas that will serve both the vendors creating these new solutions and the companies that will improve their business prospects as a result of applying them. Please share your thoughts and input on the topics.

 

 

August 2009 Archives

Perhaps I should have called this piece "Blogger Eats Words." Hewlett-Packard has landed (pun intended) precisely the kind of strategic partnership win I recently suggested it was not positioned for, based on its description of its portfolio in a recent quarterly earnings call. The victory comes exactly where I suggested it needed to: with a services-led approach, leveraging the formidable assets of EDS. In an industry-shaking coup, HP has landed a contract to replace Sabre as the proverbial "airline reservation system" - traditionally, a synonym for "really hard IT stuff" - for American Airlines (more precisely, AMR, the parent company).

It gets better. As Forrester's Henry Harteveldt points out, "Sabre Holdings is not only one of the largest travel technology firms, but one of the largest private-sector transaction processing systems in the world, regardless of industry." Read Henry's piece for an excellent look at many of the details and implications.

Sabre was spun out of American itself; nobody knows its value, its limitations, and its architecture better. Except perhaps EDS, which signed a 10-year deal with Sabre in 2001 to outsource its IT. American has taken a bold step, in the midst of difficult economic times for its industry, to mount a reworking of this core process. Strategic? You bet. Partner? It could hardly be more so. It is intriguing, though, to see HP bidding against a firm for which it provides IT services, and that sort of increased coopetition will become more widespread in an era where firms outsource more and more core processing.

Services firms like EDS and its competitors have traditionally described themselves, especially to the investment and industry analyst communities, in terms of such wins. Learning how to leverage these stories in its corporate messaging will serve HP well as it asserts its strategic value. (It's worth noting that Larry Bissinger, who handles messaging to those communities - and more - for EDS, is one of the folks who have remained with the team after the acquisition. A good move by HP there.)

Linkage to HP's portfolio, if such linkage can be engineered into similar deals, can be used to validate the cross-portfolio synergies. No doubt there will be a great need for, say, PCs, printers, servers, cloud infrastructure, systems management, telecommunications assets, and more in this deal - all of which are strengths HP has in abundance. And the prominent mention of open source componentry may hint at a direction in HP's messaging that could be a foil to other firms' portfolios of expensive software. What about BI? Not so much talk there, but EDS was not the only services acquisition HP has made - Knightsbridge, a firm with a great pedigree in a variety of BI-related areas, became part of HP over two years ago. Perhaps some similar messaging based on that would add some luster to that part of HP's portfolio as well, for this deal and others.

So kudos, HP, and I must say crow doesn't taste quite as bad as I feared it would. I hope that next time your timing will be a hair better - an announcement like this alongside your earnings call would have made a great addition to the atmosphere.


Posted August 31, 2009 5:04 PM
Permalink | No Comments |

It's been a while since Oracle made the series of acquisitions that redrew the map on applications software, and they have been fairly successful there. The broadening of the portfolio created considerable challenges for the rationalization of Oracle's BI strategy, and I recently had the opportunity to sit down with Paul Rodwick and Bill Guilmart, VPs of Product Management, to catch up on the Enterprise Performance Management (EPM) story so far. We analysts are quick to criticize the pace of integration, the level of detail, and the timing of the roadmap from companies with enormous portfolios like Oracle's. Personally, I'm glad I don't have to live every day with the consequences of my brilliant ideas about how to rationalize all those moving parts. (Remember those ads? "We don't do. We just advise.") Paul and Bill must live with theirs, and I was impressed with the clarity and consistency of the model they described to me. It's a good story, with emerging successes in abundance, and the best may be yet to come.

When Oracle acquired Hyperion, the "Kennedy" release was close to delivery, and ultimately it arrived as EPM System 11 in July 2008. Later this year Oracle will release its Fusion edition of the BI family, and this is no small thing - over 40 products have already arrived in the Fusion Middleware 11g wave. Gartner has called it a "milestone" and agrees that it achieved its goals of being "complete, integrated, hot-pluggable, best of breed." Participating in this well regarded family, which includes messaging, portal, business rules, CEP, BPEL process management and many other key components, creates an environment that will support and extend EPM reach in what will be called [deep breath] Oracle Enterprise Performance Management System, Fusion Edition Release 11.1.1. And further releases are in the pipeline for 2010 that will advance the ball considerably.

[Oracle] Essbase, the BI Server from Siebel, and a variety of predictive analytics pieces, including the former Sigma Dynamics (now Real Time Decisions [RTD]), now make up the Foundation layer - which is among the soon-to-be-refreshed elements of the portfolio and is to connect to "all relevant data sources" - including competing ones. Above that layer are two sets of applications, which see the content through a Common Enterprise Information Model. The first set, performance management (strategy management, business planning, profitability management, and of course financial reporting and compliance), is complemented by a growing set of BI applications. The latter include CRM and ERP analytics in 12 categories, and it gets more granular from there.

The devil's in the details, and there are pieces here and there that aren't working perfectly yet, I know - users occasionally explain to me why one that isn't ready yet is a showstopper. But Oracle has been investing and delivering consistently to a clean and compelling model, especially when one steps up to the layer above, where an increasingly consistent set of UIs provides ad hoc analysis, reporting, role-based interactive dashboards, proactive alerts, and Office and Excel integration. Oracle Business Intelligence Suite Enterprise Edition (OBIEE, and did we ever need an acronym more?) had releases last August and in March of this year that increased the integration, and the way forward is clear.

Where the value gets delivered most rapidly to customers, and where Oracle's revenue opportunity grows, will increasingly be in BI applications. And Oracle has just delivered three new ones: Hyperion Profitability and Cost Management, Oracle Loyalty Analytics, and Oracle Project Analytics. Looking at where these BI application offerings are available to the different Oracle enterprise application customer sets, there's still a great deal of white space on the matrix - for example, JD Edwards customers are still waiting for most of the apps to show up for them, although Financial Analytics arrived recently.

Oracle shared some customer data, albeit under NDA. The news appears good, although a grain or two of salt must be taken until the cone of silence is removed and we can talk to more customers. Oracle asserts that migrations to the current version of EPM are complete for a sizable percentage of the customer sites. I believe that geographical distribution is good across NA and Europe, with APAC lagging somewhat (although momentum in Japan for BI seems good). Larger deals are a sizable percentage, and competitive replacements (mostly of SAP) are healthy. Sector distribution is also good, and the addition of dedicated sales resources is clearly having an impact. A progressively more experienced sales force is becoming more and more effective, and Oracle's recent efforts in training will be leveraged as the sales specialists are given more products to sell. Oracle is well known for rewarding success with investment (and higher quotas) and BI and EPM are no exception. The pace of events and their penetration into new geographies is continuing to grow as results demonstrate their value.

I came away from my discussions with the best understanding I have had for some time of the way the pieces fit. Oracle has done an excellent job rationalizing it all - which is a boon to the sales and marketing teams who have to take it to the marketplace, of course. More important, Oracle is making the investments of its customers work together better, and beginning to deliver on the promise of a large portfolio: that together, the pieces deliver more value to the users sooner. There's work to do - in production reporting, advanced visualization and mobile device support, for example. And Oracle needs to counter SAP BusinessObjects' thrust in search as a BI front end. But if history is any guide, the checkbook will open and some surprises will be in view soon.


Posted August 27, 2009 9:42 AM
Permalink | No Comments |

With the August announcement of Vertica Analytic Database 3.5, Vertica is laying claim to leadership of the new ADBMS vendors. With its most recent numbers - several dozen customers are now in production and the company expects to pass 100 this year - the assertion bears thinking about. Driving forward with an aggressive release strategy, Vertica is showing its maturity and increasing ability to challenge the old-school leaders like Teradata and Netezza - but with a software-only strategy. This agility allowed it to offer early support for release 3.5 in quick succession after its last release, with GA scheduled for later this year.

It was only a few months ago that Vertica announced version 3.0, with a large set of significant advances including SQL-99 functions, faster data load speeds, improved security (including SSL client encryption and LDAP/Kerberos/Active Directory integration), and a variety of performance boosts. Keeping the pace up is a powerful way to trumpet leadership, and key announcements made at the TDWI event in San Diego included:

  • FlexStore - a refinement to the column store model that adds column group storage, which can improve performance for frequently used pairs like bid/ask, or for columns with low numbers of unique values that can fit into Vertica's storage block size. Vertica also now claims to optimize the placement of "hot" (most used) data into the fastest physical locations (best I/O), like Netezza and Teradata. As this matures, Vertica expects to get to a rich ILM model with a hierarchy of storage speed from SSD to the slowest disks.
  • MapReduce support, but with a difference. Unlike Greenplum and Aster, which are bringing it into the database itself, Vertica is providing a streaming connection to Hadoop instances (the open source implementation of MapReduce; Vertica is contributing the adapter to the community). This architecture mirrors usage patterns we've seen, and which Vertica asserts its customers have told it they want. One scenario: use your ADBMS to retrieve stored data, pass it to Hadoop for analysis by staff with different skill sets from the typical ADBMS users, and then bring result sets back (a minimal sketch of this round trip follows the list). Separate hardware for the Hadoop sandbox is fairly typical among early adopters today, and via a Cloudera partnership, Vertica can offer a deployment architecture that doesn't break the bank. Curt Monash does his usual excellent summary of Hadoop issues in his blog.
  • IPVS (IP Virtual Server)-based load balancing. For those of us not attuned to Linux kernel stuff: IPVS implements transport-layer load balancing inside the Linux kernel, directing requests for TCP/UDP-based services to real servers and making the services of those real servers appear as a single virtual service on one IP address. Vertica uses round-robin switching here; it must be turned on, but then it's invisible (a hypothetical illustration of the setup also follows the list).
  • Perl and Python support over ODBC. If you know what that means, you'll obviously be happy about it; if not, your programmers will be.
  • New verticals (no pun intended). Vertica is starting to get some traction in retail, which is new (and fertile) ground. As the company hits a $15M run rate or better, continuing growth will require getting into new markets. More broadly, marketing execution is visibly excellent: 40 referenceable customers are already called out on Vertica's website, an impressive total for a company claiming not much more than twice that many; frequent, content-rich briefings for industry analysts; and presence at shows like TDWI - although it was not the best-attended show, Vertica's Dave Menninger told me he was pleased with the leads they did get - while some competitors, notably Greenplum, were not present, leaving a less crowded field as Netezza, ParAccel and Sybase made bids of their own for attention, including announcements. A little applied revenue can make a big difference. [corrected error - Aster was a sponsor and had a booth.]
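The ADBMS-plus-Hadoop round trip Vertica describes is easy to picture in code. Below is a minimal, hypothetical Python sketch of that pattern - pull a working set out of the database over ODBC (the Perl/Python support noted above), hand it to a Hadoop streaming job run by a team with different skills, and bring the results back. The DSN, table, column names, file paths and mapper/reducer scripts are all placeholder assumptions for illustration, not anything Vertica has published.

```python
# Hypothetical sketch of the "ADBMS + Hadoop sandbox" pattern; every name
# below (DSN, table, paths, mapper/reducer) is an assumption, not Vertica's.
import csv
import subprocess
import pyodbc

# 1. Pull a working set out of the analytic database over ODBC.
conn = pyodbc.connect("DSN=VerticaDSN;UID=dbadmin;PWD=secret")
cur = conn.cursor()
cur.execute("SELECT user_id, url, event_ts FROM clicks WHERE event_ts >= ?", "2009-08-01")
with open("clicks_extract.tsv", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    for row in cur:
        writer.writerow(list(row))
conn.close()

# 2. Ship the extract to HDFS and run a streaming job; the streaming jar
#    location varies by Hadoop install, and mapper.py/reducer.py are placeholders.
subprocess.run(["hadoop", "fs", "-put", "clicks_extract.tsv", "/staging/clicks_extract.tsv"], check=True)
subprocess.run([
    "hadoop", "jar", "/usr/lib/hadoop/contrib/streaming/hadoop-streaming.jar",
    "-input", "/staging/clicks_extract.tsv",
    "-output", "/results/click_sessions",
    "-mapper", "mapper.py",
    "-reducer", "reducer.py",
], check=True)

# 3. Bring the result set back for reloading into the ADBMS (e.g. via COPY).
subprocess.run(["hadoop", "fs", "-getmerge", "/results/click_sessions", "click_sessions.tsv"], check=True)
```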
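For the IPVS item, here is an equally hypothetical illustration of what round-robin, kernel-level load balancing looks like when configured by hand with ipvsadm (wrapped in Python only to keep the sketches in one language). The virtual IP and node addresses are invented, the database port is assumed, and this is not Vertica's actual setup procedure; the point is simply that clients see one address while requests rotate across the real nodes.

```python
# Hypothetical ipvsadm setup for round-robin load balancing across three nodes.
# Addresses are placeholders; port 5433 is assumed for the database listener.
import subprocess

VIP = "10.0.0.100:5433"                                   # single virtual address clients use
NODES = ["10.0.0.11:5433", "10.0.0.12:5433", "10.0.0.13:5433"]

# Add a virtual TCP service with the round-robin ("rr") scheduler...
subprocess.run(["ipvsadm", "-A", "-t", VIP, "-s", "rr"], check=True)
# ...then register each real server behind it (here using NAT/masquerading).
for node in NODES:
    subprocess.run(["ipvsadm", "-a", "-t", VIP, "-r", node, "-m"], check=True)
```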

Vertica continues to talk about pricing in terms of cost per terabyte. The market is still sorting various models out, but this is a useful one, because it allows one price: development, test and production all come in at the same number. Costs are more predictable because change occurs in sync with data growth, not in jumps when newer hardware is needed - there are no new license fees for new nodes. And if you've ever suddenly found yourself in a different price category under processor-based pricing with other products, you'll appreciate this approach. In these times of explosive data growth, it certainly won't hurt Vertica's revenue stream; at the same time, it has an intuitive feel that makes sense to buyers.
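To see why buyers find the per-terabyte model predictable, consider a toy comparison. All prices and capacities below are made-up illustration values, not Vertica's or anyone else's list prices: a per-terabyte license grows smoothly with data, while per-node or per-processor licensing jumps each time another box is needed.

```python
# Toy comparison of per-terabyte vs. per-node licensing as data grows.
# Every number here is an invented assumption used only for illustration.
PRICE_PER_TB = 20_000      # flat software price per TB of user data (assumed)
PRICE_PER_NODE = 60_000    # competing per-node license price (assumed)
TB_PER_NODE = 4            # user data one node comfortably holds (assumed)

for user_data_tb in (2, 8, 16, 32):
    per_tb_cost = PRICE_PER_TB * user_data_tb
    nodes_needed = -(-user_data_tb // TB_PER_NODE)        # ceiling division
    per_node_cost = PRICE_PER_NODE * nodes_needed
    print(f"{user_data_tb:>3} TB: per-TB ${per_tb_cost:,} vs. per-node ${per_node_cost:,}")
```

The per-TB column scales in step with data; the per-node column stays flat and then lurches upward whenever the data crosses a hardware boundary - exactly the unpredictability described above.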

I've said before that the next 18 months or so will be an exciting battle among an array of new ADBMS players. Vertica is claiming pole position, and they have excellent prospects in the battle ahead. Stay tuned.


Posted August 23, 2009 7:41 PM
Permalink | No Comments |

The land rush into the SaaS analytics space continues; Danish startup Youcalc is seeing solid results from its December 2008 commercial launch. Its value proposition: create custom analytics applications on live data from SaaS systems. Rasmus Madsen and Henrik Kjaer co-founded Youcalc with the idea that a community-based approach to creating analytics applications and sharing them in the SaaS world would unleash creativity within well-defined communities like salesforce.com's AppExchange, SugarCRM customers, and users of Google Analytics and Google AdWords. Joining Birst, Cloud9, Gooddata, Pivotlink and others, Youcalc has made good progress with a 30-day free trial and minimal traditional marketing. The idea is that users will treat the product as a platform, creating and sharing a "vast library of ready-to-use, yet customizable analytics apps." Will this pared-down approach and community model help avoid the issues that led to the failure of LucidEra? We'll see.

There's a reason Youcalc doesn't have much marketing budget - it's a very early-stage firm with no funding and only a few paying customers from the 3000 who had installed it at the end of the second quarter. Rasmus Madsen tells me,

"Our marketing budget is currently 20 dollars per day - it's an ad word. Everything else is viral."

And revenue numbers will take some time to ramp: Youcalc only charges $19.95 per user per month. There are already over 130 applications for the products listed above, as well as Basecamp and Highrise. Youcalc targets small and medium firms whose budgets are hard to crack open; Madsen believes 20 seats is the sweet spot for SaaS in small to midsize businesses. Applications available so far offer analytics for sales tasks including forecasting, pipeline management, and salesforce performance, not unlike the focus LucidEra had arrived at before it went under. But Youcalc is also targeting the marketing organization with campaign ROI, e-mail marketing, Web analytics, and AdWords performance. And the services side of the house is also a target, with analytics for projects, tickets/incidents, and productivity.

Youcalc has a separate development environment where customers create and then deploy their apps by publishing them as a desktop application to the Youcalc server. It feels like a desktop app, but you're just seeing the UI. The app has a live connection to the products noted above, and more will be added. Embed them inside, say, salesforce - just click a tab, and you're off to the races. It's not "installed." Apps can also be added to iGoogle, and seen on iPhone. Youcalc works on a transient model: "slurp it into our memory, work with it, and it goes away when you're done," Madsen says. It's a real-time, not data warehouse-based approach. "We are not talking about terabytes of data yet," Madsen says. "CRM systems aren't that big yet, especially for the companies we're targeting." One benefit: everyone gets the newest version of an app even if they have their own copy of it.

Youcalc has a demo account with salesforce.com, so there is data available to play with right away. If you are a salesforce customer already, you can be looking at your own data within minutes. For some kinds of analysis, more than one source is needed - for example, Google AdWords would have its own data, while revenue information might be in salesforce. To measure ROI you'd need to mash them up. Youcalc claims it can do that, and to have two apps that do so already; I did not verify this. But "real data integration" is out of scope here; prospects will have to decide whether the limits are a stopper for them. David Raab does a nice job describing Youcalc in some more detail on his blog, and looks at some of the other constraints. There is also a brief product tour on Youcalc's site that provides a good sense of its features.
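The ROI mashup is conceptually simple, even if the plumbing isn't. Here is a hypothetical sketch of the calculation: join ad spend from one SaaS source with closed-won revenue from another, keyed on campaign name. The dictionaries stand in for live pulls from Google AdWords and salesforce.com, and every field name and number is invented for illustration - this is not Youcalc's implementation.

```python
# Toy campaign-ROI mashup across two SaaS sources; all data is invented.
adwords_spend = {"summer_promo": 1_200.0, "webinar_q3": 450.0}     # campaign -> ad spend ($)
sfdc_won_revenue = {"summer_promo": 9_800.0, "webinar_q3": 300.0}  # campaign -> closed-won revenue ($)

for campaign in sorted(set(adwords_spend) | set(sfdc_won_revenue)):
    spend = adwords_spend.get(campaign, 0.0)
    revenue = sfdc_won_revenue.get(campaign, 0.0)
    roi = (revenue - spend) / spend if spend else float("nan")
    print(f"{campaign}: spend ${spend:,.0f}, revenue ${revenue:,.0f}, ROI {roi:.0%}")
```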

Today the developed apps (not the data, of course) are shared with other users by default. The community features you see in YouTube and elsewhere are here - a rating system and tagging to help other users find useful apps. Madsen tells me that some clients are asking for models to be kept private, and Youcalc will introduce private workspaces. Still, the community proposition is clearly a key in Youcalc's thinking. The company is keeping investment conservative thus far to avoid melting down before it gets off the ground. Time will tell whether this strategy, and the reliance on community, will prove successful. But it's worth a look, and the price is low.


Posted August 17, 2009 9:25 AM
Permalink | No Comments |

Few events offer as much promise as The Data Warehousing Institute's (TDWI) World Conferences. With a deep educational focus, TDWI provides important opportunities for users. For vendors, the event offers one of the most focused, serious prospect audiences possible. My expectations, tempered though they were by economic realities, were still fairly high for this year's San Diego event. Unfortunately, the drop in volume was greater than all of us expected, the number of announcements from the vendor community was low, and the content focus seemed a bit out of date.

Some of the observers I spoke to were particularly down about the attendance, calling it "devastating." Some sessions (even all-day ones, typically a mainstay) were very thinly attended - single digits - and one wonders how the presenters felt about the size of their audiences. That said, my conversations with the attendees elicited very favorable comparisons to the Gartner BI event, which several people said doesn't offer enough actionable, hands-on content.

A real high point for me was Cindi Howson's "Developing Your BI Tool Strategy and BI Bake Off" session. Cindi's BI Scorecard site is a great resource for deep product evaluations, and the session didn't disappoint. Those who subscribe get a deep, hands-on look at feature and function, and in the TDWI session Cindi talked about a framework for assessing several aspects of BI offerings, as well as a way to weight the importance of various facets of what has become a bewilderingly rich array of details to consider. As someone with a good deal of experience using a similar method - the Forrester Wave - I was impressed by the richness and depth of the work. A few quibbles I had with the content - such as the assertion that SAP "has no database" - paled by comparison to the overall richness and quality.
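The weighting idea is worth making concrete. Here is a toy version of that kind of scorecard - rate each facet, weight it by how much it matters to your organization, and sum - with facets, weights, vendors and scores all invented for illustration; it is not Cindi's framework or the Forrester Wave, just the general mechanism.

```python
# Toy weighted scorecard for comparing BI tools; all values are invented.
weights = {"query_and_reporting": 0.30, "olap": 0.20, "dashboards": 0.25,
           "administration": 0.15, "predictive": 0.10}   # should sum to 1.0

scores = {  # 0-5 ratings per facet for two hypothetical vendors
    "Vendor A": {"query_and_reporting": 4, "olap": 4, "dashboards": 5,
                 "administration": 3, "predictive": 2},
    "Vendor B": {"query_and_reporting": 3, "olap": 5, "dashboards": 3,
                 "administration": 4, "predictive": 5},
}

for vendor, rating in scores.items():
    total = sum(weights[facet] * rating[facet] for facet in weights)
    print(f"{vendor}: weighted score {total:.2f} out of 5")
```

Change the weights to reflect your own priorities and the ranking can flip - which is exactly why a one-size-fits-all "winner" is less useful than the framework itself.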

The second part of the session was the well-known "bakeoff." Three vendors are put through their paces responding to a specific set of scenarios and tasks, and the audience votes. IBM Cognos, Oracle and SAS were the participants for this go-round, and Cognos repeatedly came out on top as we looked at query, reporting, OLAP, architecture, dashboarding, administration and predictive analytics. Most features were not well suited to show SAS off, so it was no surprise that they did not do well until the predictive analytics part of the exercise - which they dominated with 100% of the votes. The presenter counts for a lot, and Oracle might have done better with two presenters, as its competitors had, allowing a split between the technical and the marketing pitch that lets each focus on their own job. Oracle's patter was a bit too glib and the demos a bit too thin, and their results were thus a bit lower than I expected based on the actual products, which did not show as well here as they could have.

The exhibit hall floor looked sparsely populated to me. But my overall impression was somewhat more sanguine on the floor, buoyed in part by comments from executives in several of the vendor booths. "We were pleased," said Vertica's Dave Menninger. "The people we spoke to had real projects they expected to do soon. Lower quantity, but good quality." I heard similar sentiments from Kickfire, Netezza, and others. For these smaller players, TDWI is a very valuable opportunity for exposure and they make the most of it. And deals are done between vendors as well; several discussed partner conversations they started - or completed - here.

TDWI is a great opportunity for analysts to meet with vendors and get multiple briefings in a short time. I had excellent conversations with SAS, ParAccel, Lyzasoft, Kognitio, and friends and colleagues like Claudia Imhoff, Mark Madsen, Neil Raden, John Myers, Ron Powell, Shawn Rogers and others. I'll blog further about some of the vendor conversations. I might have already done so, but I was slowed a bit by a disappointing press room, which had no phones, no connectivity, no power strips, few press releases, no beverages or snacks for the hard-working scribes, and was even closed down on Thursday without notice, surprising at least one vendor/analyst pair when the occupants gruffly advised them to "shut the door and stop interrupting our meeting." If this is typical, it's no surprise there was so little participation by the large analyst firms in a week where there was little significant "competition."

On the fourth day, Dave Wells, formerly head of education for TDWI, delivered a morning "closing keynote" (although the event continued for another two days, the exhibit hall closed on the third day). Wells pointed out that the economic cycle lags market indicators like the stock market, and that the "uncertainty curve" is an inverse function. The questions now are "what if" questions, he pointed out, but organizations are using old metrics because they don't know what to change. Today, in this climate, he said,

"Even good decisions aren't  good enough. Yesterday's scorecards are  insufficient; we need nearer term data, influencers and scenarios. Move from goals-measures-monitor to model-simulate-feedback. I need more than insight; I need predictive analytics."

Wells was correct in all this, but two problems were apparent. First, he himself offered no suggestions about how to get there - vendors, first steps, priorities - all were missing. Second, the content of the event, while it had some significant offerings pointing in this direction, did not raise this issue to the top. TDWI has a great opportunity to wave the flag here for the next steps in BI, but it is still focusing a great deal on the last set of issues. Predictive analytics, semantic technologies (as Neil Raden pointed out in conversation) and truly active technologies are needed. Will TDWI step up and be the advocate? Time will tell; it's a mainstay and it can, and should, do better.


Posted August 10, 2009 5:39 PM
Permalink | 2 Comments |

When is an appliance not an appliance? When it's more. On July 28, IBM's Software Group and Systems and Technology Group (i.e., the hardware folks) hosted an analyst event to introduce the Smart Analytics System. The discussion began with a series of conversations about the value of "workload optimization," or the effective tuning of processors, storage, memory and network components with software used for information management. Not controversial, but hardly news. IBM claims to be raising the bar, though, with the promise of a system that is already tuned, and attuned to the needs of its purchaser, at a level far beyond appliances that other vendors have delivered: appliances, if you will, not only predesigned for specific use cases, but customized for specific instances of those use cases. It's no accident that IBM never called the Smart Analytics System an "appliance." Extending the Smart brand here is a powerful move, and IBM appears poised to make good on its promise.

Curiously, the speakers provided no specifics about the offering(s) until the final 10 minutes of the scheduled time during Q&A, when a direct question from Doug Henschen at last elicited some detail from Arvind Krishna, VP Enterprise Information Products. (Krishna was on the podium because Ambuj Goyal, the GM of Information Management Software, who was supposed to be there, was conducting a session on IBM's acquisition of SPSS. M&A is tricky - you go with it when you go with it - and the timing was an unfortunate bit of luck, as coverage of the acquisition has overshadowed the event we were there for; spending $1.2B tends to get people's attention.)

Krishna said that IBM will release the Analytic Server(s) for immediate availability in September. Although the specifications are not in the press materials, the presentations given out, or the first level or two of content on IBM's web site, the details appear to be as follows:

  • Hardware - apparently a Power System 550 running AIX 6.1, with some memory (an unspecified amount) and IBM DS5300 storage arrays (again, amount unspecified). Steve Mills, head of IBM SW Group, offered that the Analytic Server, available in 6 sizes, is "non-decomposable," which we take to mean that in-place upgrades of memory or storage are not possible unless the customer moves to a different size (model); it's also not clear whether it is possible to expand beyond the largest size.
  • Software - presumably DB2, InfoSphere Warehouse, and Cognos, with database compression (likely the compression already shipping in DB2 9.7), analysis, dashboards and reporting. Cubing Services and OLAP were mentioned, as were Data Mining and Unstructured Information Analytics, some of which are options in the underlying product families; it's not clear which are options here. ETL was not mentioned, so it's not clear whether it is part of the package. Also discussed was Advanced Workload Management - which was mentioned in the context of consolidating warehouses and marts, though it's not clear if there is a facility for rearchitecting a new, larger warehouse from existing ones or for federating those in place through some sort of metadata layer. Application modules for verticals will also be available, once again leveraging the IP-as-an-asset model IBM SWG has been sharpening with its partners in the Services and Research organizations.
  • Services - these were unspecified, except to say that they will create an "analytics ready" delivery, and take the labor out of integration, optimization, and data restructuring for performance. Beyond appliance-like installation and pre-integration, this presumably involves customized work for each buyer to assess and optimize their data; no time or number of people involved was specified. It's not clear if this will be completely variable and priced accordingly, or if some fixed number of person-hours comes with each size.
  • Network clustering from an unnamed partner was also mentioned; at this time, we have no details regarding this component.

Support for everything is included, as well as periodic "health checks" to ensure that software and hardware optimization remain up to date with changes in data composition, usage and distribution over time. More than an appliance? Unquestionably.

"This is not just bundling," Steve Mills said. "It's top to bottom understanding and optimization."

Fair enough; IBM has drawn a powerful line in the silicon. But as an announcement, the story falls short - model numbers, capacities, specific software components, and go-to-market details are conspicuously absent. Most important, prices and how they will vary with size were not discussed, and although capacity addition was touted, it's not clear how it will work.

With Netezza now promising less than $20K per terabyte of user data, and other new ADBMS vendors aggressively entering the market, IBM must be careful to make its value proposition more explicit. The claims of a 3x speed increase, and a 50% reduction in storage, are commonplace in the ADBMS marketplace today. Mills' point that "Labor cost and time are the limiting factors" opens a new dimension in the discussion. The promise of cutting those elements post-install by at least half is a powerful value proposition. The tough part for IBM will be proving it, and convincing prospects that the total cost is thus in line with the market - or better.

IBM also announced the Smart Analytics Optimizer, an add-on solution that will attach to a System z to boost analytic performance. It will exploit in-memory techniques (in response to a question, SSD was acknowledged as at least part of this), vector processing, and parallel evaluation of query predicates. Again, details were difficult to come by, but mainframe users would be well served by such an offering.

What does this mean for users? A faster time to value, assurance that their system will perform near the peak of its theoretical capability, and ongoing management to ensure that the system does not drift away from excellence as inevitable scope creep, data creep, and inertia take their toll. IBM had some impressive user stories to tell, but of course those stories were about the pieces that will go into the new offering. The promised synergies were not yet referenceable, and will not be until some buyers take the Smart Analytics Systems for a spin and find out just how smart they are.


Posted August 6, 2009 8:21 PM
Permalink | No Comments |