

Blog: Merv Adrian


Welcome to my BeyeNETWORK blog! Please join me often to share your thoughts and observations on new analytic platforms, BI and data management. I maintain a vendor-focused practice that uses primary research, briefings, case studies, events and other activities as a source for commentary on strategy and execution in the marketplace. I believe a new class of analytic platforms, together with emerging data management and advanced tools, heralds a next step in the maturity of information technology, and I'm excited to be present for it. I hope my blog entries will stimulate ideas that serve both the vendors creating these new solutions and the companies that will improve their business prospects by applying them. Please share your thoughts and input on the topics.


July 2009 Archives

Not everyone in the software industry is suffering. Informatica Q2 revenues were $117.3 million, up 3% year over year, and license revenues for the second quarter were $48.7 million, relatively flat. That makes 19 quarters in a row - very impressive. Informatica added 65 customers in the quarter and now claims nearly 3800, with wins in multiple geographies. 

Informatica continues to be well regarded by analysts - Gartner, for example, made it a Leader in the latest Magic Quadrant report. It has worked steadily to add features with a clear and clearly articulated vision and roadmap. The recent acquisition of AddressDoctor enhances its data quality capabilities, offering upside opportunities in its base as well as an ever-broadening value proposition. Today's cloud computing announcement moves PowerCenter into the Amazon EC2 cloud, ticking that checkbox as well.

Clearly, the strategic thrust is working; 55% of Informatica's 100K-plus deals in Q2 were in applications "beyond data warehousing," says CEO Sohaib Abbasi, validating the portfolio approach Informatica has taken, and an example win over IBM and SAP Business Objects cited in the investor call drove that story home. Breadth sustains the books in difficult times: although its consulting revenues are down, Informatica gained substantially in maintenance revenue. In reaping the 95% maintenance renewal rate annuity of its still-growing base and increasing deal size, Informatica continued to set itself up for the future - 9 deals in the quarter were over a million dollars. These large deals drive increasing maintenance revenues for the future as well. Global reach helps too: Latin America and Asia are still growing well enough to be touted as offsetting unfavorable currency impacts.

Informatica claims that it is "not feeling price pressure," despite competition from open source and other vendors. The company has been able not only to grow revenue, but also to increase margins. While part of this is clearly due to the multiple offerings in the portfolio, it is enhanced by the degree to which they are synergistic with one another. Not only new (and larger-deal) business, but upside in the installed base, results from this synergy.

Informatica has an increasingly competitive road ahead in light of the investments being made by IBM and more recently Oracle. But it has withstood the heat well so far, managed its expenses and grown margins even as it prepares for a major release at yearend. With an upturn expected, Informatica's prospects appear good. The battles ahead will be high stakes and highly visible, and it can be expected to win (and even grow) its share. Abbasi asserted in today's call that only 5% of its deals compete with Oracle, and he doesn't believe the GoldenGate acquisition will change much - Informatica rarely competed directly with them. That leaves IBM, and there we can expect to see some fireworks. Battling portfolios ahead.


Posted July 30, 2009 4:52 PM

Oracle today announced it is buying GoldenGate Software for an undisclosed sum. GoldenGate may not be a well-known name, except in circles where transactional replication is a hot topic, but after 15 years in business, it had assembled a sizable base of some 500 customers, with 4,000 solutions deployed, and partnerships with vendors beyond Oracle, including names as diverse as Teradata and Ingres on the database side, and MicroStrategy and Amdocs in the app and BI space. Its message revolved around 3 key attributes of its changed-data-based replication technology: heterogeneity, real-time (log-based) performance, and high-volume transactional support (committed transactions only). And despite GoldenGate's notoriously closed-mouthed approach to its finances, it's fair to say that it was generating tens of millions of dollars in revenue yearly (Hoover's says $9.7M in 2007, but I believe that's low). If Oracle invests even modestly to sustain and grow sales, this acquisition should be a substantial win.
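The "committed only" attribute is worth unpacking: a log-based capture process buffers each transaction's changes as it reads them, and delivers them downstream only when it sees the commit record, so rolled-back work never reaches the target. The sketch below illustrates the idea; all names and record shapes are my own invention, not GoldenGate's actual interfaces.

```python
# Hypothetical sketch of committed-only, log-based change capture.
# Changes are buffered per transaction and emitted only on COMMIT;
# a ROLLBACK discards the buffer. Illustrative only.
from collections import defaultdict

def capture_committed(log_records):
    """Yield only changes belonging to committed transactions, in commit order."""
    pending = defaultdict(list)   # txn_id -> buffered change records
    for rec in log_records:
        txn, op = rec["txn"], rec["op"]
        if op == "COMMIT":
            for change in pending.pop(txn, []):
                yield change
        elif op == "ROLLBACK":
            pending.pop(txn, None)  # uncommitted work never leaves the source
        else:
            pending[txn].append(rec)

log = [
    {"txn": 1, "op": "INSERT", "table": "acct", "row": {"id": 7}},
    {"txn": 2, "op": "UPDATE", "table": "acct", "row": {"id": 9}},
    {"txn": 2, "op": "ROLLBACK"},
    {"txn": 1, "op": "COMMIT"},
]
print(list(capture_committed(log)))  # only txn 1's INSERT survives
```

Reading the log rather than the tables is what makes this low-impact on the source system; delivering only committed work is what makes the target trustworthy for reporting.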

At its founding, GoldenGate focused on developing replication technology for high availability (HA), originally pointed at creating live standby copies for Tandem. (Today, the Tandem technology lives on in HP's NeoView database.) Financial customers like US Bank, Bank of America, Chicago Mercantile, and Visa came aboard, and are still clients. GoldenGate went to application vendors on the Nonstop platform, like ATM processing software vendors, and partnered with them to sell the product.

HA is still a key piece of the business; beginning with multi-node high availability for Teradata 4 years ago, GoldenGate continued to enhance that offering; it is being used by a number of customers for active/active configurations. A large network equipment manufacturer recently purchased GoldenGate for that purpose, and although Teradata apparently drove the deal, GoldenGate was brought in to participate actively and expected its relationship with Teradata to continue to drive future opportunities. One can expect that to be an open question for a while. It's not clear how that relationship, or for that matter the HA business, will figure in Oracle's future, but one doubts that helping Teradata was the key value Oracle sought. Where Teradata turns next will be an interesting question.

Another target market is the exploding data warehouse load/real-time reporting space, and GoldenGate had been courting vendors like Teradata, Netezza and Greenplum to become their recommended choice, with some success. And after focusing on financial customers for many years, GoldenGate saw healthcare as a next opportunity. Companies there also want to offload critical data for reporting, and as they begin to demand software targeting SQL Server or Oracle environments, companies like Cerner and GE Healthcare are selling partners in that space today. Telco is also promising. Is the fit right? Will prospects go with replication, or choose ETL? Use cases will dictate, and there's room for both. The likelihood that buyers will choose Oracle to ship data when other database vendors are the target platform may be minimal, but there are ample opportunities within the already vast Oracle base for new projects. And when Oracle is the source, the value proposition will be attractive.

Oracle's new acquisition thus has a well-understood competitive field to operate in. GoldenGate had already claimed some success positioning against IBM. Despite IBM's acquisition of DataMirror in July 2007 for changed-data capture, the multiple IBM replication offerings are badly in need of more rationalized, coherent messaging. There are numerous IBM products - for z/OS, InfoSphere, WebSphere, IMS, DB2 Propagator - inside or outside IBM Information Server. Oracle will likely follow its aggressive, in-your-face marketing strategy to go after IBM soon, and the fireworks should be interesting.

Sybase is also a player here, and has new Oracle replication capability. They may have waited too long. GoldenGate already claimed that Sybase tended not to be in the same deals it was in, and that's unsurprising considering the minimal attention Sybase has paid to heterogeneous opportunities. GoldenGate already had some success in Oracle shops, especially around Siebel migrations, and claimed a good relationship with the database team there. Oracle may well now forestall whatever erosion Sybase had hoped to capitalize on when competing against Oracle's older, less effective replication into its own database.

IT Market Strategy believes that GoldenGate has plenty of upside - if Oracle chooses to go after it.  A significant uptick in marketing visibility and a hugely larger sales force will pay rich dividends. The competition has left this market unexploited, and with Oracle's sizable direct sales teams, they can create substantial competitive pressure. We expect a nice bump in this business, and renewed discussion in the months ahead.


Posted July 23, 2009 9:39 AM

Since my last post about Aster, the analytic DBMS (ADBMS) vendor has added another arrow to its quiver. While the company is still focused on large-scale data warehouses and companies that need more power for advanced analytics/queries, its new MapReduce Data Warehouse Appliance Express Edition starts at $50,000, and includes Aster nCluster on Dell hardware and a copy of MicroStrategy BI software for up to 1 TB of user data, which Aster clearly sees as a sweet spot. (MicroStrategy has been doing a lot of seeding with the ADBMSs lately; it also has an introductory bundling deal with Sybase IQ.) Delivering a "compute rich" appliance on commodity hardware, with reduced operating costs, certainly hits all the right notes. Is 1 TB the sweet spot for MapReduce? No, although it makes a great starting point, and that is Aster's real opportunity - give 'em a taste of what SQL plus MapReduce can do, and watch them demand more and more. And sell it to them. Dell and MicroStrategy should love this strategy - if it works.

Even for those smaller data warehouses, query speeds will clearly improve. Lower first costs and ease of setup and administration - lowering both capital and operating expense - will lower the barriers to entry. $50K is a far cry from the half-million it can cost to get into other appliances from the "big boys." Once the value is proved, stepping up to Aster's Enterprise Edition, which it claims will scale to the petabyte range, may be easier to take.

Aster now has three ways to deliver nCluster software:

  1. software only
  2. the cloud (via Amazon and AppNexus)
  3. appliances 

This makes for a widely varying set of propositions to present to companies at very different points of entry, and should help broaden the opportunity base for Aster. There are certainly some questions:

  • What's the difference between a "data mart" and the "smaller data warehouse"? Aster quotes Gartner's Donald Feinberg about the latter in its press release. Perhaps Aster is choosing to ignore the data mart moniker - although it's also possible that it is saying the improved generalized analytics of SQL plus MapReduce make it less necessary to restrict subjects and dimensions and follow specific architectural models the way many data marts typically do. If so, that will prove to be an interesting debate.
  • Are fault-tolerance and availability now "table stakes" for appliances? Aster is claiming "99.99% uptime, with reduced troubleshooting costs." ParAccel has touted its relationship with EMC for enterprise-class "-abilities." Other ADBMS vendors will need to keep up their features - and their rhetoric - here.
  • Is "SQL plus MapReduce" better enough to be a difference maker? Aster says that its "integrated SQL/MapReduce framework for analytics and BI increases query performance by 9x or more when compared with other SQL-only data warehouse appliances in the market." It has produced SQL/MR benchmarks vs. standard SQL queries as part of a paper to be presented at the VLDB Conference in Lyon, France, this fall. The report is available here: http://www.asterdata.com/resources/downloads/whitepapers/sqlmr.pdf
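A concrete way to see the SQL-plus-MapReduce argument is sessionization, a canonical example of a query that is awkward in pure SQL but natural as a procedural function applied per user partition: assign a new session id whenever the gap between a user's successive clicks exceeds a timeout. The sketch below is my own plain-Python illustration of the concept, not Aster's SQL/MR API.

```python
# Illustrative sessionization: the kind of per-partition, order-dependent
# logic that MapReduce-style functions express more naturally than set-based
# SQL. Plain Python, for exposition only.
from itertools import groupby

def sessionize(clicks, timeout=30):
    """clicks: (user, timestamp) pairs. Returns (user, ts, session_id) rows."""
    out = []
    # Partition by user (the "map" key), process each partition in time order.
    for user, rows in groupby(sorted(clicks), key=lambda c: c[0]):
        last_ts, session = None, 0
        for _, ts in rows:
            if last_ts is not None and ts - last_ts > timeout:
                session += 1  # gap exceeded the timeout: new session begins
            out.append((user, ts, session))
            last_ts = ts
    return out

print(sessionize([("u1", 0), ("u1", 10), ("u1", 100), ("u2", 5)]))
# u1's third click (gap of 90 > 30) starts session 1; u2 gets its own session 0
```

Pushing a function like this into the database, next to the data and parallelized across partitions, is the performance claim Aster is making; the 9x figure is theirs, not something this toy demonstrates.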

Kudos to Aster for upping the heat, as well as shedding some light, in the emerging ADBMS wars. Aster opened a big door when it made MapReduce available to .NET, and no doubt some intriguing work will emerge from that community. Aster has a nice war chest to work with from its recent $17M Q1 financing round, and is putting it to work. So far the rhetoric has been aimed at Oracle, DB2, Teradata and Netezza. Easy targets. What about  Greenplum, Infobright, Kickfire, ParAccel, Sybase IQ, Vertica,...? It's going to be fun watching the smackdown ahead.


Posted July 21, 2009 10:06 AM

I recently sat down for a talk with Miriam Tuerk, CEO of Infobright - an open source, commodity hardware-based analytic database (ADBMS) vendor focused on the data warehousing market. Infobright is another of the leaders in the open source information management wave IT Market Strategy has been tracking. Founded in 2006, Infobright has assembled a remarkable team now committed to exploiting this economic model to reduce the startup costs of data warehousing. Like other open source players, MySQL-based Infobright has two versions: a Community Edition (ICE, whose community gathers at www.infobright.org) and an Enterprise Edition (IEE). This bifurcation allows it to distribute starter software broadly at minimal direct cost, then upsell; along the way, it gets to tap into the vibrant innovation provided by the user community that forms. As the product matures, such vendors fund the more hardened features large firms require by charging them for those added capabilities that they need. And now (July 7), Infobright has partnered with Jaspersoft for tighter integration with a report server and OLAP analysis.

Infobright is privately held, with roughly 50 employees in Canada, the US and Europe, where a development team in Poland does much of the heavy lifting. Investors including Sun and Flybridge Capital Partners injected a reported $10M into Infobright in 2008; the company doesn't discuss revenue, but considers funding "adequate through the end of 2010." I expect that it will seek additional funding rounds as its infrastructure buildout continues.

Infobright moved into general release under a GPL license in September 2008. With three product releases under their belt, ICE boasts nearly 10,000 downloads and the company claims that 2,000 of them are active community participants. There are now over 60 paying customers in 7 countries. The new "integrated virtual machine download" announced today includes: ICE; the JasperServer Community Edition for report creation, delivery and scheduling; and JasperAnalysis, an OLAP server.

Infobright has implemented a column-oriented data store, deployed atop MySQL, with columns divided into 65,536-row elements known as Data Packs, which are compressed as they are stored - 10:1 compression was an early claim, but the company says it frequently does far better. Statistics about the data (things like min/max, cardinality, etc.) are stored in a "Knowledge Grid" - essentially an indexing scheme, not unlike what vendors such as Illuminate use, that permits retrieval to be limited to only the data needed to resolve the specific query in question. Query tests in customer use cases routinely deliver sizable improvements in query times, as we have seen with other players in the new analytic DBMS space.
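The mechanism is easy to illustrate: if each pack carries min/max statistics, a query with a range predicate can prove many packs irrelevant without decompressing or scanning them. The sketch below is a minimal, hypothetical model of that pruning idea; the names and structure are mine, not Infobright's implementation.

```python
# Minimal sketch of Knowledge-Grid-style pack pruning (illustrative only):
# per-pack min/max statistics let a scan skip whole 65,536-row Data Packs
# whose value range cannot possibly satisfy the predicate.
def build_grid(column, pack_size=65536):
    """Split a column into packs, each stored with its min/max summary."""
    packs = [column[i:i + pack_size] for i in range(0, len(column), pack_size)]
    return [(min(p), max(p), p) for p in packs]

def scan_gt(grid, threshold):
    """Return values > threshold, scanning only packs that might contain them."""
    hits, packs_read = [], 0
    for lo, hi, pack in grid:
        if hi <= threshold:
            continue  # whole pack provably irrelevant: skipped unread
        packs_read += 1
        hits.extend(v for v in pack if v > threshold)
    return hits, packs_read

grid = build_grid(list(range(200000)))   # 4 packs of sorted sample data
hits, read = scan_gt(grid, 150000)
print(len(hits), read)  # 49999 matching values; only 2 of 4 packs scanned
```

Real data is rarely this neatly sorted, of course; the better the physical clustering, the more packs the statistics can eliminate, which is why "load and go" results vary by data distribution.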

Infobright loads data quite rapidly on commodity hardware, and asserts that load speed will remain constant regardless of raw data size as a result of the architecture. The MySQL loader can be replaced with the Infobright loader in IEE to ensure high-speed loads at scale. Infobright makes familiar assertions about "load and go"; certainly, with no careful designing of models, partitions and indexes, time to usage is significantly reduced. "Hardware setup and configuration can be done in a day," company marketing asserts. Infobright offers several different claims to scalability, including "to 50 Tb and more," inherits management tools and partnerships from its MySQL heritage, and thus also benefits from MySQL's ability to run on Linux, Solaris and Windows, and work with Ruby on Rails, Perl, Python, etc.

The firm's new CTO, Bob Zurek, who joined in Q2 2009, is an example of the seriousness Infobright brings to its commitment to enterprise-class offerings. Bob was most recently CTO and VP of Products at EnterpriseDB, after a distinguished career that includes stints at IBM, Ascential, Sybase and Powersoft. Partnerships are playing a key role, and Zurek's industry experience will no doubt have a big impact in working cooperatively with other OSS vendors and commercial ones. The company recently announced an "open source project for end-to-end business intelligence" with Jaspersoft's BI tools, ETL (Talend-based) and Infobright's DW at the MySQL conference. Shortly thereafter, it unveiled a hardware and software system for the deployment of BI with Pentaho, based on the Sun Fire X4275 storage server or the Sun Storage 7310 unified storage system. And today's announcement adds the JasperServer reporting and JasperAnalysis OLAP pieces for yet another configuration.

But announcements are not enough. True integration needs to be shown if the company wants to move into mainstream shops that don't want to do all the work themselves, and the degree of pre-integration is not clear just yet. To date, Infobright has signed up some 30 partners, and turning all of those technology deals into meaningful deliverables will take focus, experience, and some legwork to communicate successes. But the funding is there, the experience is in place, and Infobright joins the battle with some strong assets. Download ICE and check it out - it's worth a look.


Posted July 20, 2009 9:02 AM

Talend, a California-based open source data integration vendor with a development center in China, first shipped product in late 2006, and two and a half years later has established a strong, growing business as more and more firms attempt to build a relatively complete stack of open source data management software. With a recent $12M round of financing, Talend continues to build out its commercial infrastructure, and can be expected to raise its profile and continue its growth in a conservative market that is nonetheless aggressively pursuing information management technologies. Tight economics, prohibitively expensive licensing models based on data volume, and huge maintenance costs are transforming buyers' thinking about these products and opening the door for Talend and other open source players.

Talend claims 900,000 "core product" downloads have yielded 250,000 active (i.e., registered) users. And going from there to over 500 paying customers in less than two years makes a good story - especially when Talend asserts that a third of them come from the Fortune 1000. An Eclipse-based product upgrade mechanism makes routine registration well worth it, and no doubt helps account for the relatively high download-to-registered-user ratio. As users move up to a full, priced relationship, they get enterprise capabilities such as multi-user support and load balancing, tech support, etc. The products offer graphical, business-oriented data modeling, data profiling, metadata discovery, connectivity to most widely used systems and data sources - including SAS and SAP - cleansing capabilities, scheduling and more.

A surprise for me was Talend's assertion that half of its go-live projects - and a big piece of its differentiation - are in operational data integration (ODI) used for application upgrades, data migration and replication; the other 50% is considered BI by Talend. Based on research from IDC, TDWI and others, Talend is convinced that ODI represents a great market opportunity, and there is good reason to believe they're right. Its price advantage - no "data tax" based on volume, but rather a "number of active developers" pricing scheme - benefits greatly from this profile.

A paradox of ODI activities is that since they are less glamorous than BI-related projects, they are rarely staffed with visible, continuously employed specialists. Individual projects may be small and tactical, and are often assigned to staff who don't remain specialized in data migration, data quality, or other related disciplines. Skills and reusable methods and code are not as easy to find inside the organization. Enter open source, and a community model for collecting connectors and translation practices. Talend asserts that fully a third of its 400 connectors originated in the community - and are freely shared.

Partnerships are crucial in an integration-focused play, and Talend boasts a marquee list that includes open source stalwarts like Jaspersoft (who OEM the product), big brand partners such as Microsoft and Teradata, and system integrators such as Capgemini and Unisys.

Talend has formidable competitors - IBM and Oracle top the list. Fortunately, both have formidable prices too, and complex, massive offerings. Talend has been getting in under the radar a lot, and is likely to continue to until and unless the big firms start to rethink pricing. Nothing new here - the same model is being seen across the software industry as open source gathers credibility and momentum. The timing is perfect: spending constraints are tough, software asset management efforts are showing how much software is unused, and old-style licensing models are forcing companies to pay massive amounts of "maintenance" money for rarely used, rarely updated software. Talend is one of the leaders of the new wave, but they are not alone, and they will continue to benefit from the industry transformation they are helping to drive.

[Quick add: I just came across this nice piece published by James Governor in January, which includes some added nuances about the use of the community for localization, about the investors helping to fund Talend, and some European distribution and potential market prospects. Recommended reading.]


Posted July 15, 2009 2:41 PM

The near-decapitation (by acquisition) of the BI space in 2007-8 was perfectly timed for QlikTech, whose QlikView is rapidly becoming one of the leading independent products. This is hardly new; since its founding in Sweden in 1993, the company's timing has been unerring. Years of slow growth - "From 1993 to '99 we had a grand total of 5 customers, all in Sweden," Senior VP Anthony Deighton told me recently - ended abruptly with a funding round from a Swedish VC. That led to a revamped management team with an eye for growth, and in the next 6 years, QlikTech climbed to 1500 customers - still mostly in Sweden, though some were in Germany and a few were in the US. The next big bet was what Deighton calls "an over-investment in direct sales" based on a 2005 round of funding from some better-known VCs. Since then, the takeoff has been remarkable - a happy timing of product, platform and market. With $120 million in revenue and 50% growth in 2008, QlikView is reaping the benefits of effective timing, a conservative ramp that did not overreach, and a technology landscape that is paying off its visionary design. Now, with QlikView release 9.0, it's targeting enterprise scale, better performance and manageability, and mobile deployment. As the economy begins to recover and mobile platforms proliferate, it appears QlikTech's timing is once again dead on.

QlikView's vision has been matched by 64-bit hardware and operating systems that can leverage its simple proposition: memory is faster, and visual interfaces appeal to the long-sought business user by offering "associative analysis." QlikView is positioned "on the right side of Moore's law," Deighton likes to say, and its current lead positions it well for some time to come.

"Of course, there's an element of luck. Our architectural model - associative, in-memory analysis - was ahead of  the market in 1999. Memory was limited and slow, and processors were expensive. What we did correctly was see it and not jump too early."

In its second stage, QlikView's VC backing allowed it to prove its repeatable sales model and its vision of the market segment it would pursue for the next 5 years, ramping to local success. "We proved the sales model, and showed that we had a product that works," Deighton says. In 2005, a $12.5M venture round funded its international growth and ramp-up; today, with 10,500 customers and nearly half a million users in 92 countries, the firm is profitable and cash flow positive. QlikTech now has a formidable management team, 530 or more employees in 12 countries, and over 800 partners. Its awards fill pages; its customers and partners describe themselves as "fans." In a market where studies have shown 80% of BI implementations take more than 6 months, QlikView's built-in data integration and intuitive interface have opened many doors - as has its free download, "try-then-buy" model. QlikView has institutionalized a "Seeing Is Believing" model for its proof-of-concept activities, and has leveraged it to an impressive string of wins. The question of scale is comfortably in hand as customer stories involving terabytes of data - tens of millions of records or rows - and thousands of users continue to pile up.

By all accounts, QlikView's performance once in the door has anchored those accounts well. Half of its business is from existing customers. And it's a product business: 90% is licenses; very little is services, reinforcing the "ease of implementation" story, while it drives most services to partners. The average deployment time (not project time, which is longer, as these things will be) is a couple of weeks, and QlikView offers a 30-day money-back guarantee if it doesn't achieve a working production environment as promised. Design is still a significant challenge for all products in this category, though, and QlikView could certainly profit from some discovery-and-design modeling help.

QlikView's analytic interface is a revelation for first timers. The metaphor is focused on driving interaction. "Never tell the users they can't click on something," Deighton says. Typical displays show relevant data in context, showing grayed-out data that makes it clear what is NOT included in the display. As users move through styles and drilldowns, selections made follow them from tab to tab. For this observer, what was happening was "contextual analysis" - as new questions were asked and answered, the thread that led me there was not lost. I could backtrack and take a different tack, or even, with "trellis charts," look across a single dimension 3 different ways on the same chart to compare.
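The grayed-out display can be understood as a simple classification problem: given a selection on one field, every value of every other field either co-occurs with that selection (and stays clickable) or does not (and grays out). The sketch below is my own toy model of that associative state, not QlikTech's engine or API.

```python
# Hedged sketch of "associative" selection state (illustrative only): for a
# selection on one field, classify each other field's values as associated
# (appear in rows matching the selection) or excluded (would be grayed out).
def associate(rows, field, selected):
    """rows: list of dicts. Returns {other_field: (associated, excluded)}."""
    matching = [r for r in rows if r[field] in selected]
    result = {}
    for other in rows[0]:
        if other == field:
            continue
        all_vals = {r[other] for r in rows}          # everything the user could see
        assoc = {r[other] for r in matching}         # values consistent with selection
        result[other] = (assoc, all_vals - assoc)    # the rest gray out, not vanish
    return result

rows = [
    {"region": "EU", "product": "A"},
    {"region": "EU", "product": "B"},
    {"region": "US", "product": "C"},
]
state = associate(rows, "region", {"EU"})
print(state["product"])  # products A and B associated; C excluded (grayed)
```

Keeping the excluded values visible, rather than filtering them away, is what preserves the analytic "thread": the user always sees what the current selections rule out, and can backtrack from there.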

Recently, QlikTech has delivered QlikView 9, with more than 100 new features, including a new cloud deployment option via Amazon's Elastic Compute Cloud (EC2) platform - although not a SaaS model per se, as David Raab points out here: as more of the market shifts its pricing philosophy, this seems to be one place where QlikView's timing is not ahead of the curve. Still, the EC2 option lowers barriers to entry still further - even if a prospect doesn't have the hardware and storage for a new system, it can prototype in the cloud, and even opt to leave it there. In addition, QlikView Personal Edition is free, allowing QlikView Desktop to compete with the new open source competitors springing up.

Scalability and performance are also key directions, as larger deployments drive changes to the product. The previous limit of two billion rows per data set has been removed. Continuous update, load balancing features and optimization techniques help to ensure that complex queries run efficiently in clustered environments. A new control panel shows resource usage for all servers and components. QlikView has also added several new visualizations with QlikView 9, including charting features such as "spark lines", "whiskers", the trellis charts mentioned above, and live chart backgrounds. New global search functionality allows a user to search every QlikView field simultaneously with one click.

Finally, one of the sexiest introductions is the iPhone version. It makes effective use of familiar iPhone metaphors like coverflow, and supports GPS functionality, which should be a boon for a new class of mobile applications. (Since the iPhone version defaults to QlikTech's demo server, you can see it in action immediately, by contrast to some other mobile BI products I've looked at.) The demo apps aren't perfect; I played with the wine selector for a while, and when I got to a detailed list I was dismayed that I couldn't seem to zoom in using the usual iPhone gestures to read it better. But that's execution, not product, I believe. In addition to the iPhone, other phones capable of running Java, like BlackBerry and Symbian-based smartphones, have their own apps. The timing is right; I suspect this may become another takeoff market for QlikView.


Posted July 10, 2009 1:02 PM

Recently, ParAccel published a TPC-H benchmark, and I said here that it was a coup that ought to get them significant attention. The blizzard of discussion that ensued was no doubt gratifying for ParAccel - Google reported 182 hits for "the past week" for them as of 6/28.

Now, Google hits - and visibility in general - aren't everything. In a relatively crowded field, ParAccel will need more than just a fairly well-received press release - they will need money. Money to drive marketing, money to turn interest into leads, and money to fund a sales and field force to convert those leads into business. The good news? They just got some. On June 29th the firm announced a C round of venture capital has been secured, to the tune of $22 million led by Menlo Ventures; ParAccel's previous investors participated as well.

Obviously, the substantial new funding is good news; in addition to the availability of funds, it implies that due diligence by some savvy folks concluded that even in difficult times, ParAccel is worth investing in. Menlo's John Jarve will join ParAccel's board. I talked to CEO David Ehrlich about the new round and what he expects to do with the funds. He told me:

"We went slowly until the product was ready. Our feature/functionality is fairly complete, and we've demonstrated that our price/performance is superior. We think we have an offering that for most environments is the leader. Now we can begin to ramp it up."

Fighting words, but Ehrlich is spoiling to get into it. "Most of our activity has been POCs so far, and we've never been outperformed in a customer environment," he asserted.  He says that ParAccel will continue to invest in product; IT Market Strategy expects to see a new product release coming up very soon. But most of the acceleration in the near term will be in Sales and in the field organization, he says.

"Till very recently, we had one salesperson - we wanted to take business at a measured pace. Now we can scale the organization so we can take care of customers the way we want to."

ParAccel has already gone from that single salesperson to 3 - they are represented in eastern, western and central North America now. "The only throttle is that we set a high bar - we spoke to over a hundred for the two we hired," Ehrlich says. They have several more field spots open: SEs and field reps. They will stay domestically focused for now, although Ehrlich has seen interest from overseas distributors. The Support organization is robust right now - ParAccel has 10% of its 60-person headcount in support-related positions, and Ehrlich believes they can scale through the next couple of dozen customers without much new hiring.

But for a growing firm, there are certainly needs beyond the sales and field organization. In the same period ParAccel saw its 182 hits, Vertica registered 222 with no announcements. Greenplum saw 193 following a launch effort the week before, and Infobright got 526 on the heels of an announcement of its own. Sybase IQ, the leader in the analytic DBMS space, had 916. Leveraging the momentum will take execution and a little spending of ParAccel's own. Driving visibility, and turning it into leads, will clearly need to be a focal point.

Partner relationships will also become more significant. Ehrlich is very upbeat about ParAccel's continuing EMC partnership: "Our patent-pending SAN integration gives us an advantage. EMC (and other storage vendors) have competition to watch for from Netezza, Teradata, and Oracle's Exadata. We're a good option for any SAN vendor to deliver a high performance analytic DBMS with all the enterprise characteristics." The technology helped drive the Price Chopper win EMC and ParAccel recently won.

So, the game's afoot. Several hot new companies, all only a few years in the market, will be duking it out for mindshare and customers in the suddenly exciting analytic DBMS space. Looks like it's going to be a hot summer.


Posted July 6, 2009 1:38 PM