In the last 24 months, there has been a trend toward acquiring analytical database vendors to augment an established DBMS vendor's existing product line. For example, the following mergers and acquisitions were compiled by Doug Henschen in April:
- EMC buys Greenplum July 2010
- IBM buys Netezza November 2010
- Hewlett-Packard buys Vertica March 2011
- Teradata buys Aster Data April 2011
Many of these acquisitions were driven by the reality that you need to use the right DBMS architecture for the job rather than taking a one-size-fits-all approach. However, the question then becomes: how do you bring that data together once it resides on those separate platforms? One solution is to use the concept of data virtualization.
This week Composite will release its next-generation data virtualization platform: Composite Information Server 6.0. The platform allows organizations to make data decisions based on the best platform for the job rather than pushing all data onto a single platform.
Bring Big Data into the Fold
One of the best uses of the enhanced platform is the ability to virtualize big data sources like Hadoop, Netezza and SAP into a seamless environment.
Using the Composite "optimizer" functionality, organizations can take advantage of the relatively new big-data processing environments without delaying the "time to value" of bringing those new data sources into existing implementations. This will be particularly important as organizations begin to ingest data sets like social media interactions, RFID sensor information, and other big-data sources that haven't matured sufficiently to be included in existing data environments, but still hold excellent value for the organization.
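To make the idea concrete, here is a minimal sketch of what data virtualization does conceptually: a federated "virtual view" that joins records from two separate sources at query time, leaving the data where it lives instead of copying it into one warehouse. All names and structures here are hypothetical stand-ins, not Composite's actual API.

```python
# Source 1: stands in for a relational billing system.
billing = [
    {"customer_id": 1, "balance": 120.50},
    {"customer_id": 2, "balance": 0.00},
]

# Source 2: stands in for a big-data store of social-media interactions.
interactions = [
    {"customer_id": 1, "mentions": 14},
    {"customer_id": 2, "mentions": 3},
]

def virtual_customer_view(customer_id):
    """Join both sources on demand; neither source is moved or duplicated."""
    bill = next(r for r in billing if r["customer_id"] == customer_id)
    social = next(r for r in interactions if r["customer_id"] == customer_id)
    return {**bill, **social}

print(virtual_customer_view(1))
# {'customer_id': 1, 'balance': 120.5, 'mentions': 14}
```

A real virtualization layer adds query optimization, pushdown to the underlying engines, and caching on top of this basic join-at-query-time pattern.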
Telecom Take: Use the Right Tool
As telecom organizations move to integrate multiple data sources to enable the "single view of the customer" associated with customer experience management, and to spread customer support across centralized call centers, they will need a much more robust ability to keep data consistent and timely across those locations.
For customer experience management, telecoms will need proper data virtualization to avoid the age-old question from callers to the call center:
“Shouldn’t you already know about my orders and account information?”
For call centers, a robust virtualization environment provides flexible access to similar data sets across operational (e.g., billing), analytical (e.g., fraud management) and external (e.g., credit reports) sources. That allows flexible scheduling of call center resources not only in one location but across many, without customers having to hear:
“Sorry I don’t have that in my system…”
All in all, I believe the continued advances in the Composite virtualization suite make it one of the better options for telecoms to overcome the legacy (network, billing) and "next generation" (social, geo-spatial) data silos that seem to impact telecom organizations more than most.
Posted June 6, 2011 8:22 AM