
Blog: John Myers

John Myers

Hey all-

Welcome to my blog. The fine folks at the BeyeNETWORK™ have provided me with this forum to offer opinion and insight into the worlds of telecommunications (telecom) and business activity monitoring (BAM). But as with any blog, I am sure that we (yes, we... since blogging is a "team sport"...) will explore other tangents that intersect the concepts of telecom and BAM.

In this world of "Crossfire" intellectual engagement (i.e., I yell louder, therefore I win the argument), I will try to offer my opinion in a constructive manner. If I truly dislike a concept, I will do my best to offer an alternative as opposed to simply attempting to prove my point by disproving someone else's. I ask that people who post to this blog follow my lead.

Let the games begin....

About the author >

John Myers is a senior analyst in the business intelligence (BI) practice at Enterprise Management Associates (EMA). In this role, John delivers comprehensive coverage of the business intelligence and data warehouse industry with a focus on database management, data integration, data visualization, and process management solutions. Prior to joining EMA, John spent over ten years working with business analytics implementations associated with the telecommunications industry.

John may be contacted by email at JMyers@enterprisemanagement.com.

Editor's note: More telecom articles, resources, news and events are available in the BeyeNETWORK's Telecom Channel. Be sure to visit today!

Recently in Business Intelligence Category

In the last 24 months, there has been a trend toward acquiring analytical database vendors to augment established DBMS product lines.  For example, the following mergers and acquisitions were compiled by Doug Henschen in April:

  • EMC buys Greenplum July 2010
  • IBM buys Netezza November 2010
  • Hewlett-Packard buys Vertica March 2011
  • Teradata buys Aster Data April 2011

Many of these acquisitions were driven by the reality that you need the correct DBMS architecture for the job at hand rather than a one-size-fits-all approach.  However, the question then becomes: how do you bring that data together once it resides on those separate platforms?  One solution is the concept of data virtualization.

This week Composite will release its next-generation platform for data virtualization: Composite Information Server 6.0.  This platform allows organizations to make data decisions based on the best platform for the job rather than pushing all data to a particular platform.

Bring Big Data into the Fold

One of the best uses of the enhanced platform is the ability to virtualize big data sources like Hadoop, Netezza and SAP into a seamless environment.

Using the Composite "optimizer" functionality, organizations can take advantage of relatively new big-data processing environments without delaying the "time to value" of bringing those new data sources into existing implementations. This will be particularly important as organizations begin to ingest data sets like social media interactions, RFID sensor information and other big-data sources that haven't matured sufficiently to be included in existing data environments, but still hold excellent value to the organization.
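To make the virtualization idea concrete, here is a toy sketch of the pattern described above: queries are routed to whichever backend physically holds each dataset, and results are merged into one logical view. All class, table and dataset names here are invented for illustration; this is not Composite's actual API, and a real optimizer would push work down to the best-suited platform rather than joining naively in the middle tier.

```python
class Backend:
    """One physical platform (e.g. a warehouse or a Hadoop cluster)."""
    def __init__(self, name, tables):
        self.name = name
        self.tables = tables  # table name -> list of row dicts

    def query(self, table):
        return self.tables[table]

class VirtualLayer:
    """Presents many backends as a single logical catalog."""
    def __init__(self, backends):
        # Build a routing map: table name -> owning backend.
        self.route = {t: b for b in backends for t in b.tables}

    def select(self, table):
        return self.route[table].query(table)

    def join(self, left, right, key):
        # Naive hash join across two backends; a production engine
        # would optimize where this work actually runs.
        index = {row[key]: row for row in self.select(right)}
        return [{**l, **index[l[key]]}
                for l in self.select(left) if l[key] in index]

warehouse = Backend("edw", {"customers": [{"id": 1, "name": "Acme"}]})
hadoop = Backend("hadoop", {"clicks": [{"id": 1, "visits": 42}]})
virtual = VirtualLayer([warehouse, hadoop])
print(virtual.join("customers", "clicks", "id"))
```

The point of the sketch is the routing map: consumers ask the virtual layer for "customers" and "clicks" without knowing, or caring, which platform serves each one.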

Telecom Take: Use the Right Tool

As telecom organizations move to integrate multiple data sources to enable the "single view of the customer" associated with customer experience management, and spread customer support across centralized call centers, they will need a much more robust ability to keep consistent and timely data available across those locations.

For customer experience management, telecoms will need proper data virtualization to avoid the age-old question heard on calls to the call center:

“Shouldn’t you already know about my orders and account information?”

For call centers to provide flexible access to similar data sets across operational (e.g., billing), analytical (e.g., fraud management) and external (e.g., credit reports) data sources, a robust virtualization environment will allow flexible scheduling of call center resources, not only in one location but across many, without customers having to hear:

“Sorry I don’t have that in my system…”

All in all, I believe that the continued advances in the Composite virtualization suite make it one of the better options for telecoms to overcome the legacy (network, billing) and ‘next generation’ (social, geo-spatial) data silos that seem to impact telecom organizations more than others.

Posted June 6, 2011 8:22 AM

As larger data sets start to take root across various industries, it is going to be important to put those “big-data” results into a more manageable picture for end users and analysts.  Many of the existing “big-data” end users are already familiar with the data sets and how they wish to look at those data sets.

However, the true value of “big-data” or analytics on “big-data” is going to be presenting the information to the end user who may still be thinking about analytics in “small-data” ( … or relatively small data… ) terms.

For example, new "big-data" analytics provide a "richness" of information and an increase in dimensionality that "small-data" systems cannot match.  Yet many users in marketing or product management may not understand how to make the leap from "big-data" aggregates to "big-data" detail, because they lack the context of the "big-data" detail(s) they are looking at.

Mixing and Matching with Big-Data

The twin challenge associated with the ability to handle and analyze "big-data" is the ability to put that analysis into context.  "Big-data" often refers to sensor, geographic or application data.  However, not many people in end-user/analyst communities have the ability to make the leap from those "big-data" details to a final "so what?" picture.

This week Tableau announced the next addition to its business intelligence / data visualization product line, Tableau 6, which supports the ability to "blend" data sets into end-user visualizations that tell a story marketing and product management will understand, producing that "AHA!" moment.  While data visualization itself is nothing "new," the ability to perform at "big-data" scale will be the key aspect.  If a visualization takes too long, marketing analysts and product management teams will lose interest and fall back to less detailed analysis tools.
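The "blending" idea can be sketched in a few lines: aggregate the large detail set first, then join the much smaller result to a friendly dimension table for presentation. The data and names below are invented for illustration and say nothing about Tableau's internals; they only show why blending keeps the expensive work on the detail side and the human-readable labels on the small side.

```python
from collections import defaultdict

# "Big-data" detail: one row per network event (millions in practice).
events = [
    {"product_id": 1, "bytes": 100},
    {"product_id": 1, "bytes": 250},
    {"product_id": 2, "bytes": 900},
]

# Small dimension table a marketing analyst actually understands.
products = {1: "SMS bundle", 2: "Data plan"}

# Step 1: aggregate the detail down to one row per product.
totals = defaultdict(int)
for e in events:
    totals[e["product_id"]] += e["bytes"]

# Step 2: blend the tiny aggregate with the friendly dimension names.
blended = {products[pid]: total for pid, total in totals.items()}
print(blended)  # {'SMS bundle': 350, 'Data plan': 900}
```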

Telecom Take

As telecom data volumes rocket upward from social media, location-based services and overall smartphone usage, "big-data" is going to hit telecom BI/DW teams head on.  And while those teams struggle with the ingestion of the data, end users are going to demand analytics and visualization tools that don't hold back their "day jobs" from being completed…

Data visualization tools like Tableau's new offering will provide the ability to match the potential of the data with the promise of the analysis.

How is your telecom BI/DW team positioned to meet end user requirements for visualizing big-data? Strictly using aggregates? or big-data detail?

Post your comments below or email (John.Myers@BlueBuffaloGroup.com) / twitter (@BlueBuffaloGrp) me directly.

Posted November 10, 2010 3:00 PM

At this week's TDWI Conference in Orlando, the focus is on Emerging Technologies.  The Monday keynote presentation focused on the ability of an organization to use "nimble" development practices ( ... as opposed to Agile methodologies... ) and cloud-based technologies to enable quick results.

"When the CEO comes knocking..."

Kevin Rooney's keynote presentation focused on the results of an effort in which a company CEO wanted to understand his company's position within particular insurance markets and how to improve that position in the marketplace via customized/optimized price points.  However, many questions "loomed" over the effort:

  • Could all the publicly available data be used effectively?
  • Was there value in the effort?

Rooney developed a nimble response team within his IT organization that tackled the questions of whether the data was available and feasible to use, and whether the business model was possible.  NOTE - Rooney used a technique that Harvard Business Review has advocated for reducing the tension between existing "legacy" teams and breakthrough innovation teams.

Rooney also reached out to the team at Kognitio for an analytical platform that would allow for an initial "proof of value" and a minimal capital expense (capex) rollout into production as well as a powerful analytical platform to perform the types of queries required of the effort.

In this way, Rooney linked a nimble development team with a powerful and flexible analytical engine to deliver a competitive-advantage application for his CEO in a timeframe that allowed his firm to capitalize on the opportunity.

Telecom Take

With many of the major telecoms stating the goal of utilizing metered billing plans in the future, telecom organizations need to be flexible in their approach to understanding how those billing models will impact profits.  It will no longer be appropriate to set metered or utility plans and then 'see how they do'.  Telecoms will need to run "what if" management scenarios, via either descriptive or predictive analytics, to determine which plans will be profitable and which will not.
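A "what if" exercise of the kind described above can be as simple as replaying a sampled usage distribution through candidate plan shapes and comparing projected revenue. The plan parameters and usage numbers below are invented for the sketch; a real model would draw usage from billing-system history and add cost and churn terms.

```python
def plan_revenue(usages_mb, included_mb, base_fee, overage_per_mb):
    """Projected revenue for one billing cycle across a set of subscribers:
    each pays the base fee plus a per-MB charge on usage over the bundle."""
    return sum(base_fee + max(0, u - included_mb) * overage_per_mb
               for u in usages_mb)

# Sampled (hypothetical) monthly usage for five subscribers, in MB.
usage = [120, 800, 1500, 300, 2200]

# Two candidate metered plans to compare side by side.
scenarios = {
    "flat-1GB": plan_revenue(usage, 1024, 30.0, 0.05),
    "flat-2GB": plan_revenue(usage, 2048, 40.0, 0.05),
}
for name, revenue in sorted(scenarios.items()):
    print(f"{name}: ${revenue:.2f}")
```

Even a model this crude answers the question the post raises: you can rank plans before launch instead of setting them live and 'seeing how they do'.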

NOTE - BT has a long history of doing this type of analysis.  However, recent developments relating to increased smartphone usage and the need for more flexible pricing models will drive an increased need for this type of work. 

Powerful analytical platforms like Kognitio will be part of the solution.  However, it will be forward-looking analyst organizations that make these solutions possible.  Those analyst organizations, together with telecom BI/DW teams, can utilize such tools within a nimble analysis cycle to implement valuable projects. 

How is your telecom organization handling flexible metered billing situations? Reactively with spreadsheets or proactively with "what if" models?

Post your comments below or email (John.Myers@BlueBuffaloGroup.com) / twitter (@BlueBuffaloGrp) me directly.

Posted November 9, 2010 12:15 PM

Everyone is talking about a data explosion:

  • RFID information in Retail environments
  • Social media interactions via wireless
  • Behavioral event data via eCommerce
  • etc

All of this is leading toward the era of “big-data”.  Most would say that the “big-data” era is already upon us.  Some would say that future data loads will dwarf current requirements just as current numbers dwarf the past 5–10 years.

However, the key to the “big-data” era will not be in the simple accumulation of data in business intelligence and data warehousing (BI/DW) environments, but the utilization of that data across the organization.

Deep Analytics on Big Data

This week The Data Warehousing Institute (TDWI) held its initial solution summit on the topic of "big-data" in San Diego: Deep Analytics for Big Data.  It was a gathering of decision makers and leading vendors to discuss "big-data" and the future of analytics associated with those "big-data" BI/DW environments.

Chief among the discussion topics was how to make the correct decisions on:

  • Building “big-data” environments in a greenfield environment
  • Transitioning from existing BI/DW environments to support “big-data”
  • Hybrids to support existing datasets and the “new”, larger requirements

Solutions.  Not just Problems.

During the solution summit, customer implementation case studies were provided by vendors that highlighted the issues with “big data” BI/DW engagements.

Telecom Take

Telecom organizations are on the front line of "big-data" analytics.  Wireless voice, SMS and IP-based product data are at the core of the "new" business models for both carriers and organizations looking to capitalize on new business models…

Think about it… All the information that Google uses for its targeted web-based advertising transits a telecom network at some point.  iTunes would not be possible without the networks that pass content to the tablet or smartphone.  Carriers need to use their knowledge of network events, linked with customer information, to get a leg up on companies like Google and Apple.

Yet, carriers should exercise good judgment with all that “big-data”.  Privacy laws associated with customer information are only going to become more stringent in the future as uses like telematics and location-based services take hold.

How is your telecom organization tackling “big-data”? Reactively or with strategy?

Post your comments below or email (John.Myers@BlueBuffaloGroup.com) / twitter (@BlueBuffaloGrp) me directly.

Posted October 6, 2010 7:00 PM

In a recent keynote address, Marc Demarest talked about the need for increased decisioning associated with “big data”.  Whether it be complex event processing (CEP) or streaming analytics, the ability to make timely decisions on the analysis of “big data” sets is limited when you place a human element in the critical path.  Not only will bottlenecks occur, but more than likely the data will move so fast that no decisions will be made.

Rules to Live By

Being able to automate the decisions that flow from "big data" analysis is going to be the key for many industries.  Business intelligence/data warehouse (BI/DW) professionals will not be able to route every alarm or workflow through a human analyst or operational team.  This is because 8am-6pm time windows will not be sufficient for the decisions that need to be made, and because not every human in the loop will know what to do or the extent of what needs to be accomplished.

Steve Pratt of HP's Business Intelligence Solutions group presented at the recent TDWI Deep Analytics for Big Data Solution Summit on the need to automate decisions in the healthcare industry.  His reasoning was that the complexity of healthcare decisions, and the need to make timely decisions at the point of interaction, is key to improving the quality of healthcare and reducing its cost.  Improved quality would come from leveraging standard practices and from providing the proper care "further up the food chain," thus eliminating rework later on.

A common situation in pharmacies exemplifies these concepts.  With each prescription that a pharmacist fills, analysis and decisions need to be made about standard medical practices (e.g., drug interactions) and standard business practices (e.g., payments, deductibles).  By automating this analysis at the point of sale, healthcare can become safer and less expensive. 
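A toy rules engine in the spirit of the pharmacy example might look like the sketch below: each rule inspects a prescription at the point of sale and either passes it or flags it, with no human in the critical path. The drug names, rules and record fields are all invented for illustration, and this is deliberately far simpler than production clinical decisioning.

```python
# Known-bad drug pairs; order-independent, hence frozensets.
INTERACTIONS = {frozenset({"warfarin", "aspirin"})}

def check_interactions(rx):
    """Flag the new drug if it conflicts with anything the patient takes."""
    for med in rx["patient_meds"]:
        if frozenset({rx["drug"], med}) in INTERACTIONS:
            return f"interaction: {rx['drug']} + {med}"
    return None

def check_coverage(rx):
    """Flag prescriptions whose copay exceeds what the payer authorized."""
    if rx["copay_due"] > rx["copay_authorized"]:
        return "payment: copay exceeds authorization"
    return None

RULES = [check_interactions, check_coverage]

def decide(rx):
    """Return 'approve', or the first flag an automated rule raises."""
    for rule in RULES:
        flag = rule(rx)
        if flag:
            return flag
    return "approve"

rx = {"drug": "aspirin", "patient_meds": ["warfarin"],
      "copay_due": 10, "copay_authorized": 25}
print(decide(rx))  # flags the warfarin/aspirin interaction
```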

However, this does not come without cost.  Instituting these decisions is not as simple as implementing a database trigger; the implementation is more than an "if-then" statement, and the automated analysis and related decision often involve more than a single action path.  Issues of this complexity take more time to implement than other types of analysis.  Pratt also presented an analysis of where automated decisions fit on a time-to-value graph.


Telecom Take

For telecom organizations, the stakes are not the same as they would be in healthcare.  However, automated decisions on "big data" have similar needs; both telecom costs and revenues will be impacted.

Network health will take on greater importance as enterprise-level service level agreements (SLAs) move toward consumer relationships.  Both fiber-to-the-home (FTTH) and wireless connectivity may soon carry up-time/connectivity obligations as more and more aspects of daily life depend on IP-based connectivity.  Imagine the issues associated with FTTH downtime on IPTV products during special events like the Super Bowl.

Product pricing and availability are already moving at speeds that humans have a hard time comprehending.  In some African nations, pre-paid wireless revenue models are moving toward a per-tower pricing structure.  Imagine attempting to control, or even worse validate, call pricing at the tower level using primarily human-based decision controls.
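Automating the validation side of tower-level pricing can be sketched simply: recompute each call's expected charge from the tower's rate card and flag any record that disagrees, so only exceptions ever reach a human. The rate card and call records below are invented for illustration.

```python
# Hypothetical per-minute rate card, keyed by tower.
TOWER_RATES = {"T1": 0.10, "T2": 0.07}

calls = [
    {"tower": "T1", "minutes": 5, "charged": 0.50},
    {"tower": "T2", "minutes": 3, "charged": 0.30},  # rate card says 0.21
]

def mispriced(calls, rates, tolerance=1e-6):
    """Return the call records whose charge disagrees with the rate card."""
    return [c for c in calls
            if abs(c["charged"] - rates[c["tower"]] * c["minutes"]) > tolerance]

# Only the exceptions are surfaced for human review.
for bad in mispriced(calls, TOWER_RATES):
    print("flag:", bad)
```

This is the pattern the post argues for: the machine checks every record, and people look only at what the rules flag.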

Which decisions in your telecom environment are managed with automated decisions?

Post your comments below or email (John.Myers@BlueBuffaloGroup.com) / twitter (@BlueBuffaloGrp) me directly.

Posted October 6, 2010 6:00 PM