Blog: Lou Agosta

Greetings and welcome to my blog focusing on reengineering healthcare using information technology. The commitment is to provide an engaging mixture of brainstorming, blue-sky speculation and business intelligence vision with real-world experiences – including those reported by you, the reader-participant – about what works and what doesn't in using healthcare information technology (HIT) to optimize consumer, provider and payer processes in healthcare. Keeping in mind that sometimes a scalpel, not a hammer, is the tool of choice, the approach is to take a stand for new possibilities in the face of entrenched mediocrity, to do so without tilting at windmills, and to follow the line of least resistance to getting the job done – a healthcare system that works for us all. So let me invite you to HIT me with your best shot at LAgosta@acm.org.

About the author >

Lou Agosta is an independent industry analyst, specializing in data warehousing, data mining and data quality. A former industry analyst at Giga Information Group, Agosta has published extensively on industry trends in data warehousing, business and information technology. He is currently focusing on the challenge of transforming America’s healthcare system using information technology (HIT). He can be reached at LAgosta@acm.org.

Editor's Note: More articles, resources, and events are available in Lou's BeyeNETWORK Expert Channel. Be sure to visit today!

Recently in the Open Source Category

I had a chance to talk with Yves de Montcheuil, Talend's VP of Marketing, about current events at the company and its vision for the future.

Talend addresses data integration across a diverse array of industry verticals. Its inroads in healthcare will be of interest to readers of this blog. As noted elsewhere, healthcare is a data integration challenge (healthcare data integration). For example, at Children's Hospital and Medical Center of Omaha (NE), heterogeneous systems are the order of the day. The ambulatory EMR generates tons of documents, which need to be added to the hospital's legal medical record system, MedPlus ChartMaxx. On occasion, some of those documents error out before being captured to the patient's chart in ChartMaxx. This clinical information impacts clinician decision making and must be filed to the appropriate patient's record in a timely manner, supporting patient care quality. Talend synchronizes such processes across clinical systems. It provides data transformations, notifications and, in this case, exception processing, furnishing a level of functionality that previously required a larger and more expensive ETL tool from a larger and more expensive software vendor. This is the tip of the iceberg: Talend is now the enterprise standard there for data integration and data quality.

This is obviously also the process in which to perform data quality activities - data profiling, data validation, and data correction. Data validation occurs inside the data stream, and any suspect data is flagged and included in a report that is then processed for reconciliation. The ability to perform data quality controls and corrections across these systems makes the processing of data faster and smoother. It should be noted that, although I drilled down on this example, Talend has numerous high-profile wins in healthcare (accessible on its web site here).
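To make that pattern concrete, here is a minimal sketch of in-stream validation with exception flagging. It is written in Python with invented field names and rules purely for illustration - it is not Talend's generated code or the hospital's actual job design - but it shows the shape of the process: clean records flow on to the chart, suspect records land in a reconciliation report.

```python
from datetime import datetime

# Hypothetical EMR documents headed for the legal medical record system.
records = [
    {"patient_id": "MRN001234", "doc_type": "progress_note", "doc_date": "2009-11-30"},
    {"patient_id": "", "doc_type": "lab_result", "doc_date": "2009-12-01"},
    {"patient_id": "MRN005678", "doc_type": "discharge_summary", "doc_date": "12/01/09"},
]

def validate(record):
    """Return a list of rule violations; an empty list means the record is clean."""
    problems = []
    if not record["patient_id"].startswith("MRN"):
        problems.append("missing or malformed patient identifier")
    try:
        datetime.strptime(record["doc_date"], "%Y-%m-%d")
    except ValueError:
        problems.append("document date not in ISO format")
    return problems

clean, suspect = [], []
for rec in records:
    issues = validate(rec)
    (clean if not issues else suspect).append((rec, issues))

# Clean records continue on to the patient's chart; suspect records are
# flagged into a reconciliation report for follow-up.
for rec, _ in clean:
    print("file to chart:", rec["patient_id"], rec["doc_type"])
for rec, issues in suspect:
    print("reconciliation report:", rec["doc_type"], "->", "; ".join(issues))
```

In a data integration tool this routing is typically configured rather than hand-coded, with the reject flow feeding the reconciliation report, but the division of labor - validate in the stream, flag exceptions, reconcile later - is the same.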


Taking a strategy from the playbook of its larger competitors, but without the pricing markup, Talend is developing a platform that includes data quality in the form of Talend Data Profiler and Talend Data Quality - the latter, of course, actually able to validate and correct the errors surfaced. The obvious question: what is the next logical step?

Several possibilities are available. However, the one engaged by Talend - and it's a good one - is the announcement (here) of the acquisition of a master data management (MDM) software firm, Amalto Technologies, and plans to make it a part of its open source distribution in 2010. This is a logical move. Data integration and data quality (rationalization) are on the critical path to a consistent, unified view of customers, products, providers, and whatever master data dimensions turn you on. The data warehouse is routinely referred to as a single version of the truth. Now it turns out that there is no single version of data warehousing truth without a single version of customer, product, location, and calendar (and so on) truth to support the data warehouse. (This deserves a whole post in itself, so please stand by for an update.)
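To see why master data sits on that critical path, consider a minimal sketch - Python, with hypothetical provider records and a deliberately simple survivorship rule, not Amalto's or Talend's MDM logic - of merging duplicate source records into a single golden record that a warehouse dimension can reference:

```python
# The same provider appears under different spellings in two source systems,
# so any warehouse join on the raw name splits one "truth" into two.
source_a = [{"npi": "1234567890", "name": "Dr. Jane Q. Smith", "specialty": "Cardiology"}]
source_b = [{"npi": "1234567890", "name": "SMITH, JANE", "specialty": None}]

def golden_record(records):
    """Merge records sharing a key, keeping the most complete value per field."""
    merged = {}
    for rec in records:
        for field, value in rec.items():
            if value and (field not in merged or len(str(value)) > len(str(merged[field]))):
                merged[field] = value
    return merged

# Group source records by a shared key (here the NPI), then survive one record per key.
master = {}
for rec in source_a + source_b:
    master.setdefault(rec["npi"], []).append(rec)

provider_dimension = [golden_record(recs) for recs in master.values()]
print(provider_dimension)
# [{'npi': '1234567890', 'name': 'Dr. Jane Q. Smith', 'specialty': 'Cardiology'}]
```

Real MDM adds fuzzy matching, stewardship workflows, and hierarchies, but the principle is the same: one agreed-upon record per provider, customer, or product, or the "single version of the truth" splinters at the dimension level.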

While the future is uncertain, I am betting on the success of Talend for several reasons. First, the approach at Talend - and open source software in general - simplifies the software acquisition process (and this quite apart from any price consideration). Instead of having to negotiate with increasingly stressed out (and scarce) sales staff, who need to qualify you as a buyer with $250K or $500K to invest, prospects set their own agenda, downloading the software and building a prototype at their own pace. If you like the result and want to scale up - and comments about the quality of the software are favorable, though, heaven knows, like any complex artifact, it has its list of bug fixes - then a formal open source distribution is available, for a fee, of course, with a rigorous, formal service level agreement and support.

Second, according to Gartner's November 25, 2009 Magic Quadrant for Data Integration, available on the Talend web site for a simple registration, Talend has some 800 customers. I have not verified the accuracy of this figure, though there are logos aplenty on the Talend web site, including many in healthcare, and all the usual disclaimers apply. Talend is justifiably proud and is engaging in a bit of boasting here as open source gets the recognition it has for some time deserved.

Third, Talend is turning the crank - in a positive sense of the phrase - with a short cycle for enhancements, currently every six months or so. With a relatively new and emerging product, this is most appropriate, though I expect that to slow as functionality reaches a dynamic equilibrium a couple of years from now. There are some sixty developers in China - employees of Talend, not outsourced developers - reporting to a smaller design/development team of some 15 architects in France. Leaving aside the formal development of the defined distribution for the moment, the open source community provides the largest focus group you can imagine for collecting and vetting requests and requirements.

As in so many areas of the software economy, Talend is changing the economics of data integration - and soon MDM - in a way that benefits end-user enterprises. Watch for the success of this model to propagate itself virally - and openly - into other areas of software development. Please let me hear from you about your experiences with Talend, data integration, and open source in all its forms.


Posted December 11, 2009 9:47 AM
Permalink | 2 Comments |

"VistA" is the Veterans Information System Technology Architecture, an open source, run-your-hospital software platform that, defying all expectations, has been a runaway success in bringing order out of chaos, reducing costs, and promoting best practices in the Veterans Administration Healthcare System (which serves some six million former service men and women). VistA was - and still is - a strong candidate as a model to be scaled up and out on a nation-wide basis.

 

This is surprising for at least three reasons. First, healthcare is a risk-averse industry - rather like the airlines in having no room for operational error - yet this is an open source software success story. Second, MUMPS as a file system and data store is not exactly the most innovative approach. It has a reputation for being blazingly fast at handling data, but very challenging in terms of data access: the speed at the front end is traded off against difficult data access and delivery at the back end. How about a standard relational database with in-memory caching? Or a column-oriented data store for analytics? In short, these are emerging possibilities, and Oracle and InterSystems Caché are apparently coming on stream as options. Third, VistA offers over 130 clinical modules. It is in production at 150 medical centers and 850 clinics, and supports 15,000 physicians. Some 85% of all physicians in practice are exposed to VistA. From a governance and system development point of view, it is hard to know what to do. Suppose you require a laboratory system - what are you going to do? Propose that the developers start coding and trust that the rest will spontaneously evolve by variation and natural selection of the code itself in the course of what? Twelve months? Eighteen months? In a business where large software implementations have a history of being routinely over budget and behind schedule, you just might want to take a flier on such an open software approach. Yet imagine trying to explain it to management or to a Congressional committee tasked with making sure veterans are well served. Here's the rock and here's the hard place. Which will it be today?
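For readers who have not met MUMPS, a rough analogy may help - sketched here in Python with invented keys and values, standing in only loosely for MUMPS globals. Data lives in a hierarchical key-value tree, so keyed writes and reads are very fast, while set-oriented, ad hoc questions require walking the tree yourself - exactly the work a relational or column-oriented engine would do for you from a one-line query.

```python
# Rough analogy to a MUMPS global: a nested tree keyed by
# patient -> category -> timestamp (all keys invented for illustration).
lab_global = {
    "MRN001234": {"LAB": {"2009-12-01T08:00": {"test": "K", "value": 5.6},
                          "2009-12-01T09:30": {"test": "NA", "value": 139}}},
    "MRN005678": {"LAB": {"2009-12-01T07:15": {"test": "K", "value": 4.1}}},
}

# Keyed access is a direct walk down the tree: fast and simple.
print(lab_global["MRN001234"]["LAB"]["2009-12-01T08:00"])

# An ad hoc, set-oriented question means traversing every branch by hand,
# roughly what a relational engine does from
# "SELECT ... WHERE test = 'K' AND value > 5.0".
high_potassium = [
    (mrn, ts, result["value"])
    for mrn, categories in lab_global.items()
    for ts, result in categories["LAB"].items()
    if result["test"] == "K" and result["value"] > 5.0
]
print(high_potassium)  # [('MRN001234', '2009-12-01T08:00', 5.6)]
```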

 

Like Linux and Apache, VistA has benefited from the evolutionary approach of open source software development. According to hacktivist - that rhymes with "activist" - Fred Trotter, VistA was developed by programming pairs - not two coders, but one coder and one clinician. The coder automated the clinical expertise articulated by the local hospital provider. Each local VA hospital developed an alternative solution, and the best one was distributed through the system in a process of Darwinian variation and natural selection in which the most robust solution survived and prospered. The whole process is more complicated than can be documented in any blog post, but it is exceptional in not being centralized, top-down, or orderly in the standard waterfall model of software development.

 

As noted, this has presented a challenge as executives and managers - including multi-star generals in the US Armed Forces who are the equivalent of CxOs for the VA - reach out to exercise their fiduciary oversight of the process. It is also a challenge to those who see VistA as a model scalable across the entire civilian healthcare system - or at least a significant subset of it. In short, governance issues now loom large. Efforts are underway by VA management to centralize VistA development, thus marginalizing the innovative energies of the local VA participants. In a parallel and generally positive universe, some key developers have left the VA and formed Medsphere, a private corporation committing investment dollars to evolving and selling VistA to commercial hospitals and private healthcare providers. However, the momentum seems to have gone out of the VistA innovations at the VA, which reportedly has lost its groove.

 

In November 2007 - so this is old news - Cerner was awarded a nine (9) year contract to deploy and use the Cerner Millennium PathNet laboratory information system in some 150 hospitals and 800 clinics in the Veterans Health Administration system. This creates a challenging mixture of the so-called proprietary and open systems models, one that has open source advocates sputtering metaphors about amputating living limbs and replacing them with wooden prostheses. I don't think so. Still, it did get my attention when my own personal physician, a practitioner at a large teaching hospital, spontaneously praised VistA and decried its loss of momentum. He had no knowledge of my interest in the matter - none - or that I was writing an article on it. Of course, this is just anecdotal evidence, but, in any case, VistA has a certain buzz about it. And even if a given software module is not the optimal solution for a given set of requirements, the power of open source to drive down costs while improving software quality has been demonstrated time and again. Granted, software is often initially purchased for the diversity of its features and functions, an area in which standard proprietary solutions often excel. However, software is upgraded, maintained, and renewed for the usability of a few key features that end users cherish and cannot live without. In the latter area, open source is as strong a contender as any other in the field.

 

Regardless of the dynamics, open source offers a compelling value proposition. When the code is wrapped in a methodical 7x24 support process - all richly compensated with an appropriate fee structure - the operational risks are no greater (or less) than with any other code, whether proprietary or open. So what are you supposed to do about it? Call your Congressman? I would not rule it out. Send him the URL to this posting - and let me know what you think.


Posted August 6, 2009 6:30 AM
Permalink | No Comments |