Blog: Lou Agosta

Lou Agosta

Greetings and welcome to my blog focusing on reengineering healthcare using information technology. The commitment is to provide an engaging mixture of brainstorming, blue-sky speculation and business intelligence vision with real-world experiences – including those reported by you, the reader-participant – about what works and what doesn't in using healthcare information technology (HIT) to optimize consumer, provider and payer processes in healthcare. Keeping in mind that sometimes a scalpel, not a hammer, is the tool of choice, the approach is to be a stand for new possibilities in the face of entrenched mediocrity, to do so without tilting at windmills, and to follow the line of least resistance to getting the job done – a healthcare system that works for us all. So let me invite you to HIT me with your best shot at

About the author >

Lou Agosta is an independent industry analyst, specializing in data warehousing, data mining and data quality. A former industry analyst at Giga Information Group, Agosta has published extensively on industry trends in data warehousing, business and information technology. He is currently focusing on the challenge of transforming America’s healthcare system using information technology (HIT). He can be reached at

Editor's Note: More articles, resources, and events are available in Lou's BeyeNETWORK Expert Channel. Be sure to visit today!

Recently in meaningful use Category

Datawatch provides an ingenious solution to information management, integration, and synthesis by working from the outside inwards. Datawatch's Monarch technology reverse engineers the information in the text files that would otherwise be sent to be printed as a hardcopy, using the text file as input to drive further processing, aggregation, calculation, and transformation of data into usable information. The text files, PDFs, spreadsheets, and related printer input become new data sources. With no rekeying of data and no programming, business analysts have a new data source to build bridges between silos of data in previously disparate systems and attain new levels of data integration and cohesion.
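The idea behind report mining can be sketched in a few lines of Python: treat a fixed-width print-spool report as a data source by "trapping" its detail lines and slicing columns out of them, then aggregate the result. This is illustrative only; Monarch's trap-and-extract engine is far more sophisticated, and the report layout below is invented.

```python
import re

# An invented print-spool report, the kind that would otherwise go to paper.
REPORT = """\
DAILY CHARGES BY DEPARTMENT          PAGE 1
DEPT       PATIENT        AMOUNT
RADIOLOGY  SMITH, J       1250.00
RADIOLOGY  JONES, M        475.50
CARDIOLOGY LEE, K         3100.00
"""

def extract_rows(report_text):
    """Return (dept, patient, amount) tuples from the report's detail lines."""
    rows = []
    for line in report_text.splitlines():
        # A crude 'trap': detail lines end in a currency amount.
        if not re.search(r"\d+\.\d\d$", line):
            continue
        dept = line[:11].strip()       # columns 1-11: department
        patient = line[11:26].strip()  # columns 12-26: patient name
        amount = float(line[26:])      # remainder: billed amount
        rows.append((dept, patient, amount))
    return rows

def totals_by_dept(rows):
    """Aggregate the extracted rows into a new summary -- no rekeying."""
    totals = {}
    for dept, _, amount in rows:
        totals[dept] = totals.get(dept, 0.0) + amount
    return totals

rows = extract_rows(REPORT)
print(totals_by_dept(rows))   # {'RADIOLOGY': 1725.5, 'CARDIOLOGY': 3100.0}
```

Once the report text has been turned back into rows, it can be joined with rows extracted from any other system's reports, which is the bridge between silos described above.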



For those enterprises running an ERP system for back-office billing such as SAP, or a hospital information system (HIS) such as Meditech, getting the data out of the system by means of proprietary SAP code or a native MUMPS data store can be a high bar, requiring custom coding. Datawatch intelligently zooms through the existing externalization of the data in the reports, making short work of opening up otherwise proprietary systems.


Note that a trade-off is implied here. If your reporting is a strong point, Datawatch can take an installation to the next level, enabling coordination and collaboration, breaking down barriers between reporting silos that were previously impossible to bridge, and doing so with velocity. No programming is needed, and the level of difficulty is comparable to that of managing an Excel spreadsheet, well within reach of a smart business analyst. However, if the reports are inaccurate or even junk, even Datawatch cannot spin the straw into gold. You will still have to fix the data at its source.


Naturally, cross-functional report mining works well in most verticals, extending from finance to retail, from manufacturing to media, from the public sector to not-for-profit organizations. However, what makes healthcare a particularly inviting target is the relatively late and still ongoing adoption of data warehousing, combined with the immediate need to report on numerous clinical, quality and financial metrics such as the pending "meaningful use" metrics created via the HITECH Act. This is not a tutorial on meaningful use; further details can be found in a related article entitled "Game on! Healthcare IT Proposed Criteria on 'Meaningful Use' Weigh in at 556 Pages." One of the goals of "meaningful use" in HIT is to combine clinical information with financial data in order to drive improvements in quality of care, patient safety and operational efficiency while simultaneously optimizing cost control and reduction. The use of report mining and the integration of disparate sources also allow the healthcare industry to migrate toward a pay-for-performance model, whereby providers are reimbursed based on the quality and efficiency of the care provided. However, financial, quality and clinical metrics and the evolving P4P models all require cross-functional reporting from multiple systems. Even for many modern hospital information systems (HIS) that is a high bar. For those enterprises without an enterprise-wide data warehousing solution, no one is proposing to wait three to five years for a multi-step installation only to learn that the needed data still requires customization. In the interim, Datawatch has a feasible approach worth investigating.


In conversations with Datawatch executives John Kitchen (SVP Marketing) and Tom Callahan (Healthcare Product Manager), I learned that more than 1,000 organizations in the healthcare sector are using Datawatch technology. Datawatch is surely a well-kept secret, at least up until now. This installed base is a substantial resource for best practices, methods and models, and lessons learned in healthcare, and Datawatch can leverage it to its own advantage and to the benefit of its clients. While this is not a recommendation to buy or sell any security (or product), as a publicly traded firm, Datawatch is well positioned to benefit as the healthcare market continues its expansion. Datawatch provides a compelling business case with favorable ROI from the time of installation to the delivery of problem-solving value for the end-user client. The level of IT support required is minimal, and sophisticated client departments have sometimes gone directly to Datawatch to get the job done.


Let's end with a client success story in HIT. Michele Clark, Hospital Revenue Business Analyst at the Los Angeles-based Good Samaritan Hospital, comments on the application of Datawatch's Monarch Pro: "We simply run certain reports from MEDITECH's scheduling module, containing data for surgeries already scheduled, by location, by surgeon. We then bring those reports into Monarch Pro. Then, in conjunction with its powerful calculated fields, Monarch allows us to report on room utilization, block time usage and estimated times for various surgical procedures. The flexibility of Monarch to integrate data from other sources results in a customized, consolidated dataset in Monarch. We can then analyze, filter and summarize the data in a variety of ways to maximize the efficiency of our operating room resources. Thanks to Monarch, we have dramatically improved the utilization of our operating rooms, can more easily match available surgeons with required upcoming procedures, and better manage surgeon time and resources. Our patients are receiving the outstanding standard of care they expect, while we make the most of our surgical resources. This kind of resource efficiency is talked about a lot in the healthcare community. With Monarch, we are achieving it." This makes Datawatch one to watch.

Posted July 1, 2010 9:59 AM
Permalink | 1 Comment |

Datameer takes its name from the sea - the sea of data - as in the French la mer or the German das Meer.


I caught up with Ajay Anand, CEO, and Stefan Groschupf, CTO. Ajay earned his stripes as Director of Cloud Computing and Hadoop at Yahoo. Stefan is a long-time open source consultant and advocate, and a cloud computing architect from EMI Music.


Datameer is aligning with the two trends of Big Data and Open Source. You do not need an industry analyst to tell you that data volumes continue to grow, with unstructured data growing at a rate of almost 62% CAGR and structured data growing less quickly, but at a still substantial 22% (according to IDC). Meanwhile, open source has never looked better as a cost-effective enabler of infrastructure.


The product beta launched in April with McAfee, nurago, a leading financial services company, and a major telecommunications service provider; the summer promises to deliver early adopters, with the gold product shipping in the autumn. (The schedule is subject to change without notice.)


The value proposition of the Datameer Analytics Solution (DAS) is helping users perform advanced analytics and data mining with no more expertise than is required of a reasonably competent user of an Excel spreadsheet.


As is often the case, the back story is the story. The underlying technology is Hadoop, an open source framework for highly distributed data systems. It includes both storage technology and execution capabilities, making it a kind of distributed operating system providing a high level of virtualization. Unlike a relational database, where a search requires chasing up and down a B-tree index, Hadoop performs some of the work upfront, sorting the data and performing streaming data manipulation. This is definitely not efficient for small, gigabyte-scale volumes of data. But when the data gets big - really big, multiple terabytes and petabytes - then the search and data manipulation functions enjoy an order-of-magnitude performance improvement. The search and manipulation are enabled by the MapReduce algorithm, made famous by the Google implementation as well as by Aster Data's implementation of it. Of course, Hadoop is open source. MapReduce takes a user-defined map function and a user-defined reduce function and performs key-value pair exchange, executing a process of grouping, reducing, and aggregation at a low level that you do not want to have to code yourself. Hence the need for, and value in, a tool such as DAS: it generates the assembly-level code required to answer the business and data mining questions that the business wants to ask of the data. In this regard, DAS functions rather like a Cognos or BusinessObjects front end in that it presents a simple interface in comparison to all the work being done "under the hood." Clients who have to deal with a sea of data now have another option for boiling the ocean without getting steamed up over it.
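The map/shuffle/reduce cycle described above can be shown in miniature in plain Python: a user-defined map function emits (key, value) pairs, the framework groups ("shuffles") them by key, and a user-defined reduce function aggregates each group. Hadoop distributes exactly these steps across many machines; here everything runs in one process, and the diagnosis-code log is invented for illustration.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """A toy, single-process MapReduce pass."""
    # Map phase: emit key/value pairs from every input record.
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)      # shuffle: group values by key
    # Reduce phase: aggregate the values collected under each key.
    return {key: reducer(key, values) for key, values in groups.items()}

# Example: count admissions per diagnosis code across log lines.
log = ["E11 diabetes", "I25 cad", "E11 diabetes", "J45 asthma"]
counts = map_reduce(
    log,
    mapper=lambda line: [(line.split()[0], 1)],   # emit (dx_code, 1)
    reducer=lambda key, values: sum(values),      # sum the 1s per code
)
print(counts)   # {'E11': 2, 'I25': 1, 'J45': 1}
```

The point of a tool such as DAS is that the analyst supplies only the spreadsheet-level intent; the framework generates and distributes the equivalent of the mapper and reducer.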

Posted April 15, 2010 9:21 AM
Permalink | No Comments |

There are so many challenges that it is hard to know where to begin. For those providers (hospitals and large physician practices) that have already attained a basic degree of automation, there is an obvious next step - performance improvement. For example, if an enterprise is operating eClinicalWorks (eCW) or a similar provider-facing EHR system, then it makes sense to take the next step and get one's hands on the actual levers and dials that drive revenues and costs.

Hospitals (and physician practices) often do not understand their actual costs, so they struggle to control and reduce the costs of providing care. They are unable to say with assurance which services are the most profitable, so they are unable to concentrate on increasing market share in those services. Oftentimes, when the billing system drives provider performance management, data that is adequate for collecting payments proves totally unsatisfactory for improving the cost-effective delivery of clinical services. If the billing system codes the admitting doctor as responsible for the revenue, but it is the attending physician or some other doctor who performs the surgery, then accurately tracking costs will be a hopeless data mess. The amount of revenue collected by the hospital may indeed be accurate overall; but the medical, clinical side of the house will have no idea how to manage the process or improve the actual delivery of medical procedures.
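The attribution problem above is easy to demonstrate with a toy example: billing credits the admitting doctor, but managing clinical cost-effectiveness requires rolling revenue up by the physician who actually performed the procedure. The claim records and field names below are invented for illustration.

```python
# Invented claims: 'admitting' is what billing records; 'performing' is
# who actually did the procedure.
claims = [
    {"case": 1, "admitting": "Dr. Adams", "performing": "Dr. Patel", "revenue": 9000},
    {"case": 2, "admitting": "Dr. Adams", "performing": "Dr. Chen",  "revenue": 4000},
    {"case": 3, "admitting": "Dr. Baker", "performing": "Dr. Patel", "revenue": 7000},
]

def revenue_by(claims, role):
    """Total revenue grouped by the physician in the given role."""
    totals = {}
    for claim in claims:
        doc = claim[role]
        totals[doc] = totals.get(doc, 0) + claim["revenue"]
    return totals

# Total revenue agrees either way, but the management picture differs:
print(revenue_by(claims, "admitting"))   # {'Dr. Adams': 13000, 'Dr. Baker': 7000}
print(revenue_by(claims, "performing"))  # {'Dr. Patel': 16000, 'Dr. Chen': 4000}
```

The hospital-wide total is identical in both views, which is why the books balance while the clinical side still cannot see who is driving cost and revenue.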


Into this dynamic enters River Logic's Integrated Delivery System (IDS) Planner. The really innovative thing about the offering is that it models the causal relationships between activities, resources, costs, revenues, and profits in the healthcare context. It takes what-if analyses to new levels, using its custom algorithms grounded in the theory of constraints, delivering forecasts and analyses that show how to improve performance (i.e., revenue, as well as other key outcomes such as quality) based on the trade-offs between relevant system constraints. For example, at one hospital the operating room was showing up as a constraint, limiting procedures and related revenues; however, careful examination of the data showed that the operating room was not being utilized between 1 PM and 3 PM. The way to bust through this constraint was to charge less for the facility, thereby incenting physicians to use it at what was, for them, not an optimal time in comparison with golf, late lunches or siesta time. Of course, this over-simplified example is just the tip of the iceberg.
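A back-of-the-envelope sketch of that constraint analysis: compute operating-room utilization by hour and flag under-used windows where discounted pricing might shift demand. The booking numbers are invented; IDS Planner's causal model and constraint algorithms are far richer than this.

```python
# Booked minutes per one-hour block, 7 AM through 4 PM (invented data).
booked = {7: 55, 8: 60, 9: 58, 10: 60, 11: 50, 12: 45,
          13: 10, 14: 15, 15: 52, 16: 48}
CAPACITY = 60  # minutes available per block

def utilization(booked_minutes):
    """Fraction of each hour's capacity actually used."""
    return {hour: mins / CAPACITY for hour, mins in booked_minutes.items()}

def idle_windows(util, threshold=0.5):
    """Hours utilized below the threshold: candidates for incentive pricing."""
    return sorted(h for h, u in util.items() if u < threshold)

util = utilization(booked)
print(idle_windows(util))   # [13, 14], i.e. the 1 PM - 3 PM slump
```

Finding the idle window is the easy part; the value of a constraint-based planner lies in forecasting how a price change at 1 PM propagates through surgeon schedules, procedure mix, and revenue.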


IDS Planner enables physician-centric coordination, where costs, resources, and activities are tracked and assessed in terms of the workflow of the entire, integrated system. This creates a context for physician decision-making and its relationship to costs and revenues. Doctors appreciate the requirement to control costs, consistent with sustaining and improving quality, and they are eager to do so when shown the facts. When properly configured and implemented, IDS Planner delivers the facts. According to River Logic, this enabled the Institute for Musculoskeletal Health and Wellness at the Greenville Hospital System to improve profit by more than $10M a year by identifying operational discrepancies, to increase physician-generated revenue by over $1,700 a month, and to reduce accounts receivable from 62 days down to 44 days (and still falling), which represents the top 1% of the industry. Full disclosure: this success was made possible through a template approach with some upfront services that integrated the software with the upstream EHR system, solved rampant data quality issues, and obtained physician "buy-in" by showing this constituency that the effort was win-win.

The underlying technology for IDS Planner is based on the Microsoft SQL Server (2008) database and SharePoint for web-enabled information delivery.

In my opinion, there is no tool on the market today that does exactly what IDS Planner does in the area of optimizing provider performance. River Logic's IDS Planner has marched ahead of the competition, including successfully getting the word out about its capabilities. The obvious question is: for how long? The evidence is that this is a growth area based on the real and urgent needs of hospitals and large provider practices. There is no market unless there is competition, and an overview of the market turns up offerings such as Mediware's InSight, Dimensional Insight with a suite of the same name, and Vantage Point HIS, once again with a product of the same name. It is easy to predict that sleeping giants such as Cognos (IBM), Business Objects (SAP) and Hyperion (Oracle) are about to reposition the existing performance management capabilities of these products in the direction of healthcare providers. Microsoft is participating, though mostly from a data integration perspective (but that is another story), with its Amalga Life Sciences offering with a ProClarity front end. It is a buyer talking point whether and how these offerings are able to furnish usable software algorithms that implement a robust approach to identifying and busting through performance constraints. In every case, all the usual disclaimers apply. Software is a proven method of improving productivity, but only if properly deployed and integrated into the enterprise so that professionals can work smarter. Finally, given market dynamics in this anemic economic recovery, for those end-user enterprises with budget, it is a buyer's market. Drive a hard bargain. Many sellers are hungry and willing to go the extra mile in terms of extra training, services, or payment terms.

Posted April 5, 2010 11:33 AM
Permalink | No Comments |

Just as it is often true that some generals prepare to fight the last war, so too it is the case that certification is getting ready to validate last generation's client-server technology. The famous example is France's construction, after World War I, of the Maginot Line, a series of trench-like fortifications built in anticipation of trench warfare. Equally famous is German General Heinz Guderian's World War II blitzkrieg, which went around (and over) the wall using tanks and aircraft, rendering the Maginot Line obsolete. Innovations in cloud computing, the provisioning of web-based EHRs (EMRs) and database appliances are the equivalent of an emerging technology blitzkrieg in HIT. While we should not represent different players in the market as enemies, large and established HIT vendors (see my separately published research for an overview) are likely to confront a looming innovator's dilemma.[1] In the 1980s IBM would not look at any sale under $1 million and was nearly driven out of business by upstarts building systems with a copy of SQL Server and a copy of PowerBuilder priced at one tenth the cost. The dominant HIS and PPMS vendors are positioning themselves to be the next candidates for the unwelcome category of "too big to fail," though this time in HIT, not banking. The recommendation? Stop them before they hurt themselves. Stop them by deleting the word 'certified'.


The prospective buyers of EHR technology are physician practices (eligible professionals (EPs)) and hospitals. They are best protected from the unprofessional practices of a small - very small - minority of system sales people through their own professional procurement practices: diligently reviewing written proposals, specifications, and contracts. If a software system installed at a given site fails to satisfy the meaningful use criteria for a provider, the fact that the system or its modules were certified is unlikely to be the basis of a legal claim against the vendor. The provider receives the reimbursement (if any is forthcoming) and is responsible for attaining and demonstrating a use of the system that delivers healthcare services in a more effective way according to the criteria of meaningful use. From such a perspective, certification may be of value to the vendor, but it furnishes a false sense of security to the prospective buyers (such as hospitals and eligible physicians).


On the one hand, some of the meaningful use criteria, including those that capture basic clinical transactions, are best delivered by an underlying system consisting of a standard relational database, a front-end query and reporting tool, and a data model that represents the patient demographics, diagnoses and procedures, and related clinical nomenclature. The user interface - where data is captured electronically, reported electronically, and quality measures are manipulated - is the contract at which meaningful use is delivered. Multimillion-dollar HIS systems are candidates for "overkill" from a transactional perspective in addressing such use, yet at the same time they lack the necessary infrastructure to support quality metrics through aggregation and inquiry via a data warehouse function.


On the other hand, some of the meaningful use criteria - in particular the requirement to report about 100 quality measures - are poorly accommodated by the vast majority of HIS systems on the market today. I do not know of a single exception where the HIS provides for the aggregation and analysis of metrics in a way similar to what business intelligence (now 'clinical intelligence') does in business verticals such as retail, finance, and telecommunications. Once again, the user interface at which the quality measures are surfaced is (in effect) the contract, but it will prove impractical to generate such a result. If these systems propose to perform the aggregations of a year's worth of data based on transactional detail, the predictable result will be a long, slow process. Reinventing the wheel is hard work, but that is the path on which existing HIS and PPMS (EHR) systems have embarked. I hasten to add that the quality metrics are critical path and required. However, the path to their efficient and effective production does not lie through certification.


Those providers that are currently reporting quality metrics from a data warehouse that gets its data from the transactional EHR are concerned that they are out of compliance. It is widely reported that Intermountain Healthcare is one such enterprise.[2] Are they now supposed to certify their data warehouse? Where is the value-added in that?


A long list of clinical quality measures requires reporting a percentage of a total aggregate for a given diagnostic condition; for example, the percentage of patients aged 50 years and older who received an influenza immunization during the flu season (PQRI 110 / NQF 0041). These are what other business operations such as retail or finance call "business intelligence" questions. In the context of healthcare we might call them "clinical intelligence" - or just plain quality measures - but the function is similar: to make possible the tracking of improvements in the delivery of care, to manage enhancements by measuring outcomes in aggregate. The periodic calculation of some 94 percentages requires scanning the entire database and performing an analysis, review, and aggregation of the totality of the diagnostic data. Even though most records will not satisfy a given diagnosis - whether coronary artery disease or diabetes mellitus - they will all have to be examined. The lessons of some two decades of computing in finance, telecommunications, retail, manufacturing, and fast food are clear, but not always obvious to those medical professionals who have spent their time in healthcare IT. When you attempt to perform business intelligence (BI) against the operational, transactional system, something has got to give: either the performance of the transactional system is brought to its knees or the BI process has to wait. In fact, systems designed for high-performance transaction processing are poorly adapted to generate the results required to perform business intelligence. The difference is between update-intensive operations and query-intensive ones, between updating and scanning. That is why the data warehouse was invented: to collect and aggregate the data required for reporting in an optimal format, allowing the transactional system to do its job of supporting the clinical process while clinical intelligence is used to guide process improvements.
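Structurally, such a quality measure is a numerator over a denominator, both computed by scanning the whole patient population. A minimal sketch of the flu-immunization measure described above, with synthetic patient records and invented field names:

```python
# Synthetic patient records; only age and immunization status matter here.
patients = [
    {"age": 67, "flu_shot": True},
    {"age": 54, "flu_shot": False},
    {"age": 49, "flu_shot": True},   # under 50: outside the denominator
    {"age": 72, "flu_shot": True},
]

def measure_rate(patients, in_denominator, in_numerator):
    """Generic quality measure: numerator cases over denominator cases."""
    denom = [p for p in patients if in_denominator(p)]
    numer = [p for p in denom if in_numerator(p)]
    return len(numer) / len(denom) if denom else 0.0

rate = measure_rate(
    patients,
    in_denominator=lambda p: p["age"] >= 50,   # eligible population
    in_numerator=lambda p: p["flu_shot"],      # received immunization
)
print(f"{rate:.0%}")   # 2 of 3 eligible patients -> 67%
```

Note that every record is examined, even the ones that fall outside the denominator; at millions of rows that full scan is exactly the workload a transactional system handles badly and a warehouse handles well.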


The counter-argument - that certification of the functionality around clinical data warehousing ought to be rolled up into the certification process - is the reduction to absurdity of the process itself. There will always be some such functionality: if not data warehousing, then cloud computing, database appliances, artificial intelligence in clinical decision support, and so on. The list is limited only by our imagination and that of our fellow innovators. The Meaningful Use Proposal has got it basically right (though one can, and should, argue about details). The reporting of quality measures based on aggregated clinical detail is a proven method in other disciplines of driving improvements in outcomes by surfacing, managing, and reducing variability from the quality norm ('if you can't measure it, you can't manage it'). However, this value is provided by implementing systems that directly address the superset of meaningful use criteria captured in the proposal, not by driving the process upstream into one particular system architecture that happens to have emerged in the early 1990s and gotten significant traction.

To see the full Comment submitted on the rule-making for reimbursable EHR investments, go here: Certification _Lou Agosta's Feedback to Centers for Medicare and Medicaid Services, HHS.pdf

[1] Clayton Christensen, The Innovator's Dilemma. Boston: HBS Press, 1999. Professor Christensen tells the story of how successful companies - across incredibly diverse industries - are single-mindedly motivated to listen only to their biggest and best customers (in general, not a bad thing to do), and are thereby blind to innovations that get a foothold at a down-market price point and subsequently disrupt the standard market dynamics, leading to the demise of the 'big guys.' Note that IBM, mentioned above, is one of the few companies to have succeeded in turning itself around, admittedly in a painful and costly process led by the now legendary Lou Gerstner, and to have brought itself back from the brink. Watch for this dynamic to continue in HIT, albeit at a slower pace due to the friction caused by large government participation, and with results such as IBM's being the rare exception.

[2] For example -

Posted February 18, 2010 7:07 AM
Permalink | No Comments |

Healthcare providers are in the business of fighting disease, not installing and maintaining software. Still, software is now a part of every business process; and it is on the horizon to become a part of every process of diagnosing, managing, and treating disease that occurs in a modern first-world setting. The healthcare provider (whether eligible professional (EP) or hospital) has to demonstrate meaningful use. That burden cannot be shifted onto a certifying organization. Nor would such a certifying entity be willing to warrant, merely on the basis of its test scripts, that its certified EHR system is used in a meaningful way by the provider. Whose job is it to demonstrate that meaningful use is occurring? It is the provider's job.


The proverbial buck stops with the EP and the hospital. The ONC certification document cautions: 'Eligible professionals and eligible hospitals that elect to adopt and implement certified EHR Modules should take care to ensure that the certified EHR Modules they select are interoperable and can properly perform in their expected operational environment.' All the usual disclaimers apply - the certified product is not warranted for any particular purpose and your mileage may vary. In other words, the adoption of a 'certified' EHR (EMR) does not create a safe harbor for a provider who purchases, installs, and uses one. Nor is the recommendation of this post that such a safe harbor be implemented, given the requirement for provider accountability and the need to improve quality and efficiency. The provider is still responsible for outcomes, demonstrating quality metrics, and showing that it makes meaningful use of whatever technology is acquired. Presumably the 'certification' is supposed to provide the provider with guidance (in the market) and 'a leg up' on getting over the bar to reimbursement and quality improvement. However, the result is to privilege a certain class of large, existing, dominant players in the market and to complicate the EHR technology acquisition process.


In general, the criteria of meaningful use represent a contract - an agreement - to which the underlying system (whether certified or not) has to conform in terms of producing usable, interoperable output. The provider will work backwards from the interface to ensure that the underlying functionality conforms to its attestation - the provider's sworn testimony to the reimbursement office - that the system functions as required. By applying the principles of object-oriented system design and implementation to the challenge represented by meaningful use, the diligent and conscientious provider eliminates the extra step of certification.


The situation is analogous to the so-called ERP (enterprise resource planning) revolution in computing. In the run-up to the Y2K event in the year 2000, SAP, Oracle, JD Edwards, PeopleSoft, and many other ERP vendors insisted their architectures were robust and flexible enough to support both transaction processing and BI from the same system architecture and data stores. Fast forward a few years, and by 2004 those of these companies that still survived were announcing and delivering revised architectures to support business intelligence. We are on track for an entirely analogous repetition of architectural trial and error. Companies such as Cerner, Epic, Eclipsys, McKesson, GE Centricity, MedSphere, Meditech, and so on predictably claim that their transactional engines, based on data stores such as MUMPS and Caché, can handle the job. But the job is complex and inherently at odds with itself. It is a tad like building a single car that gets good gas mileage and also has a lot of horsepower. In the software world, the solution is to feed the transactional data into a data warehouse or other business intelligence subsystem and let the warehouse carry the reporting load so the transactional system can do its job. (The automotive problem has still not been solved, nor will it be in this post.)
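The split described in that last sentence can be shown in miniature: keep the transactional store normalized for updates, and periodically roll it up into a summary ("warehouse") table that reporting queries can scan cheaply. The toy schema and rows below are invented; production ETL is of course vastly more involved.

```python
import sqlite3

db = sqlite3.connect(":memory:")

# The 'transactional' table: one row per encounter, optimized for inserts.
db.execute("CREATE TABLE encounters (patient_id INT, dx TEXT, charge REAL)")
db.executemany(
    "INSERT INTO encounters VALUES (?, ?, ?)",
    [(1, "E11", 200.0), (2, "I25", 900.0), (3, "E11", 150.0)],
)

# Nightly ETL: aggregate once into a summary table, so reporting never
# scans (and never slows down) the live transactional table.
db.execute(
    """CREATE TABLE dx_summary AS
       SELECT dx, COUNT(*) AS cases, SUM(charge) AS total
       FROM encounters GROUP BY dx"""
)

rows = db.execute("SELECT dx, cases, total FROM dx_summary ORDER BY dx").fetchall()
print(rows)   # [('E11', 2, 350.0), ('I25', 1, 900.0)]
```

The design choice is the whole point: updates hit `encounters`, scans hit `dx_summary`, and neither workload is brought to its knees by the other.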


However, what about the innocent one-to-five-physician practices that deliver much of the healthcare in the country? Who is going to prevent these practices, smart but totally naïve from an information technology perspective, from buying a 'pig in a poke'?


First, the need for an assistance program to support such practices with the IT process is well taken. The Regional Extension Centers, where such practices can go to get a leg up on the process, are a step in the right direction - when they actually come into existence. There is also merit in something like a 'Good Housekeeping seal of approval' attesting that interoperability standards are actually implemented by the software in question. The availability of 'Geek Squad class' professional services at an entry-level price point, using graduates of community college-based HIT programs, also belongs on the list. However, none of these mandates certification of the EHR software, the latter adding an unprecedented layer of overhead to an already complex process.


If a small physician practice is savvy enough to know its requirements and buy out of the box or configure a custom solution, it makes no sense to require such a practice to limit its choices to those approved by a certification authority on penalty of being excluded from the reimbursement process. For those that are the opposite of savvy - and require additional support - the certification process will be as misleading as it is incomplete. Such practices are still required to demonstrate meaningful use, whether or not the software is certified to provide certain functionality that allegedly lines up with the criteria of meaningful use.


There is some controversy about whether small practices will be diligent in interviewing several competing vendors and undertaking at least a mini-RFP process prior to paying some $44K for software. Given that such physicians are alone responsible for demonstrating meaningful use, regardless of certification status, the recommendation is to err on the side of caution and be a diligent buyer. Such a buyer will require the vendor to explain in detail, step by step, how its software will address the requirements for reimbursement. That is a de facto, virtual RFP. The push-back from small physician practices is looming. How can such small players be expected to acquire a system on their own? Better to partner with a regional hospital that operates an EHR? Of course, the rules and the potential loss of independence in doing so are [sometimes] substantial. So here is the rock, and here is the hard place. Is there value in buying from a list of 'certified' EHR providers? Yes, but... The 'but' is that such a list will not in itself demonstrate "meaningful use." The physician and her or his practice are alone responsible for that.

Update: To see the full Comment submitted on the rule-making for reimbursable EHR investments, go here: Certification _Lou Agosta's Feedback to Centers for Medicare and Medicaid Services, HHS.pdf

Posted February 17, 2010 7:58 AM
Permalink | No Comments |

