|The Health Record Review
by Jeff Rowe, Editor
Posted on Wed, Aug 15, 2012 - 09:28 am
Electronic health records improve patient care, except when they don’t.
That doesn’t seem an unreasonable takeaway from a recent study conducted by Dartmouth College researchers. To be fair, though, more than one takeaway may be in order.
According to reports, the study, “Meaningful Use of Electronic Health Record Systems and Process Quality of Care: Evidence from a Panel Data Analysis of U.S. Acute-Care Hospitals,” found that “hospitals with primitive or limited IT that upgraded to an EHR system satisfying stage 1 meaningful use objectives saw a significant improvement in healthcare delivery.”
As Eric Johnson, director of the Center for Digital Strategies at Dartmouth’s Tuck School of Business, put it, “Many of the hospitals that showed improvements were the ones struggling with quality to begin with.”
The study looked specifically at the quality of processes for treating heart attack, heart failure, pneumonia and surgical care infection prevention, and the research team was nothing if not thorough. They “analyzed an extensive dataset of 3,921 hospitals over a five-year time period from 2006 to 2010, . . . conducted 16,650 hospital observations and analyzed data on EHR systems, which came from 2005 to 2009 HIMSS Analytics Databases.”
But here’s the rub: While hospitals relatively new to EHRs showed improvement, the study also found “that advanced hospitals with superior healthcare delivery — hospitals transitioning from Level 3 to Level 4 EHR systems — actually saw their quality compliance go down.”
Johnson seems inclined to attribute that drop to “a backlash from healthcare professionals in using complicated software systems,” but other reasons may apply as well. For example, health IT professionals have suggested to us that once the newness of the IT transition wears off, it becomes harder to keep people focused on continual improvement.
But another possibility may be related to a comment one of our readers made just yesterday. Responding to our post about a doctor who has laid out his own EHR wish list, the reader suggested that there’s a difference between “micro” and “macro” applications of health IT.
“Average practicing doctors,” the reader wrote, “are currently mostly interested in the former--helping one patient at a time. However, the big gains from EHRs/HIEs and the constellation of associated acronyms will come when we can start to use the data these systems generate to do real population level health management. That's the ‘macro’ application--also often referred to as ‘big data.’”
So here’s the question: Could it be that the drop in quality compliance at advanced hospitals reflects IT pros shifting their attention to the “big picture” work that comes with population-level health management?
Photo courtesy of anemoneprojectors via Creative Commons