InContext Magazine

your source for solving the unstructured information challenge


Perceptions

Sandra Lillie

The Path Forward for Enterprise Imaging

Healthcare

During HIMSS 17, the HIMSS/SIIM Enterprise Imaging Workgroup held a meet-up featuring brief updates from a number of its working groups. These groups focus on important areas for unlocking imaging across our healthcare enterprises, including:

  • Articulating the value proposition of Enterprise Imaging to stakeholders across the organization;
  • Addressing interoperability and the ONC; and
  • Developing a Maturity Model for Enterprise Imaging.

Let's discuss this last one for a bit. Implementing a comprehensive Enterprise Imaging strategy can appear to be a daunting task given how broadly images are distributed across healthcare organizations. We heard recently from a provider that conducted an assessment to locate all of its medical images (including photos and video) and discovered them in 80 separate areas of the organization, on all types of media. Again, feels daunting, doesn't it?

It may be harder to recall now, but the same was true when we first encountered the HIMSS Stage 7 EMR Adoption Model. Initially, only the larger institutions and academic medical centers seemed to achieve Stage 7 certification. Today, after only a short period of time, healthcare organizations of varying sizes and with differing EMR vendors are achieving this certification. In fact, a vast majority of organizations have achieved Stage 5 certification or higher. What's the secret? And what can we learn from it?

First, the HIMSS stages began with the lowest common denominator as a baseline. From there, the work focused on articulating practical progressions that aligned with increasing clinical value. And finally, the stages were adaptive enough to accommodate technology advances. For example, data analytics in Stage 7 is sufficiently broad to encompass things like population health, analytics in support of genomics and the like.

The working group on Enterprise Imaging's Maturity Model appears to follow a similar mindset. It must articulate a common baseline with a progression that allows each organization to plan its roadmap in practical stages, tying achievable milestones to economic benefits, improved patient outcomes and quality. Furthermore, it must be adaptive to the ever-increasing innovation occurring in medical imaging, such as digital pathology, genomics, patient-generated images, machine learning and augmented intelligence.

Together, and with a framework like the Maturity Model, healthcare will quickly advance through the stages of creating a comprehensive, patient-centered Enterprise Imaging foundation.

It is important work and I encourage everyone invested in medical imaging to get involved and monitor its progress.

Stephanie Eaton

Reinventing the Wheel, the Right Way

Education

Just a few years ago, Bloomsburg University, one of 14 universities in the Pennsylvania State System of Higher Education, was busy reinventing the wheel every time it received a new transfer student transcript. The admissions office was committed to using the course articulation database in PeopleSoft Campus Solutions to create course transfer equivalency rules, but the process required an admissions counselor to manually type in every course from every transcript...even if they were the same courses from the same institutions as the last 100 transcripts processed.

With 18 percent of the University's 9,650 students (roughly 1,700 people) coming in as transfer students, that's a lot of wheel-reinvention.

The department also employed two processors who scanned transcripts and updated student pages in PeopleSoft. The entire process was incredibly manual, and the turnaround time for getting a transfer credit evaluation report back to the student applicant was a lengthy four to six weeks.

Opportunity: Leveraging automation to shorten transcript evaluation time

Transfer students are some schools’ best-kept secret. They’ve already applied to and succeeded at another institution, so the retention rate is often higher than in the freshman class, as several studies show. Making transfer students a high priority in the admissions and records offices is an initiative with high ROI and ongoing benefits both for individual students and for your school’s retention and graduation rates.

That’s exactly the initiative Bloomsburg University undertook when deciding to automate their laboriously manual transcript process. Their goal was to continue to leverage their PeopleSoft Campus Solutions investment with course articulation rules, continue to receive both paper and electronic transcripts, automatically image transcripts into the system and capture transcript data without templates.

“Without templates” deserves an explanation. Since Bloomsburg University accepts transcripts from 1,300 feeder schools, they were looking at building around 2,000 transcript templates (many schools have different online and paper versions) and employing someone to continue to maintain the templates.

Instead, Bloomsburg University opted to deploy a template-free solution from Lexmark Enterprise Software: Intelligent Capture for Transcripts (ICT). With template-free data extraction from paper or electronic college and military transcripts, ICT could be deployed quickly (in just four months in this case) without the ongoing expense of template maintenance.

Transcript process transformation

With ICT, the workflow in admissions and the student experience have been transformed. Transcripts are received electronically or via paper, put through ICT's Classify-Extract-Verify process, and are ready for the admissions counselor to quickly review, with course equivalency data from PeopleSoft already updated.
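To make the Classify-Extract-Verify idea more concrete, here is a minimal sketch of how such a pipeline might be orchestrated. It is purely illustrative: the function names, data model and equivalency lookup are assumptions for this example, not the actual ICT or PeopleSoft Campus Solutions interfaces.

```python
# Illustrative sketch of a Classify-Extract-Verify transcript pipeline.
# All names here are hypothetical; they do not reflect real ICT or
# PeopleSoft Campus Solutions APIs.
from dataclasses import dataclass
from typing import List


@dataclass
class CourseRecord:
    institution: str
    course_code: str
    title: str
    credits: float
    grade: str


def classify(document: bytes) -> str:
    """Decide what kind of document arrived (college transcript, military transcript, other)."""
    return "college_transcript"  # stand-in for a trained document classifier


def extract(document: bytes) -> List[CourseRecord]:
    """Pull course rows from the transcript without a per-school template."""
    # A template-free engine locates course data by structure and content
    # rather than fixed field positions; this stub returns a sample row.
    return [CourseRecord("Feeder College", "MATH 101", "Calculus I", 3.0, "A")]


def verify(courses: List[CourseRecord]) -> List[CourseRecord]:
    """Keep high-confidence rows; in practice, low-confidence fields go to a human reviewer."""
    return [c for c in courses if c.credits > 0]


def update_student_record(student_id: str, courses: List[CourseRecord]) -> None:
    """Stand-in for applying articulation rules and updating the student record."""
    for c in courses:
        print(f"{student_id}: {c.institution} {c.course_code} -> equivalency lookup")


def process_transcript(student_id: str, document: bytes) -> None:
    if classify(document) in {"college_transcript", "military_transcript"}:
        update_student_record(student_id, verify(extract(document)))


process_transcript("B00123456", b"...scanned transcript bytes...")
```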

The new transcript turnaround time at Bloomsburg University is advertised as just two weeks, and decisions can often be made in a few days if the application is complete. With the ability to make better decisions faster, admissions counselors can spend more time recruiting and on other high-value projects rather than the mind-numbing data entry of the old days.

3 takeaways from the Bloomsburg transformation team

Thinking of deploying your own “transcripts transformation”? Bloomsburg University has three great tips for success:

  1. Start to move from paper transcripts to electronic transcripts. Electronic transcripts provide cleaner data; Bloomsburg University currently has about a 50/50 mix and is working with its feeder schools to move to a higher percentage of electronic.

  2. Evaluate current processes and adopt best practices. Bloomsburg University didn’t have to evaluate their current process—it was obviously painful and time-consuming. But they did have an opportunity to look to the future and choose a solution that best fit their vision of a streamlined back office that supported excellent transfer student service.

  3. Support from upper administration is key. Data entry is a pain point for the person performing it, but not necessarily for management—unless that data entry is creating unnecessary expenses or roadblocks to meeting high-level goals in recruitment, enrollment and retention. Bloomsburg University's transcript automation initiative originated with the admissions director, so the team was fortunate to have full support from management.


Learn more about Kofax Perceptive Intelligent Capture for Transcripts and Intelligent Capture for Transcripts Plus, the industry’s first high school coursework extraction engine.

Ken Congdon

Who's Afraid Of The Big Bad Cloud?

Healthcare

The healthcare industry has long kept cloud technology at arm’s length. The cloud simply was not to be trusted. The idea of relinquishing control of valuable patient data from internal server farms to unfamiliar hosted environments was enough to cause palpitations in the hearts of many healthcare CIOs. What if our data is lost or exposed? Will my clinical teams and applications be able to access our data when they need it? How can I be sure a cloud application is HIPAA compliant? These are the types of questions that made moving to the cloud seem way too risky. However, if HIMSS 17 is any indication, this mindset has begun to change in a big way.

Several sessions at HIMSS 17 illustrated that the cloud is not only gaining acceptance in healthcare, it is largely being viewed as a game-changing option vital to reducing costs and improving clinical outcomes. For example, during a session titled Still Afraid Of The Cloud? A CIO’s Journey, Deanna L. Wise, Executive Vice President and CIO at Dignity Health, shared her story of how she evolved from a cloud skeptic to an advocate. During her presentation, she referenced data from an eye-opening HIMSS 2016 Cloud Survey that showed that 84% of provider organizations are currently using the cloud in one way or another. Not surprisingly, the most common cloud use was to host analytics, finance, operations, HR and other data not crucial to clinical care (78%). However, the survey also showed that primary data storage (68%) and clinical application hosting (54%) were gaining significant traction from a cloud perspective. 

A slide during Wise’s HIMSS 17 presentation showing the most common uses of the cloud according to a 2016 HIMSS survey.

Another interesting data point from the HIMSS 2016 Cloud Survey was that "Increasing Performance and Reliability" was the most common reason healthcare providers gave for adopting cloud technologies. This bucks the historical trend of "Lowering TCO" or "Ease of Management" being the primary reason for moving to the cloud.

Wise admitted that gaining a comfort level with the cloud didn't happen overnight and that it may not be right for every provider or every application. She stressed that it was a journey. In her case, she gained confidence in the cloud by starting small to build trust in the platform and the provider and to ensure the technology performed as expected. She then gradually moved more applications to the cloud, and stressed that she envisions ultimately relying on more than one cloud provider.

The HIMSS 17 Cloud Computing Forum also provided strong evidence of the shift to cloud platforms in healthcare. During this assembly, John Houston, Vice President of Information Security and Privacy at University of Pittsburgh Medical Center said that while 90 percent of the health system’s software and 75 percent of its applications are legacy on-premise, only 20 percent of its current contract negotiations are for on-premise solutions. The remaining 80 percent are for cloud-based platforms.

Other providers at the Cloud Computing Forum showed how the switch to the cloud has been truly transformational for their organizations and patients. For example, Richard Stroup, Director of Informatics at Children's Mercy Hospital in Kansas City, shared a story of how the cloud is helping the provider save the lives of at-risk infants. Using a cloud-based tablet app, the provider monitors infant patients post-surgery after they leave the hospital. In one test case, Children's Mercy used the app to monitor 68 patients who had completed second-stage surgery. Thanks to real-time monitoring, there was no interstage mortality. Stroup said that historically they would have expected six to 12 mortalities in that same timeframe without cloud-based monitoring.

The move to the cloud will only continue to increase in healthcare – not only because of the performance, TCO and maintenance benefits cloud technologies offer, but largely out of necessity. With mass EHR adoption and other digitization efforts, the amount of healthcare data is growing exponentially. The rise of genomics, 3D imaging and other data-intensive healthcare applications will push storage and processing requirements to new limits. In fact, IDC states that by 2020, the volume of healthcare data will surpass 2,300 exabytes. Managing this type of data load onsite at healthcare facilities is going to be impractical for most, and impossible for some. Cloud platforms will need to be leveraged to store this critical information and keep patient-centered care initiatives on track.

Ken Congdon

You Can’t Be Patient-Centered Without Complete Patient Data

Healthcare

I, like most of the health IT world, am attending the HIMSS Conference and Exhibition this week in Orlando. Thus far, I’ve been encouraged to see an increased emphasis and focus being placed on the patient at the event this year. Precision medicine and patient-centered care have been central themes in educational sessions as well as on the exhibit floor.

For example, in a session titled Making Health IT Patient-Centered, Tina Esposito, Vice President of the Center for Health Information Services at Advocate Health Care, outlined the four-step process the provider is taking to make its health IT systems more patient-centered. What does this mean? Well, for Esposito, patient-centered health IT supports the "Accountable Care Mind Shift" that is occurring not only at Advocate, but in health systems across the country and around the world. This mind shift represents a transition from siloed, episodic care management to population health and value-driven coordinated care.

A slide from Esposito’s HIMSS presentation outlining the Accountable Care Mind Shift.

Advocate’s four-step process is geared toward creating a technology infrastructure that enables specific and comprehensive patient information to be collected and analyzed, so that tailored treatment plans can be administered to each individual. At the highest level, the four steps are as follows:

Step 1: Connect disparate data sources

Step 2: Link sources to a patient (creating a Master Patient Index)

Step 3: Leverage data analytically

Step 4: Engage patients

Most of Esposito’s presentation focused on how Advocate is using analytics to identify and prevent potential illness complications (e.g. referencing key data points to predict when an asthma patient may experience an acute attack and intervening early to prevent a hospital visit). This was all fascinating, but I kept coming back to Step 1 in Advocate’s plan – connecting disparate data sources.

Delivering true patient-centered care requires comprehensive access to all patient data and content. Without it, any patient-centered initiative will fall short. You can’t provide patient-centered care if your information about that patient is incomplete or you don’t have a holistic view of the patient’s medical history. This is important to understand because, quite frankly, most hospitals and health systems don’t have this type of visibility.

So much of the past decade has been focused on EHR adoption and use, positioning this clinical system as the single source for patient information. The problem is, the vast majority of patient data is unstructured (80% according to Gartner) and lives in multiple systems outside of, and unconnected to, the EHR. This type of content includes assets like DICOM medical images (e.g. X-rays, CT scans, MRIs, etc.), non-DICOM medical images (e.g. dermatology and wound photos, endoscopy images/video, surgery video, etc.), clinical documents and more. Plus, new types of patient data are now being generated outside of the provider setting that can have a significant impact on precision medicine and patient health (e.g. genomic data, patient monitoring data, etc.).

Patient-centered care will require a health IT infrastructure capable of ingesting and connecting all of these data sources to the patient through the core clinical systems physicians use every day. It’s time we started to think about the management of unstructured content in the same way we think about the management of EHR-based discrete patient data – at the enterprise level.

A new white paper provides a roadmap for the enterprise-wide management of unstructured patient data and labels the practice Healthcare Content Management (HCM). The concept essentially merges Enterprise Imaging and Enterprise Content Management (ECM) technologies into a single, fully integrated infrastructure that makes this content accessible through core clinical systems such as EHRs as well as other enterprise systems. This type of approach will only prove more important as providers move to value-based, patient-centered care.

At HIMSS 17 it’s clear that some providers are beginning to utilize analytics and collaboration to develop treatment plans based on the uniqueness of the individual. However, enabling this type of care all starts by collecting and connecting all types of content both inside and outside the hospital setting, giving physicians access to a complete and real-time view of the patient. When this is done successfully, it can lead to real change and ultimately better outcomes at lower costs.

Ken Congdon

Enterprise Imaging: Supporting today’s health IT demands

Healthcare

Healthcare is on a mission to improve patient outcomes and better manage costs. The shift from volume to value, with its reliance on greater care coordination and collaboration, has given rise to new industry IT demands for centralization, standardization, interoperability and data integrity.

Enterprise Imaging makes it easier for healthcare organizations to meet these demands by eliminating the inefficiency, complexity and roadblocks that prevent easy access to the medical image content needed to drive more informed care decisions and improve outcomes.

Centralized storage, standardized management

Healthcare has historically been a decentralized industry, with departments operating independently of each other and with their own IT systems, applications and formats. As a result, vital medical images often get trapped in departmental silos that are disconnected from core clinical systems and don’t make their way into electronic health records (EHRs). When this occurs, these images aren’t available for physicians at the point of care where crucial treatment decisions are made.

A key piece of technology in an Enterprise Imaging strategy is the vendor neutral archive (VNA). The VNA provides centralized storage and standardized management of all medical content and images, regardless of their origin, native format (DICOM, XDS, JPG, MPG, etc.) or vendor orientation and makes this information available across multiple systems, departments and enterprises. As a result, the VNA provides an essential foundation for delivering a comprehensive image-enabled view of the patient that’s centralized, easily accessible and better supports care decisions.

Sharing imaging data

A recent survey by CHIME found that 54 percent of organizations cannot exchange medical images with recipients outside their four walls. With the emergence of initiatives like Meaningful Use, the Affordable Care Act (ACA), value-based reimbursement and Accountable Care Organizations (ACOs), healthcare organizations must put a premium on interoperability.

When medical images are locked in isolated archives, interoperability, information sharing and image-enabling the patient record is a challenge. Implementing an enterprise viewer that allows viewing of any medical image, imaging report and related patient data anytime and anywhere is a vital step in the Enterprise Imaging journey. With an enterprise viewer, digital image access is no longer confined to the department that created the data. This platform empowers physicians to view any image along with patient content in any format across the enterprise. Such a viewing solution may replace or coexist with a traditional PACS viewer and may be integrated with a VNA or EHR.

Data Integrity

The volume of healthcare data is skyrocketing and expected to reach 15 zettabytes by 2020, which is equivalent to four times the amount of information contained on the Internet. Managing all this data is challenging because nearly 80 percent of it is unstructured data that resides outside of the EHR. Moreover, nearly 75 percent of all medical images today are non-DICOM (e.g. mobile phone images, video clips and images created by isolated modality stations) and therefore aren't well managed by a traditional PACS. Organizations need an image capture and acquisition solution that works with all departmental systems and with both DICOM and non-DICOM interfaces.

A sound Enterprise Imaging solution supports data integrity by performing synchronized updating of metadata (patient/study-level changes) through a journalized approach within the actual image data. It should also allow for automated disaster recovery should the database become unavailable, since the database and image content stay in sync. The journal can be used as a historical audit of when, what and how metadata content was changed. Make sure the VNA also has a duplicate-handling process, including the ability to manage duplicates based on your needs, using one of the following techniques: keep all, keep first, keep last, keep by DICOM tag, or keep by CRC and pixel validation.
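To illustrate what those duplicate-handling options can mean in practice, here is a small, hypothetical sketch. The policy names come from the list above; the data model and the tag- and CRC-based rules are illustrative assumptions, not any particular VNA's behavior.

```python
# Hypothetical illustration of VNA duplicate-handling policies.
# The policy names mirror the article; the logic is a simplified assumption.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ImageInstance:
    sop_instance_uid: str    # DICOM identifier shared by duplicate copies
    series_description: str  # example DICOM tag a tag-based policy might inspect
    pixel_crc: int           # checksum over the pixel data


def resolve_duplicate(existing: ImageInstance,
                      incoming: ImageInstance,
                      policy: str) -> Optional[ImageInstance]:
    """Return the copy to keep; None means keep both (the 'keep all' case)."""
    if policy == "keep_all":
        return None
    if policy == "keep_first":
        return existing
    if policy == "keep_last":
        return incoming
    if policy == "keep_by_dicom_tag":
        # Example rule: prefer the copy whose chosen tag is populated.
        return incoming if incoming.series_description else existing
    if policy == "keep_by_crc_and_pixel_validation":
        # Identical pixel data means either copy is safe to keep;
        # otherwise favor the newly received instance.
        return existing if existing.pixel_crc == incoming.pixel_crc else incoming
    raise ValueError(f"Unknown duplicate-handling policy: {policy}")


kept = resolve_duplicate(
    ImageInstance("1.2.840.113619.2.55.1", "CHEST PA", 0xA1B2C3D4),
    ImageInstance("1.2.840.113619.2.55.1", "", 0xA1B2C3D4),
    policy="keep_by_crc_and_pixel_validation",
)
```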

To be successful in today’s patient-centered environment, hospitals and health systems need to find new ways to improve care quality, enhance the patient experience and lower costs. An Enterprise Imaging strategy provides an excellent opportunity for organizations to work towards those goals. By adopting Enterprise Imaging, hospitals and health systems can ensure medical images get into the hands of physicians and other clinical stakeholders for more informed decision making. As a result, organizations can make significant strides towards improving care quality, enhancing the patient experience and reducing costs. 

To learn more about how Enterprise Imaging can benefit patient care and financial operations, download the eBook – Enterprise Imaging: See What You’ve Been Missing.

Sandra Lillie

Building a business case for Enterprise Imaging

Healthcare

You understand that achieving true patient-centered care and improving outcomes requires that all clinical stakeholders have timely access to more imaging content across care settings. For organizations like yours to thrive, you need visibility into the breadth of medical images contained in picture archiving and communication systems (PACS) and in other specialty legacy archives that store non-DICOM medical images and video. You've decided that Enterprise Imaging is the way to go, but how do you achieve organizational buy-in and, more importantly, sell it to your board as well?

Here are three key proof points you'll need to present in order to demonstrate the benefits and secure the budget necessary for the initiative:

1. Improves clinical outcomes

With shrinking margins and reimbursement dollars on the line, leaders need to know that your idea will improve the hospital’s ability to care for patients. You’ll want to explain how Enterprise Imaging enables more informed clinical decision-making by allowing healthcare organizations to connect, manage and view medical images both at the clinical point of care and within departments such as radiology, cardiology, ultrasound, surgery and more. Consolidating imaging information throughout the enterprise into a single, standards-based repository can significantly improve patient outcomes by providing a holistic view of patient records, helping reduce potential readmissions and discharge delays, and streamlining workflows for better point-of-care interactions. In addition, online collaboration, including the ability to analyze and share measurements and notes, helps radiologists and other specialists communicate and work more effectively with referring physicians.

2. Enhances security and compliance

With new reports indicating that cyberattacks are on the rise and hackers are increasingly targeting healthcare, privacy and cybersecurity are issues that need to be addressed at the board level. In 2015, one in three Americans was the victim of a PHI breach. The Office for Civil Rights (OCR) reported a combined loss of over 112 million records. Moreover, among the top 10 breaches, the vast majority – nearly 90 percent – were due to hacking. Breaches not only damage an organization's reputation, but with the average cost per lost or stolen record placed at $363, they can also have significant financial impacts.

You will want to explain to your board that an Enterprise Imaging solution that is truly vendor neutral and standards-based, adhering to DICOM and HL7 as well as IHE profiles such as XDS, helps ensure data security and HIPAA compliance by getting images off hard drives, disks and USB drives. Further, it establishes centralized control of imaging data, making it easier for IT to manage and secure information, apply security protocols to all images and protect PHI from unwanted exposure.

3. Optimizes new business arrangements

As the hospital consolidation market continues to grow, and boards of directors evaluate new opportunities, they’ll want to know how imaging technology can help them successfully integrate and capitalize on their investments.  

With an Enterprise Imaging strategy in place, organizations have better insight into how to more effectively monitor utilization of services over time and plan service line expansion or retraction. The board will want to know that such an approach supports and optimizes new business arrangements by:

  • Making valuable clinical imaging content available to providers at the right time and in the right way to deliver positive outcomes for patients
  • Consolidating and economizing storage so new hospitals and partners can easily integrate into existing networks and gain access to systems
  • Bringing vital and comprehensive patient information to the care team by aggregating and integrating studies directly into the patient’s record in the EHR
  • Supporting the divestiture of a facility from the hospital organization through true ownership of the images

Armed with a solid business case, your board will better understand and justify how an investment in Enterprise Imaging can help the organization grow and thrive, enabling it to meet the immediate demands of the enterprise as well as new value-based care models and future business partnerships.

Grant Johnson

The ideal leader profile isn’t what it used to be. Company leaders can no longer sit alone in their offices, far removed from the staff who do the work to keep business humming. Now, leadership at all levels is expected to be more engaged, involved and approachable. Walking the walk is now even more important than just talking the talk. These hands-on traits are especially important as you’re faced with leading a team or an entire organization through digital transformation.

A Harvard Business Review report, Driving Digital Transformation: New Skills for Leaders, New Role for the CIO, found businesses that qualify as digital leaders are more likely than those trailing them to have:

  • Revenue growth over 10%
  • Profit margins that are greater than the industry average
  • A CEO who understands digital opportunities and threats
  • A CIO who is a digital master or digital coach
  • A clearly defined digital vision and strategy
  • Digitally proficient leaders at multiple levels

But what sorts of traits do these aforementioned "digitally proficient leaders" have? It's one thing to qualify a CEO as "understanding digital opportunities" or a CIO as a "digital master," but how do those traits translate into a successful digital strategy, and how do other leaders across the organization stand out and succeed in a quickly changing world?

If you're looking for ways to get in front of the digital revolution and make yourself irreplaceable as your company digitizes, congratulations – you're already positioning yourself well to be a digital leader. After all, waiting around for digital transformation to happen to your organization is one of the surest ways to be left behind.

Beyond recognizing that your company needs to start preparing for digital transformation now, here are five other traits you can begin cultivating to make yourself an irreplaceable leader during the process:

1. Practice agility and flexibility, and learn to recognize them in others. If you're used to relying on your job description and a static set of skills, you need to shift your mindset. Sure, experience as a CMO and apparent wizardry at Excel pivot tables are important, but with technology, analytics and datasets changing so quickly, it's even more important to be agile. Become someone who jumps at the chance to learn a new skill or try a new way of doing things. When you're hiring, look for these traits in potential employees, as well. In the age of digital transformation, quick, enthusiastic, adaptable learners are priceless.

2. Don't be afraid to fail, and encourage risk-taking in your team. One major benefit of digitization is that it lends itself well to rapid iteration and trying new things. For example, if you begin building your mobile app one way and realize right after launch that your customers would prefer different features, you've lost little time and gained valuable insight into what works.

3. Listen at scale and provide feedback. It's impossible to please everyone all the time – a lesson that applies to both your team and your customer base. But listen to the comments, concerns and ideas of both groups – if you begin to hear any commonalities in their opinions, take them seriously. Consider what it would mean to implement practices that your employees and customers are both asking for. Then, let them know they've been heard. Even if the change they're asking for isn't possible, it's important to let them know their leaders are listening.

4. Break down departmental barriers. When you hire agile, flexible, innovative thinkers, they don't want to work in silos. They want to bounce their ideas off coworkers who can help make them reality. These notions aren't folly, and they won't discourage productivity. Instead, allowing your employees to test out new ideas can help keep them engaged at work and lead to some brilliant outcomes. Encouraging collaboration and invention among your people is a hallmark of an invaluable digital leader.

5. Focus on customer experience, and learn to see the end rather than the means. It's easy to be consumed by meeting all your quarterly target numbers, such as leads, web traffic, click-thrus and conversion rates. While these metrics matter very much – they're how you measure progress, profitability and goal attainment – they aren't the entire end game. No matter what your organization does, your ultimate success depends on customer experience, from first contact to ongoing engagement. Keep your focus on the customer, allocate appropriate resources, and make sure your teams are doing the same.

Sarah Bajek

Build your purchase-to-pay outperformance entourage

Accounting and finance

You've done the research, you've run the numbers, and you know that your accounts payable and purchasing departments are capable of outperforming expectations through stronger collaboration and the right purchase-to-pay automation technology. Now it's time to implement the changes and find the solutions to make it happen – but you can't do it alone.

Meet the IT committee

If your organization is like most medium to large-size companies, the decision-making process and selection of your P2P automation technology will be handled by a cross-functional committee made up of a diverse mix of departments and backgrounds. Of course, you’re leading the charge, IT will play a big role, and you’ll probably need a green light from executive management. But who else do you need on your side?

According to research from LinkedIn®, technology decision making goes well beyond the IT department. In fact, LinkedIn found that while IT is heavily involved throughout the entire project, 78% of the IT committee works outside of IT.

Most IT committees include these members:

  • Application managers are typically leaders of business units for other programs that will need to work seamlessly with your P2P technology, such as your ERP or CRM systems.
  • Enterprise architects are responsible for running servers and other foundational elements of the enterprise.
  • Project managers keep the project on track and manage the implementation.
  • Other department managers act as the intermediaries between technical staff and senior leadership.

Their goals

Your IT committee is there to evaluate the cost and scope of this project, and its members bring varying interests. While evaluating vendors and solutions, you'll focus on optimizing your P2P processes. IT will dig into the technology and any unanswered questions about the software. Project management will want to know about the vendors' implementation experience, project cost and scope. Other members of the committee may want to make sure the chosen vendor understands your business and its unique needs.

Make the committee’s goals your goals

Anticipate your IT Committee’s concerns and be ready to tackle them to keep your project running smoothly. Build your business case for automated vs. manual processes. Make sure you have proof points and real-world success stories from vendors. And, don’t forget a thorough implementation plan that takes into account training and change management.

Check out our infographic, and get to know who you’ll need in your corner for your P2P automation project. 

Ken Congdon

Health IT’s role in the radiology value equation

Healthcare

Last week at the annual meeting of the RSNA (Radiological Society of North America), thousands of radiology leaders from around the world descended upon Chicago searching for ways their profession can deliver more value to health systems and the patients they serve. With radiology’s transition from a profit center to a cost center for hospitals, the status quo is no longer good enough when it comes to medical imaging. Radiologists must up their game when it comes to quality, cost efficiency and collaboration to extend their value throughout the care continuum.

This universal tone was encapsulated nicely during Monday’s Plenary Session Oration titled Healthcare Transformation: Driving Value Through Imaging, which was delivered by Dr. Vivian S. Lee, CEO of University of Utah Health Care. According to Lee, “value” in healthcare is providing the best care at the lowest cost. During her presentation, she shared a simple equation her organization uses to measure this often intangible attribute — Value = Quality + Service / Cost.
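Written inline, the grouping of that equation is ambiguous. The reading consistent with the rest of the talk (higher cost drives value down) places quality and service in the numerator and cost in the denominator; the LaTeX below shows that assumed form, not a reproduction of Lee's slide.

```latex
% Assumed grouping of the value equation (quality and service over cost).
\[
  \text{Value} \;=\; \frac{\text{Quality} + \text{Service}}{\text{Cost}}
\]
```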

There are several ways University of Utah Health Care applies this equation throughout its enterprise. For example, it tabulates ED, OR, Surgical, ICU and Floor costs for a variety of procedures and compares these figures against patient satisfaction scores to determine cost-to-quality ratios for specific procedures and providers. However, the main point Lee made was that there are numerous ways radiology can positively impact this value equation on a day-to-day basis, and almost all of them involve leveraging health IT in new and innovative ways.

For example, Lee pointed out that more than 80 percent of medical imaging costs are tied to labor (Interpretation – 40.1% and Personnel – 39.6%). Using data analytics and process intelligence tools to identify ways to reduce the amount of time this expensive labor is needed can cut the costs of imaging dramatically. These tools can similarly be applied to identify roadblocks in the delivery of imaging studies, helping to accelerate reporting, which is an important aspect of radiology service to both the referring physician and the patient.

However, Lee believes the primary means by which radiologists can drive value is by enabling earlier, more accurate, diagnosis and reducing misdiagnosis. Since diagnostic errors are more costly than treatment mistakes, improving performance in this area can have a cumulative effect on overall value. Therefore, the most important health IT investments for radiology are those that support diagnostic processes.

Investing in newer, more precise, imaging techniques, such as molecular imaging, is one way to improve diagnostic quality. For example, these methods can help better match patients to specific drug treatments or dosages based on their specific molecular makeup, reducing the administration of expensive pharmaceuticals.

However, improving diagnostic quality need not rely on next-generation imaging equipment or procedures. An immediate, measurable impact can be made simply by getting all the relevant imaging-related information that exists throughout the enterprise into the hands of the clinicians responsible for diagnosing and treating patients.

All too often medical images are stored in silos — whether it’s a radiology PACS/RIS system, a fluoroscopic imaging system, or a pathology imaging system. The clinician caring for the patient rarely has easy access to all of the patient images stored in these various systems. In fact, there’s a high likelihood the clinician doesn’t even know many of these images exist. Making diagnosis and treatment decisions based on incomplete information is a key contributor to misdiagnosis, patient dissatisfaction and higher care costs. Taking an Enterprise Imaging approach that leverages VNA (Vendor Neutral Archive), image connectivity and enterprise viewing technologies eliminates vendor lock-and-block and makes these images accessible from core clinical systems. Employing an Enterprise Imaging strategy that truly puts all images at a clinician’s fingertips can go a long way toward improving patient outcomes and radiology’s overall value.    

For more information on how Enterprise Imaging can benefit not only your radiology department, but your entire healthcare organization, download the new eBook Enterprise Imaging: See what you’ve been missing.

Larry Sitka

RSNA past, present and future: Part 3

Healthcare

Fast forward to RSNA 2020. I now find the waistline completely out of control, but I am diligently working on it thanks to my new genetic profile and a stern lecture from my physician. The ONC Interoperability Roadmap is fully in play. The U.S. legislative branch is approving funding for the Learning Health System under Meaningful Use. Thank you, Dr. De Salvo, for your efforts from 2012 to 2016 for healthcare and its future. We're now squarely on the path to population health management, and departments that used to be revenue streams for healthcare providers are now cost centers. With healthcare costs now consuming 20 percent of GDP, our nation is in the midst of a fiscal crisis. Healthcare has become unaffordable. Gone or disappearing are the data "lock-and-block" scenarios perpetrated by providers and vendors that refused outside stakeholders access to information. Those barriers have been obliterated by interoperability-centric technology that includes Healthcare Content Management (HCM) as a platform approach powered by the evolved VNA.

As patients relocate to new addresses, so does the population base for the healthcare delivery organization (HDO). The HDO measures those impacts by linking healthcare evidence documents to care plans through the HCM platform, using powerful search capabilities and an optical character recognition (OCR) engine for suggestive and perceptive information collection. No longer are we interested in meaningless data on scanned documents. We now demand findings and suggestions, not predictions. Patient content is fed into a grammar-based natural language processing (NLP) platform, and the results are fed into a learning machine. The new EHR, meanwhile, is going the route of an enterprise viewer layered on top of a data warehouse, becoming a display engine for all care plans. The real-time healthcare environment has finally become a reality. Some of us are further ahead than others in this journey to the RSNA of tomorrow.

The ONC's recently published roadmap report lays out the organization's ten-year interoperability goals. The good news is that the Lexmark Acuo VNA had already achieved the 2020 ONC milestones by 2016.

The other good news is that I've managed to survive another Thanksgiving weekend away from home, and Lexmark continues to keep me far away from electrical outlets on the exhibit hall floor. As I reach my quarter-century-plus mark in healthcare, the PACS companies are still unable to speak one another's dialects of DICOM, but we are finally at the point where healthcare IT experts have realized they must follow in the footsteps of their financial industry colleagues toward interoperability and application independence. The shift from departmental, clinically-based applications to an enterprise approach is well underway. It is now the job of IT to orchestrate the switchover while the service line units enjoy rich access to the patient content management, viewing and workflow required for their environments. This is entirely workable and cost-effective through the discipline of HCM.

Now, before I abandon my keyboard for a delectable leftover turkey and mashed potato sandwich, let me offer three morsels of guidance as you move down your own strategic IT pathway:

  1. Require that all of your vendors sign ONC’s interoperability pledge as a guard against vendor “locking and blocking” in the sharing and exchange of patient content.
  2. Buy at the enterprise level not the departmental level.
  3. Understand that the healthcare applications of tomorrow must be able to dynamically discover and ingest clinical content in real time without requiring data persistence, linking clinical content as part of healthcare content moving forward.

See you on the show floor!