InContext Magazine

your source for solving the unstructured information challenge


Perceptions

Ken Congdon

Who's Afraid Of The Big Bad Cloud?

The healthcare industry has long kept cloud technology at arm’s length. The cloud simply was not to be trusted. The idea of relinquishing control of valuable patient data from internal server farms to unfamiliar hosted environments was enough to cause palpitations in the hearts of many healthcare CIOs. What if our data is lost or exposed? Will my clinical teams and applications be able to access our data when they need it? How can I be sure a cloud application is HIPAA compliant? These are the types of questions that made moving to the cloud seem way too risky. However, if HIMSS 17 is any indication, this mindset has begun to change in a big way.

Several sessions at HIMSS 17 illustrated that the cloud is not only gaining acceptance in healthcare, it is largely being viewed as a game-changing option vital to reducing costs and improving clinical outcomes. For example, during a session titled Still Afraid Of The Cloud? A CIO’s Journey, Deanna L. Wise, Executive Vice President and CIO at Dignity Health, shared her story of how she evolved from a cloud skeptic to an advocate. During her presentation, she referenced data from an eye-opening HIMSS 2016 Cloud Survey that showed that 84% of provider organizations are currently using the cloud in one way or another. Not surprisingly, the most common cloud use was to host analytics, finance, operations, HR and other data not crucial to clinical care (78%). However, the survey also showed that primary data storage (68%) and clinical application hosting (54%) were gaining significant traction from a cloud perspective. 

A slide during Wise’s HIMSS 17 presentation showing the most common uses of the cloud according to a 2016 HIMSS survey.

Another interesting data point from the HIMSS 2016 Cloud Survey was that “Increasing Performance and Reliability” was the most common response given for healthcare providers adopting cloud technologies. This bucks the historical trend of “Lowering TCO” or “Ease of Management” being the primary reason for moving to the cloud.

Wise admitted that gaining a comfort level with the cloud didn’t happen overnight and may not be right for every provider or every application. She stressed that it was a journey. In her case, she gained confidence in the cloud by starting small to build trust in the platform and the provider and to ensure the technology performed as expected. She then gradually moved more applications to the cloud and said she envisions ultimately relying on more than one cloud provider.

The HIMSS 17 Cloud Computing Forum also provided strong evidence of the shift to cloud platforms in healthcare. During this assembly, John Houston, Vice President of Information Security and Privacy at University of Pittsburgh Medical Center, said that while 90 percent of the health system’s software and 75 percent of its applications are legacy on-premise, only 20 percent of its current contract negotiations are for on-premise solutions. The remaining 80 percent are for cloud-based platforms.

Other providers at the Cloud Computing Forum showed how the switch to the cloud has been truly transformational for their organizations and patients. For example, Richard Stroup, Director of Informatics at Children’s Mercy Hospital in Kansas City, shared a story of how the cloud is helping the provider save the lives of at-risk infants. Using a cloud-based tablet app, the provider is monitoring infant patients post-surgery after they leave the hospital. In one test case, Children’s Mercy monitored 68 patients who had completed second-stage surgery using the app. Thanks to real-time monitoring, there was no interstage mortality. Stroup said that historically they would have expected six to 12 mortalities in that same timeframe without cloud-based monitoring.

The move to the cloud will only continue to increase in healthcare – not only because of the performance, TCO and maintenance benefits cloud technologies offer, but largely out of necessity. With mass EHR adoption and other digitization efforts, the amount of healthcare data is growing exponentially. The rise of genomics, 3D imaging and other data-intensive healthcare applications will push storage and processing requirements to new limits. In fact, IDC states that by 2020, the volume of healthcare data will surpass 2,300 exabytes. Managing this type of data load onsite at healthcare facilities is going to be impractical for most, and impossible for some. Cloud platforms will need to be leveraged to store this critical information and keep patient-centered care initiatives on track.

Ken Congdon

You Can’t Be Patient-Centered Without Complete Patient Data

I, like most of the health IT world, am attending the HIMSS Conference and Exhibition this week in Orlando. Thus far, I’ve been encouraged to see an increased emphasis and focus being placed on the patient at the event this year. Precision medicine and patient-centered care have been central themes in educational sessions as well as on the exhibit floor.

For example, in a session titled Making Health IT Patient-Centered, Tina Esposito, Vice President of the Center for Health Information Services at Advocate Health Care, outlined the four-step process the provider is taking to make its health IT systems more patient-centered. What does this mean? Well, for Esposito, patient-centered health IT supports the “Accountable Care Mind Shift” that is occurring not only at Advocate, but in health systems across the country and around the world. This mind shift represents a transition from siloed, episodic care management to population health and value-driven coordinated care.

A slide from Esposito’s HIMSS presentation outlining the Accountable Care Mind Shift.

Advocate’s four-step process is geared toward creating a technology infrastructure that enables specific and comprehensive patient information to be collected and analyzed, so that tailored treatment plans can be administered to each individual. At the highest level, the four steps are as follows:

Step 1: Connect disparate data sources

Step 2: Link sources to a patient (creating a Master Patient Index)

Step 3: Leverage data analytically

Step 4: Engage patients

Most of Esposito’s presentation focused on how Advocate is using analytics to identify and prevent potential illness complications (e.g. referencing key data points to predict when an asthma patient may experience an acute attack and intervening early to prevent a hospital visit). This was all fascinating, but I kept coming back to Step 1 in Advocate’s plan – connecting disparate data sources.
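To make Steps 1 and 2 a little more concrete, here is a rough, purely illustrative sketch of what linking records from disparate systems to a single enterprise patient identity might look like. The source systems, field names and simple deterministic match rule are assumptions made for illustration only, not Advocate’s actual approach; a production master patient index would rely on far more sophisticated probabilistic matching.

```python
from dataclasses import dataclass

@dataclass
class SourceRecord:
    system: str        # e.g. "EHR", "radiology PACS", "lab"
    local_id: str      # the patient identifier used by that system
    last_name: str
    first_name: str
    birth_date: str    # ISO date, e.g. "1971-03-14"

def match_key(rec: SourceRecord) -> tuple:
    """Deterministic match on normalized name plus date of birth.
    A real MPI would add probabilistic scoring on address, phone, etc."""
    return (rec.last_name.strip().lower(),
            rec.first_name.strip().lower(),
            rec.birth_date)

def build_mpi(records: list) -> dict:
    """Group records from different systems under one enterprise patient ID."""
    index, mpi, next_id = {}, {}, 1
    for rec in records:
        key = match_key(rec)
        if key not in index:
            index[key] = f"ENT-{next_id:06d}"   # mint a new enterprise identifier
            next_id += 1
        mpi.setdefault(index[key], []).append((rec.system, rec.local_id))
    return mpi

records = [
    SourceRecord("EHR",  "MRN-1001", "Smith", "Ana", "1971-03-14"),
    SourceRecord("PACS", "PAT-77",   "SMITH", "ana", "1971-03-14"),
    SourceRecord("Lab",  "L-9",      "Jones", "Raj", "1985-09-02"),
]
print(build_mpi(records))
# {'ENT-000001': [('EHR', 'MRN-1001'), ('PACS', 'PAT-77')], 'ENT-000002': [('Lab', 'L-9')]}
```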

Delivering true patient-centered care requires comprehensive access to all patient data and content. Without it, any patient-centered initiative will fall short. You can’t provide patient-centered care if your information about that patient is incomplete or you don’t have a holistic view of the patient’s medical history. This is important to understand because, quite frankly, most hospitals and health systems don’t have this type of visibility.

So much of the past decade has been focused on EHR adoption and use, positioning this clinical system as the single source for patient information. The problem is, the vast majority of patient data is unstructured (80% according to Gartner) and lives in multiple systems outside of, and unconnected to, the EHR. This type of content includes assets like DICOM medical images (e.g. X-rays, CT scans, MRIs, etc.), non-DICOM medical images (e.g. dermatology and wound photos, endoscopy images/video, surgery video, etc.), clinical documents and more. Plus, new types of patient data are now being generated outside of the provider setting that can have a significant impact on precision medicine and patient health (e.g. genomic data, patient monitoring data, etc.).

Patient-centered care will require a health IT infrastructure capable of ingesting and connecting all of these data sources to the patient through the core clinical systems physicians use every day. It’s time we started to think about the management of unstructured content in the same way we think about the management of EHR-based discrete patient data – at the enterprise level.
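As one illustration of what connecting content to the patient through core clinical systems can look like in practice, the sketch below registers a piece of unstructured content against a patient record using a FHIR R4 DocumentReference. The server URL, identifiers and content location are hypothetical placeholders, and FHIR is only one of several standards-based ways to make this link.

```python
import requests

FHIR_BASE = "https://fhir.example-hospital.org/R4"   # hypothetical FHIR endpoint

# Register a wound photo held in a content repository against the patient record.
document_reference = {
    "resourceType": "DocumentReference",
    "status": "current",
    "type": {"text": "Wound photograph"},
    "subject": {"reference": "Patient/ENT-000001"},   # hypothetical enterprise patient ID
    "content": [{
        "attachment": {
            "contentType": "image/jpeg",
            "url": "https://vna.example-hospital.org/content/wound-4711.jpg",
            "title": "Left ankle wound, post-op day 3"
        }
    }]
}

resp = requests.post(f"{FHIR_BASE}/DocumentReference",
                     json=document_reference,
                     headers={"Content-Type": "application/fhir+json"})
resp.raise_for_status()
print("Registered as DocumentReference/", resp.json().get("id"))
```

Once the reference exists, any FHIR-capable clinical system can discover the photo alongside the rest of the patient’s record rather than leaving it stranded in a departmental silo.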

A new white paper provides a roadmap for the enterprise-wide management of unstructured patient data and labels the practice Healthcare Content Management (HCM). The concept essentially merges Enterprise Imaging and Enterprise Content Management (ECM) technologies into a single, fully integrated infrastructure that makes this content accessible through core clinical systems such as EHRs, as well as other enterprise systems. This type of approach will only grow in importance as providers move to value-based, patient-centered care.

At HIMSS 17, it’s clear that some providers are beginning to utilize analytics and collaboration to develop treatment plans based on the uniqueness of the individual. However, enabling this type of care starts with collecting and connecting all types of content both inside and outside the hospital setting, giving physicians access to a complete and real-time view of the patient. When this is done successfully, it can lead to real change and ultimately better outcomes at lower costs.

Ken Congdon

Enterprise Imaging: Supporting today’s health IT demands

Healthcare is on a mission to improve patient outcomes and better manage costs. The shift from volume to value, with its reliance on greater care coordination and collaboration, has given rise to new industry IT demands for centralization, standardization, interoperability and data integrity.

Enterprise Imaging makes it easier for healthcare organizations to meet these demands by eliminating the inefficiency, complexity and roadblocks that prevent easy access to the medical image content needed to drive more informed care decisions and improve outcomes.

Centralized storage, standardized management

Healthcare has historically been a decentralized industry, with departments operating independently of each other and with their own IT systems, applications and formats. As a result, vital medical images often get trapped in departmental silos that are disconnected from core clinical systems and don’t make their way into electronic health records (EHRs). When this occurs, these images aren’t available for physicians at the point of care where crucial treatment decisions are made.

A key piece of technology in an Enterprise Imaging strategy is the vendor neutral archive (VNA). The VNA provides centralized storage and standardized management of all medical content and images, regardless of their origin, native format (DICOM, XDS, JPG, MPG, etc.) or vendor orientation and makes this information available across multiple systems, departments and enterprises. As a result, the VNA provides an essential foundation for delivering a comprehensive image-enabled view of the patient that’s centralized, easily accessible and better supports care decisions.
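Many VNAs expose that centralized archive through standard DICOMweb services. As an illustrative sketch (the endpoint and patient identifier are hypothetical), a single QIDO-RS study search can return every study for a patient, regardless of which department or modality created it:

```python
import requests

VNA_QIDO = "https://vna.example-hospital.org/dicom-web"   # hypothetical DICOMweb endpoint

# Search for every study belonging to one patient across the whole archive.
resp = requests.get(
    f"{VNA_QIDO}/studies",
    params={"PatientID": "ENT-000001", "includefield": "StudyDescription"},
    headers={"Accept": "application/dicom+json"},
)
resp.raise_for_status()

for study in resp.json():
    uid = study["0020000D"]["Value"][0]                       # Study Instance UID
    desc = study.get("00081030", {}).get("Value", ["(no description)"])[0]
    print(uid, "-", desc)
```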

Sharing imaging data

A recent survey by CHIME found that 54 percent of organizations cannot exchange medical images with recipients outside their four walls. With the emergence of initiatives like Meaningful Use, the Affordable Care Act (ACA), value-based reimbursement and Accountable Care Organizations (ACOs), healthcare organizations must put a premium on interoperability.

When medical images are locked in isolated archives, interoperability, information sharing and image-enabling the patient record all become a challenge. Implementing an enterprise viewer that allows viewing of any medical image, imaging report and related patient data anytime and anywhere is a vital step in the Enterprise Imaging journey. With an enterprise viewer, digital image access is no longer confined to the department that created the data. This platform empowers physicians to view any image along with patient content in any format across the enterprise. Such a viewing solution may replace or coexist with a traditional PACS viewer and may be integrated with a VNA or EHR.
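Enterprise viewers typically sit on top of the same standard services as the archive itself. As a hedged illustration (the endpoint and UIDs below are hypothetical), the DICOMweb WADO-RS retrieve-rendered transaction asks the archive for a browser-friendly JPEG rendering of an instance, which is the kind of lightweight payload a zero-footprint viewer displays:

```python
import requests

VNA_WADO = "https://vna.example-hospital.org/dicom-web"   # hypothetical DICOMweb endpoint

# Hypothetical study/series/instance UIDs, e.g. returned by the QIDO-RS search above.
study = "1.2.840.99999.1"
series = "1.2.840.99999.1.2"
instance = "1.2.840.99999.1.2.3"

# Request a consumer-format rendering (JPEG) rather than the raw DICOM object.
resp = requests.get(
    f"{VNA_WADO}/studies/{study}/series/{series}/instances/{instance}/rendered",
    headers={"Accept": "image/jpeg"},
)
resp.raise_for_status()

with open("rendered-frame.jpg", "wb") as f:
    f.write(resp.content)
```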

Data Integrity

The volume of healthcare data is skyrocketing and expected to reach 15 zettabytes by 2020, which is equivalent to four times the amount of information contained on the Internet. Managing all this data is challenging because nearly 80 percent of it is unstructured data that resides outside of the EHR. Moreover, nearly 75 percent of all medical images today are non-DICOM (e.g. mobile phone images, video clips and images created by isolated modality stations) and therefore aren’t well managed by a traditional PACS. Organizations need an image capture and acquisition solution that works with all departmental systems and with DICOM and non-DICOM interfaces.

A sound Enterprise Imaging solution supports data integrity by synchronizing metadata updates (patient- and study-level changes) into the actual image data through a journalized approach. It should also allow for automated disaster recovery should the database become unavailable, since the database and image content stay in sync. The journal can be used as a historical audit of when, what and how metadata content was changed. Make sure the VNA also has a duplicate-handling process, including the ability to manage duplicates based on your needs, using one of the following techniques: keep all, keep first, keep last, keep by DICOM tag, or keep by CRC and pixel validation.
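To illustrate what a configurable duplicate-handling policy might look like behind the scenes, here is a minimal Python sketch. It is not any vendor’s actual implementation; it simply shows how a few of the policies named above could be expressed in code (a “keep all” policy would store both copies rather than choosing one).

```python
import zlib

def handle_duplicate(existing: bytes, incoming: bytes, policy: str) -> bytes:
    """Decide which copy of an object to retain when the same SOP instance
    arrives twice. A real VNA would also support DICOM tag-based rules and
    would record the decision in its audit journal."""
    if policy == "keep_first":
        return existing
    if policy == "keep_last":
        return incoming
    if policy == "keep_by_crc":
        # Keep the existing copy only if the payloads are byte-for-byte identical;
        # otherwise treat the newcomer as the corrected version.
        return existing if zlib.crc32(existing) == zlib.crc32(incoming) else incoming
    raise ValueError(f"unknown duplicate-handling policy: {policy}")

print(handle_duplicate(b"old pixels", b"new pixels", "keep_last"))  # b'new pixels'
```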

To be successful in today’s patient-centered environment, hospitals and health systems need to find new ways to improve care quality, enhance the patient experience and lower costs. An Enterprise Imaging strategy provides an excellent opportunity for organizations to work towards those goals. By adopting Enterprise Imaging, hospitals and health systems can ensure medical images get into the hands of physicians and other clinical stakeholders for more informed decision making. As a result, organizations can make significant strides towards improving care quality, enhancing the patient experience and reducing costs. 

To learn more about how Enterprise Imaging can benefit patient care and financial operations, download the eBook – Enterprise Imaging: See What You’ve Been Missing.

Sandra Lillie

Building a business case for Enterprise Imaging

You understand that achieving true patient-centered care and improving outcomes requires that all clinical stakeholders have timely access to more imaging content across care settings. For organizations like yours to thrive, you need visibility into the breadth of medical images contained in picture archiving and communication systems (PACS) and other specialty legacy archives that store non-DICOM medical images and video. You’ve decided that Enterprise Imaging is the way to go, but how do you achieve organizational buy-in and, more importantly, sell it to your board?

Here are three key proof points you’ll need to present in order to demonstrate the benefits and secure the budget necessary for the initiative:

1. Improves clinical outcomes

With shrinking margins and reimbursement dollars on the line, leaders need to know that your idea will improve the hospital’s ability to care for patients. You’ll want to explain how Enterprise Imaging enables more informed clinical decision-making by allowing healthcare organizations to connect, manage and view medical images both at the clinical point of care and within departments such as radiology, cardiology, ultrasound, surgery and more. Consolidating imaging information throughout the enterprise into a single, standards-based repository can significantly improve patient outcomes by providing a holistic view of patient records, helping reduce potential readmissions and discharge delays, and streamlining workflows for better point-of-care interactions. In addition, online collaboration, including the ability to analyze and share measurements and notes, helps radiologists and other specialists communicate and work more effectively with referring physicians.

2. Enhances security and compliance

With new reports indicating that cyberattacks are on the rise and hackers are increasingly targeting healthcare, privacy and cybersecurity are issues that need to be addressed at the board level. In 2015, one in three Americans was affected by a PHI breach, and the Office for Civil Rights (OCR) reported a combined loss of over 112 million records. Moreover, among the top 10 breaches, the vast majority – nearly 90 percent – were due to hacking. Breaches not only damage an organization’s reputation, but with the average cost per lost or stolen record placed at $363, they can also have significant financial impacts.

You will want to explain to your board that an Enterprise Imaging solution that is truly vendor neutral, standards-based and adherent to DICOM and HL7 as well as the IHE XDS framework ensures data security and HIPAA compliance by getting images off hard drives, disks and USB drives. Further, it establishes centralized control of imaging data, making it easier for IT to manage and secure information, applies security protocols to all images and protects PHI from unwanted exposure.

3. Optimizes new business arrangements

As the hospital consolidation market continues to grow, and boards of directors evaluate new opportunities, they’ll want to know how imaging technology can help them successfully integrate and capitalize on their investments.  

With an Enterprise Imaging strategy in place, organizations have better insight into how to more effectively monitor utilization of services over time and plan service line expansion or retraction. The board will want to know that such an approach supports and optimizes new business arrangements by:

  • Making valuable clinical imaging content available to providers at the right time and in the right way to deliver positive outcomes for patients
  • Consolidating and economizing storage so new hospitals and partners can easily integrate into existing networks and gain access to systems
  • Bringing vital and comprehensive patient information to the care team by aggregating and integrating studies directly into the patient’s record in the EHR
  • Supporting the divestiture of a facility from the hospital organization through true ownership of the images

Armed with a solid business case, your board will better understand and justify how an investment in Enterprise Imaging can help the organization grow and thrive, enabling it to meet the immediate demands of the enterprise as well as new value-based care models and future business partnerships.

Grant Johnson

The ideal leader profile isn’t what it used to be. Company leaders can no longer sit alone in their offices, far removed from the staff who do the work to keep business humming. Now, leadership at all levels is expected to be more engaged, involved and approachable. Walking the walk is now even more important than just talking the talk. These hands-on traits are especially important as you’re faced with leading a team or an entire organization through digital transformation.

A Harvard Business Review report, Driving Digital Transformation: New Skills for Leaders, New Role for the CIO, found businesses that qualify as digital leaders are more likely than those trailing them to have:

  • Revenue growth over 10%
  • Profit margins that are greater than the industry average
  • A CEO who understands digital opportunities and threats
  • A CIO who is a digital master or digital coach
  • A clearly defined digital vision and strategy
  • Digitally proficient leaders at multiple levels

But what sorts of traits do these aforementioned “digitally proficient leaders” have? It’s one thing to qualify a CEO as “understanding digital opportunities” or a CIO as a “digital master,” but how do those traits translate into a successful digital strategy, and how do other leaders across the organization stand out and succeed in a quickly-changing world?

If you’re looking for ways to get in front of the digital revolution and make yourself irreplaceable as your company digitizes, congratulations – you’re already positioning yourself well to be a digital leader. After all, waiting around for digital transformation to happen to your organization is one of the surest ways to be left behind.

Beyond recognizing that your company needs to start preparing for digital transformation now, here are five other traits you can begin cultivating to make yourself an irreplaceable leader during the process:

1. Practice agility and flexibility, and learn to recognize it in others. If you’re used to relying on your job description and a static set of skills, you need to shift your mindset. Sure, experience as a CMO and apparent wizardry at Excel pivot tables are important, but with technology, analytics and datasets changing so quickly, it’s even more important to be agile. Become someone who jumps at the chance to learn a new skill or try a new way of doing things. When you’re hiring, look for these traits in potential employees, as well. In the age of digital transformation, quick, enthusiastic, adaptable learners are priceless.

2. Don’t be afraid to fail, and encourage risk-taking in your team. One major benefit of digitization is that it lends itself well to rapid iteration and trying new things. For example, if you begin building your mobile app one way and realize right after launch that your customers would prefer different features, you’ve lost little time and gained valuable insight into what works.

3. Listen at scale and provide feedback. It’s impossible to please everyone all the time – a lesson that applies to both your team and your customer base. But listen to the comments, concerns and ideas of both groups – if you begin to hear any commonalities in their opinions, take them seriously. Consider what it would mean to implement practices that your employees and customers are both asking for. Then, let them know they’ve been heard. Even if the change they’re asking for isn’t possible, it’s important to let them know their leaders are listening.

4. Break down departmental barriers. When you hire agile, flexible, innovative thinkers, they don’t want to work in silos. They want to bounce their ideas off coworkers who can help make them reality. These notions aren’t folly, and they won’t discourage productivity. Instead, allowing your employees to test out new ideas can help keep them engaged at work and lead to some brilliant outcomes. Encouraging collaboration and invention among your people is a hallmark of an invaluable digital leader.

5. Focus on customer experience, and learn to see the end rather than the means. It’s easy to be consumed by meeting all your quarterly target numbers, such as leads, web traffic, click-throughs and conversion rates. While these metrics matter very much – they’re how you measure progress, profitability and goal attainment – they aren’t the entire end game. No matter what your organization does, your ultimate success depends on customer experience, from first contact to ongoing engagement. Keep your focus on the customer, allocate appropriate resources, and make sure your teams are doing the same.

Sarah Bajek

Build your purchase-to-pay outperformance entourage

You’ve done the research, you’ve run the numbers, and you know that your accounts payable and purchasing departments are capable of outperforming expectations through stronger collaboration and the right purchase-to-pay automation technology. Now it’s time to implement the changes and find the solutions to make it happen – but you can’t do it alone.

Meet the IT committee

If your organization is like most medium to large-size companies, the decision-making process and selection of your P2P automation technology will be handled by a cross-functional committee made up of a diverse mix of departments and backgrounds. Of course, you’re leading the charge, IT will play a big role, and you’ll probably need a green light from executive management. But who else do you need on your side?

According to research from LinkedIn®, technology decision making extends well beyond senior IT staff. In fact, the research found that while IT is heavily involved throughout the entire project, 78% of the IT committee works outside of IT.

Most IT committees include these members:

  • Application managers are typically leaders of business units for other programs that will need to work seamlessly with your P2P technology, such as your ERP or CRM systems.
  • Enterprise architects are responsible for running servers and other foundational elements of the enterprise.
  • Project managers keep the project on track and manage the implementation.
  • Other department managers act as the intermediaries between technical staff and senior leadership.

Their goals

The members of your IT committee are there to evaluate the cost and scope of this project, each with their own interests. While evaluating vendors and solutions, you’ll focus on optimizing your P2P processes. IT will dig into the technology and unanswered questions about the software. Project management will want to know about the vendors’ implementation experience, project cost and scope. Other members of the committee may want to make sure the chosen vendor understands your business and its unique needs.

Make the committee’s goals your goals

Anticipate your IT Committee’s concerns and be ready to tackle them to keep your project running smoothly. Build your business case for automated vs. manual processes. Make sure you have proof points and real-world success stories from vendors. And, don’t forget a thorough implementation plan that takes into account training and change management.

Check out our infographic, and get to know who you’ll need in your corner for your P2P automation project. 

Ken Congdon

Health IT’s role in the radiology value equation

Last week at the annual meeting of the RSNA (Radiological Society of North America), thousands of radiology leaders from around the world descended upon Chicago searching for ways their profession can deliver more value to health systems and the patients they serve. With radiology’s transition from a profit center to a cost center for hospitals, the status quo is no longer good enough when it comes to medical imaging. Radiologists must up their game when it comes to quality, cost efficiency and collaboration to extend their value throughout the care continuum.

This universal tone was encapsulated nicely during Monday’s Plenary Session Oration titled Healthcare Transformation: Driving Value Through Imaging, which was delivered by Dr. Vivian S. Lee, CEO of University of Utah Health Care. According to Lee, “value” in healthcare is providing the best care at the lowest cost. During her presentation, she shared a simple equation her organization uses to measure this often intangible attribute — Value = (Quality + Service) / Cost.

There are several ways University of Utah Health Care applies this equation throughout its enterprise. For example, it tabulates ED, OR, Surgical, ICU and Floor costs for a variety of procedures and compares these figures against patient satisfaction scores to determine cost-to-quality ratios for specific procedures and providers. However, the main point Lee made was there are numerous ways radiology can positively impact this value equation on a day-to-day basis, and almost all of them involve leveraging health IT in new and innovative ways.
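To make the arithmetic concrete, here is a tiny sketch of Lee’s value equation applied to two invented imaging pathways. The quality, service and cost figures are made up purely for illustration and do not come from University of Utah Health Care.

```python
def value_score(quality: float, service: float, cost: float) -> float:
    """Value = (Quality + Service) / Cost, per Lee's formulation.
    Quality and service are normalized scores (e.g. 0-100); cost is in dollars."""
    return (quality + service) / cost

# Invented example figures for two imaging pathways (not real data).
procedures = {
    "MRI lumbar spine, outpatient": dict(quality=88, service=92, cost=850.0),
    "MRI lumbar spine, inpatient":  dict(quality=88, service=75, cost=2300.0),
}

for name, p in procedures.items():
    print(f"{name}: value = {value_score(**p):.3f}")
```

Holding quality roughly constant, the lower-cost pathway yields a markedly higher value score, which is exactly the kind of comparison the cost-to-quality ratios above are meant to surface.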

For example, Lee pointed out that more than 80 percent of medical imaging costs are tied to labor (Interpretation – 40.1% and Personnel – 39.6%). Using data analytics and process intelligence tools to identify ways to reduce the amount of time this expensive labor is needed can cut the costs of imaging dramatically. These tools can similarly be applied to identify roadblocks in the delivery of imaging studies, helping to accelerate reporting, which is an important aspect of a radiology service to both the referring physician and the patient.
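As a simple illustration of the kind of process intelligence Lee described, the sketch below computes report turnaround times from a handful of invented workflow timestamps. In practice these events would come from RIS/PACS audit trails, and the analysis would be far richer than a simple average.

```python
from datetime import datetime
from statistics import mean

# Invented order-placed / report-finalized timestamps for a few studies.
events = [
    ("CT-1001", "2016-11-28 08:05", "2016-11-28 09:40"),
    ("CT-1002", "2016-11-28 08:30", "2016-11-28 12:15"),
    ("CT-1003", "2016-11-28 09:10", "2016-11-28 10:05"),
]

def minutes_between(start: str, end: str) -> float:
    fmt = "%Y-%m-%d %H:%M"
    return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 60

turnaround = {study: minutes_between(start, end) for study, start, end in events}
print("per-study turnaround (min):", turnaround)
print("mean turnaround (min):", round(mean(turnaround.values()), 1))
```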

However, Lee believes the primary means by which radiologists can drive value is by enabling earlier, more accurate, diagnosis and reducing misdiagnosis. Since diagnostic errors are more costly than treatment mistakes, improving performance in this area can have a cumulative effect on overall value. Therefore, the most important health IT investments for radiology are those that support diagnostic processes.

Investing in newer, more precise, imaging techniques, such as molecular imaging, is one way to improve diagnostic quality. For example, these methods can help better match patients to specific drug treatments or dosages based on their specific molecular makeup, reducing the administration of expensive pharmaceuticals.

However, improving diagnostic quality need not rely on next-generation imaging equipment or procedures. An immediate, measurable impact can be made simply by getting all the relevant imaging-related information that exists throughout the enterprise into the hands of the clinicians responsible for diagnosing and treating patients.

All too often medical images are stored in silos — whether it’s a radiology PACS/RIS system, a fluoroscopic imaging system, or a pathology imaging system. The clinician caring for the patient rarely has easy access to all of the patient images stored in these various systems. In fact, there’s a high likelihood the clinician doesn’t even know many of these images exist. Making diagnosis and treatment decisions based on incomplete information is a key contributor to misdiagnosis, patient dissatisfaction and higher care costs. Taking an Enterprise Imaging approach that leverages VNA (Vendor Neutral Archive), image connectivity and enterprise viewing technologies eliminates vendor lock-and-block and makes these images accessible from core clinical systems. Employing an Enterprise Imaging strategy that truly puts all images at a clinician’s fingertips can go a long way toward improving patient outcomes and radiology’s overall value.    

For more information on how Enterprise Imaging can benefit not only your radiology department, but your entire healthcare organization, download the new eBook Enterprise Imaging: See what you’ve been missing.

Larry Sitka

RSNA past, present and future: Part 3

Fast forward to RSNA 2020. I now find the waistline completely out of control, but I am diligently working on it thanks to my new genetic profile and a stern lecture from my physician. The ONC Interoperability Roadmap is fully in play. The U.S. legislative branch is approving funding for the Learning Health System under Meaningful Use. Thank you, Dr. De Salvo, for your efforts from 2012 to 2016 for healthcare and its future. We’re now squarely on the path to population health management and departments that used to be revenue streams for the healthcare providers are now cost centers. Now with healthcare costs consuming 20 percent of GDP, our nation is in the midst of a fiscal crisis. Healthcare has become unaffordable. Gone or disappearing are data “lock-and-block” scenarios perpetrated by providers and vendors that refused access to information by outside stakeholders. Those barriers have been obliterated by interoperability-centric technology that includes Healthcare Content Management (HCM) as a platform approach powered by the evolved VNA.

As patients relocate to new addresses, so does the population base for the healthcare delivery organization (HDO). The HDO measures those impacts by linking healthcare evidence documents to care plans through the healthcare content management (HCM) platform, using powerful search capabilities and an optical character recognition (OCR) engine to collect suggestive and perceptive information. No longer are we interested in meaningless data on scanned documents; we now demand findings and suggestions, not predictions. Patient content is fed into a grammar-based natural language processing (NLP) platform, and the results are fed into a learning machine. The new EHR is going the route of an enterprise viewer layered on top of a data warehouse, becoming a display engine for all care plans. The real-time healthcare environment has finally become a reality. Some of us are further ahead than others in this journey to the RSNA of tomorrow.

The diagram below contains ONC ten-year interoperability goals from the organization’s recently published roadmap report. The good news is that the Lexmark Acuo VNA had already achieved 2020 ONC milestones by 2016:   

The other good news is that I’ve managed to survive another Thanksgiving weekend away from home, and Lexmark continues to keep me far away from electrical outlets on the exhibit hall floor. As I reach my quarter-century-plus mark in healthcare, the PACS companies are still unable to talk to one another across their differing dialects of DICOM, but we are finally at the point where healthcare IT experts have realized they must follow in the footsteps of their financial industry colleagues toward interoperability and application independence. The shift from departmental, clinically-based applications to an enterprise approach is well underway. It is now the job of IT to orchestrate the switchover while the service line units enjoy rich access to the patient content management, viewing and workflow required for their environments. This is entirely workable and cost-effective through the discipline of HCM.

Now, before I abandon my keyboard for a delectable leftover turkey and mashed potato sandwich, let me offer three morsels of guidance as you move down your own strategic IT pathway:

  1. Require that all of your vendors sign ONC’s interoperability pledge as a guard against vendor “locking and blocking” in the sharing and exchange of patient content.
  2. Buy at the enterprise level, not the departmental level.
  3. Understand that the healthcare applications of tomorrow must be able to dynamically discover and ingest clinical content in real time without requiring data persistence, linking clinical content into the broader body of healthcare content moving forward.

See you on the show floor!

Phil Wasson

The advent of the true VNA: Part II

You might think you have unstructured healthcare content under control, but chances are you don’t. Most content management deployments simply create new silos. Here’s how a true VNA can help.

Healthcare Delivery Organizations (HDOs) have made monumental strides in embracing and deploying health information technology over the past 15 to 20 years. It seems like only yesterday we were discussing how important it was to eliminate the silos of clinical information that existed throughout the HDO. Accessing these systems was complex and required unique applications that often provided only narrow departmental benefits. Successful EHR implementations have allowed healthcare facilities to integrate clinical information across their organizations. It’s amazing how far we have come over this period and how HDOs are better positioned today for the major changes on the horizon for healthcare service delivery. However, several challenges remain when it comes to harnessing much of the unstructured information needed for healthcare decision making. Despite the millions of dollars invested in IT by HDOs, a frightening amount of unstructured data is still unmanaged.

One of the primary reasons we still face this problem is that many of the unstructured document management and imaging solutions implemented today are deployed under a departmental model. These systems may have been acquired to satisfy specific departmental needs, but shackling these systems to individual departments only serves to perpetuate the silo effect. This approach contributes to the lack of healthcare content management of unstructured information within today’s HDO. Furthermore, positioning EHRs as the sole solution to access all patient content isn’t enough to position a provider for changes in healthcare delivery and reimbursement. With the advent of a true VNA, HDOs have an opportunity to take ownership of their imaging data and reduce the cost of imaging throughout their entire enterprise.

In my last post, I discussed the advent of the true VNA and how that system has emerged over the last 15 years to address the need to store, access and manage medical images. The VNA has offered an important differentiator to those organizations that saw the value of this technology and deployed it to address the proprietary practices of many PACS vendors. The VNA offers an integrated solution to stabilize the HDO’s imaging systems, improve the opportunity to interoperate with other systems and improve the overall management of all the HDO’s unstructured content. This capability, known as Healthcare Content Management (HCM), can eliminate many of the imaging and document silos that continue to exist within HDOs.

Considering these facts, I’d argue that we’ve reached a point where we once again need to focus on eliminating silos to enable better integration of systems. This time, however, the focus needs to be placed on replacing siloed, departmental unstructured content management systems with a centralized approach where all clinical images and documents throughout the HDO are managed by a common HCM platform. We also need to recognize that HCM and its ability to manage all unstructured clinical data is an important component to the EHR.

The true VNA can put the HDO on an HCM path that can immediately reduce cost through the consolidation of imaging silos within their organization. The VNA can also offer an interoperable solution that enables the capture of both DICOM and non-DICOM content. The true VNA, when combined with visualization tools for enterprise-wide viewing, ECM systems for document capture and management and XDS capabilities for information sharing, provides HDOs with a flexible platform that allows them to quickly adjust to the coming changes in healthcare delivery and reimbursement.

You can learn more about the capabilities that define a true VNA by reviewing our Definitive VNA Checklist.

Larry Sitka

RSNA past, present and future: Part 2

Begin by reading Part 1 here.

Time marches on and, in 2002, a slightly less slim version of me can be found hunched over a keyboard at Acuo Technologies, keeping my fingers crossed that the third software build was the charm. And voila, an application we proud parents called the “vendor neutral archive (VNA)” was born to the medical imaging world. “Oh, you don’t need one of those,” said PACS vendors almost in unison. That is, until a few years down the road, when suddenly, virtually overnight, nearly all of those same companies miraculously had their own version of a VNA (funny how marketing works, isn’t it?).

Now here we are knocking at the door of McCormick Place in 2016 and the Acuo VNA team is thriving underneath the Lexmark Healthcare umbrella. A sneak peek behind the RSNA exhibit hall doors would reveal the shiny VNA label emblazoned across the booth properties of nearly every PACS and data storage vendor in sight. What these vendor communities do not realize, however, is that what we were calling a VNA back in 2007 has evolved to an advanced, service-oriented platform that today supports a new discipline known as healthcare content management (HCM). HCM provides a means for not only storing unstructured patient content, but also the ability to dynamically link and automatically discover all healthcare content evidence documents across the enterprise. These capabilities combine to reveal a consolidated, patient-centric view of the continuum of care.  The focus of the HCM platform is enterprise delivery of health information to the devices used most by caregivers and patients. HCM moves us far beyond department boundaries to a focus on capturing, managing and displaying any content, not just medical images. 

This approach is driving the separation of PACS functionality into what I call “PACS redefined.” Healthcare technologists are beginning to wrap their minds around the idea that PACS and PACS-like core functionality is not going away, but is moving out of the departments and into the enterprise. Removing radiology-centric requirements across other ‘ologies helps streamline workflows and disconnects the application in all departments from the data. This not only makes possible new approaches and ease of development within the DICOM standard, but also creates a much better data consistency/canonical data model and delivers superior performance to the desktop for clinicians and physicians.

“Why would I want to consider this?” you might ask. Simply put, to bring evidence documents into the reach of this new billion-dollar application called an EHR. Image-enabling the EHR is a primary purpose for healthcare content management, similar to the way the television industry has moved from standard definition to HD and, ultimately, Ultra HD TV. The HCM platform allows me to dynamically link new content, including pathology reports and exams, genetic health reports and medication susceptibility reports at an enterprise level. This information can be based on individual patient genomes, and secure access can be easily extended to key clinical stakeholders and even patients themselves.

This may all sound a little scary, but don’t fear. HCM has taken us from the department, where undiscovered and underutilized clinical content exists, to being able to centralize and securely access all patient content at the enterprise level. This also means the HCM platform is not responsible for storing all patient content, nor should it be expected to. After all, new content is constantly arriving somewhere within a healthcare enterprise. The old concept of expecting to register something before third parties know about it is a dying paradigm.

So, on goes RSNA, my perpetual celebration of Thanksgiving in the Windy City. Our setup team stays far away from all the electrical outlets. The PACS/VNA “me too” companies all continue to claim they do it all and, once again, “real software is still not available.”