InContext Magazine

your source for solving the unstructured information challenge

Perceptions

Ken Congdon

Health IT’s role in the radiology value equation

Healthcare

Last week at the annual meeting of the RSNA (Radiological Society of North America), thousands of radiology leaders from around the world descended upon Chicago searching for ways their profession can deliver more value to health systems and the patients they serve. With radiology’s transition from a profit center to a cost center for hospitals, the status quo is no longer good enough when it comes to medical imaging. Radiologists must up their game when it comes to quality, cost efficiency and collaboration to extend their value throughout the care continuum.

This universal tone was encapsulated nicely during Monday’s Plenary Session Oration, “Healthcare Transformation: Driving Value Through Imaging,” delivered by Dr. Vivian S. Lee, CEO of University of Utah Health Care. According to Lee, “value” in healthcare means providing the best care at the lowest cost. During her presentation, she shared a simple equation her organization uses to measure this often intangible attribute: Value = (Quality + Service) / Cost.
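
To make the arithmetic concrete, here is a minimal sketch of Lee’s equation in Python. The scores, cost figures and scales below are hypothetical placeholders for illustration, not numbers from her presentation.

  # Minimal sketch of the value equation: Value = (Quality + Service) / Cost.
  # All scores and costs are hypothetical placeholders.
  def value_score(quality: float, service: float, cost: float) -> float:
      """Higher quality and service raise value; higher cost lowers it."""
      return (quality + service) / cost

  # Two hypothetical providers performing the same procedure:
  provider_a = value_score(quality=0.92, service=0.88, cost=12_500)  # cost in dollars
  provider_b = value_score(quality=0.90, service=0.85, cost=9_800)

  # Provider B delivers slightly lower quality at much lower cost,
  # so it scores higher on the value equation.
  print(f"A: {provider_a:.6f}  B: {provider_b:.6f}")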

There are several ways University of Utah Health Care applies this equation throughout its enterprise. For example, it tabulates ED, OR, surgical, ICU and floor costs for a variety of procedures and compares these figures against patient satisfaction scores to determine cost-to-quality ratios for specific procedures and providers. The main point Lee made, however, was that there are numerous ways radiology can positively impact this value equation on a day-to-day basis, and almost all of them involve leveraging health IT in new and innovative ways.

For example, Lee pointed out that nearly 80 percent of medical imaging costs are tied to labor (interpretation, 40.1 percent, and personnel, 39.6 percent). Using data analytics and process intelligence tools to identify ways to reduce the amount of time this expensive labor is needed can cut the costs of imaging dramatically. These tools can similarly be applied to identify roadblocks in the delivery of imaging studies, helping to accelerate reporting, an important aspect of radiology service to both the referring physician and the patient.
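
As a hedged illustration of the kind of process analysis described above, the sketch below uses pandas to split report turnaround into queue time versus reading time; the column names, events and timestamps are invented for illustration and do not refer to any particular product.

  # Hypothetical sketch: find where imaging studies stall between
  # acquisition and final report. Column names and data are invented.
  import pandas as pd

  studies = pd.DataFrame({
      "study_id":     [101, 102, 103],
      "acquired":     pd.to_datetime(["2016-11-28 08:05", "2016-11-28 09:10", "2016-11-28 09:42"]),
      "read_start":   pd.to_datetime(["2016-11-28 10:40", "2016-11-28 09:25", "2016-11-28 13:05"]),
      "report_final": pd.to_datetime(["2016-11-28 11:02", "2016-11-28 09:58", "2016-11-28 13:30"]),
  })

  # Time spent waiting for a radiologist vs. time spent interpreting.
  studies["wait_hours"] = (studies["read_start"] - studies["acquired"]).dt.total_seconds() / 3600
  studies["read_hours"] = (studies["report_final"] - studies["read_start"]).dt.total_seconds() / 3600

  # If the queue dominates, staffing and worklist changes beat faster reading.
  print(studies[["study_id", "wait_hours", "read_hours"]])
  print("mean wait:", round(studies["wait_hours"].mean(), 2), "hours")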

However, Lee believes the primary means by which radiologists can drive value is by enabling earlier, more accurate, diagnosis and reducing misdiagnosis. Since diagnostic errors are more costly than treatment mistakes, improving performance in this area can have a cumulative effect on overall value. Therefore, the most important health IT investments for radiology are those that support diagnostic processes.

Investing in newer, more precise imaging techniques, such as molecular imaging, is one way to improve diagnostic quality. For example, these methods can help better match patients to specific drug treatments or dosages based on their individual molecular makeup, reducing the administration of expensive pharmaceuticals.

However, improving diagnostic quality need not rely on next-generation imaging equipment or procedures. An immediate, measurable impact can be made simply by getting all the relevant imaging-related information that exists throughout the enterprise into the hands of the clinicians responsible for diagnosing and treating patients.

All too often medical images are stored in silos — whether it’s a radiology PACS/RIS system, a fluoroscopic imaging system, or a pathology imaging system. The clinician caring for the patient rarely has easy access to all of the patient images stored in these various systems. In fact, there’s a high likelihood the clinician doesn’t even know many of these images exist. Making diagnosis and treatment decisions based on incomplete information is a key contributor to misdiagnosis, patient dissatisfaction and higher care costs. Taking an Enterprise Imaging approach that leverages VNA (Vendor Neutral Archive), image connectivity and enterprise viewing technologies eliminates vendor lock-and-block and makes these images accessible from core clinical systems. Employing an Enterprise Imaging strategy that truly puts all images at a clinician’s fingertips can go a long way toward improving patient outcomes and radiology’s overall value.    

For more information on how Enterprise Imaging can benefit not only your radiology department, but your entire healthcare organization, download the new eBook Enterprise Imaging: See what you’ve been missing.

Larry Sitka

RSNA past, present and future: Part 3

Healthcare

Fast forward to RSNA 2020. I now find the waistline completely out of control, but I am diligently working on it thanks to my new genetic profile and a stern lecture from my physician. The ONC Interoperability Roadmap is fully in play. The U.S. legislative branch is approving funding for the Learning Health System under Meaningful Use. Thank you, Dr. DeSalvo, for your efforts from 2012 to 2016 for healthcare and its future. We’re now squarely on the path to population health management, and departments that used to be revenue streams for healthcare providers are now cost centers. With healthcare costs consuming 20 percent of GDP, our nation is in the midst of a fiscal crisis. Healthcare has become unaffordable. Gone or disappearing are the data “lock-and-block” scenarios perpetrated by providers and vendors that refused outside stakeholders access to information. Those barriers have been obliterated by interoperability-centric technology, including Healthcare Content Management (HCM) as a platform approach powered by the evolved VNA.

As patients relocate to new addresses, so does the population base for the healthcare delivery organization (HDO). The HDO measures those impacts through the HCM platform, which links healthcare evidence documents to care plans using powerful search capabilities and an optical character recognition (OCR) engine for suggestive and perceptive information collection. No longer are we interested in meaningless data on scanned documents. We now demand findings and suggestions, not predictions. Patient content is fed into a grammar-based natural language processing (NLP) platform, and the results are fed into a learning machine. Meanwhile, the new EHR is going the route of an enterprise viewer layered on top of a data warehouse, becoming a display engine for all care plans. The real-time healthcare environment has finally become a reality. Some of us are further ahead than others in this journey to the RSNA of tomorrow.

The diagram below shows the ONC’s ten-year interoperability goals from the organization’s recently published roadmap report. The good news is that the Lexmark Acuo VNA had already achieved the 2020 ONC milestones by 2016:

The other good news is that I’ve managed to survive another Thanksgiving weekend away from home, and Lexmark continues to keep me far away from electrical outlets on the exhibit hall floor. As I reach my quarter-century-plus mark in healthcare, the PACS companies are still unable to reconcile their differing dialects of DICOM speak, but we are finally at the point where healthcare IT experts have realized they must follow in the footsteps of their financial industry colleagues toward interoperability and application independence. The shift from departmental, clinically-based applications to an enterprise approach is well underway. It is now the job of IT to orchestrate the switchover while the service line units enjoy rich access to the patient content management, viewing and workflow required for their environments. This is entirely workable and cost-effective through the discipline of HCM.

Now, before I abandon my keyboard for a delectable leftover turkey and mashed potato sandwich, let me offer three morsels of guidance as you move down your own strategic IT pathway:

  1. Require that all of your vendors sign ONC’s interoperability pledge as a guard against vendor “locking and blocking” in the sharing and exchange of patient content.
  2. Buy at the enterprise level not the departmental level.
  3. Understand that the healthcare applications of tomorrow must be able to dynamically discover and ingest clinical content in real time without requiring data persistence, linking clinical content as part of healthcare content moving forward (see the sketch after this list).
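
As one hedged illustration of point 3, the sketch below queries a DICOMweb (QIDO-RS) endpoint to discover a patient’s studies on demand rather than copying them into yet another archive. QIDO-RS is a real part of the DICOM standard, but the endpoint URL and patient identifier here are placeholders, and it is only one of several discovery mechanisms a vendor might expose.

  # Hypothetical sketch: discover studies in real time via DICOMweb QIDO-RS
  # instead of persisting a second copy. URL and identifiers are placeholders.
  import requests

  QIDO_BASE = "https://vna.example.org/dicom-web"  # placeholder endpoint

  def discover_studies(patient_id: str) -> list[dict]:
      """Ask the archive what exists for this patient right now."""
      resp = requests.get(
          f"{QIDO_BASE}/studies",
          params={"PatientID": patient_id},
          headers={"Accept": "application/dicom+json"},
      )
      resp.raise_for_status()
      return resp.json()

  for study in discover_studies("EX-123456"):
      # DICOM JSON keys: 0020000D = Study Instance UID, 00081030 = Study Description
      uid = study["0020000D"]["Value"][0]
      desc = study.get("00081030", {}).get("Value", ["(no description)"])[0]
      print(uid, desc)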

See you on the show floor!

Phil Wasson

The advent of the true VNA: Part II

Vendor Neutral Archive (VNA)

You might think you have unstructured healthcare content under control, but chances are you don’t. Most content management deployments simply create new silos. Here’s how a true VNA can help.

Healthcare Delivery Organizations (HDOs) have made monumental strides in embracing and deploying health information technology over the past 15 to 20 years. It seems like only yesterday we were discussing how important it was to eliminate the silos of clinical information that existed throughout the HDO. Accessing these systems was complex and required unique applications that often provided only narrow departmental benefits. Successful EHR implementations have allowed healthcare facilities to integrate clinical information across their organizations. It’s amazing how far we have come over this period and how HDOs are better positioned today for the major changes on the horizon for healthcare service delivery. However, several challenges remain when it comes to harnessing much of the unstructured information needed for healthcare decision making. Despite the millions of dollars invested in IT by HDOs, a frightening amount of unstructured data is still unmanaged.

One of the primary reasons we still face this problem is that many of the unstructured document management and imaging solutions implemented today are deployed under a departmental model. These systems may have been acquired to satisfy specific departmental needs, but shackling them to individual departments only serves to perpetuate the silo effect. This approach contributes to the lack of healthcare content management of unstructured information within today’s HDO. Furthermore, relying on the EHR as the sole means of access to all patient content isn’t enough to prepare a provider for changes in healthcare delivery and reimbursement. With the advent of a true VNA, HDOs have an opportunity to take ownership of their imaging data and reduce the cost of imaging throughout their entire enterprise.

In my last post, I discussed the advent of the true VNA and how that system has emerged over the last 15 years to address the need to store, access and manage medical images. The VNA has offered an important differentiator to those organizations that saw the value of this technology and deployed it to address the proprietary practices of many PACS vendors. The VNA offers an integrated solution to stabilize the HDO’s imaging systems, improve the opportunity to interoperate with other systems and improve the overall management of all the HDO’s unstructured content. This capability, known as Healthcare Content Management (HCM), can eliminate many of the imaging and document silos that continue to exist within HDOs.

Considering these facts, I’d argue that we’ve reached a point where we once again need to focus on eliminating silos to enable better integration of systems. This time, however, the focus needs to be on replacing siloed, departmental unstructured content management systems with a centralized approach in which all clinical images and documents throughout the HDO are managed by a common HCM platform. We also need to recognize that HCM, with its ability to manage all unstructured clinical data, is an important component of the EHR.

The true VNA can put the HDO on an HCM path that can immediately reduce cost through the consolidation of imaging silos within their organization. The VNA can also offer an interoperable solution that enables the capture of both DICOM and non-DICOM content. The true VNA, when combined with visualization tools for enterprise-wide viewing, ECM systems for document capture and management and XDS capabilities for information sharing, provides HDOs with a flexible platform that allows them to quickly adjust to the coming changes in healthcare delivery and reimbursement.

You can learn more about the capabilities that define a true VNA by reviewing our Definitive VNA Checklist.

Larry Sitka

RSNA past, present and future: Part 2

Healthcare

Begin by reading Part 1 here.

Time marches on and, in 2002, a slightly less slim version of me can be found hunched over a keyboard at Acuo Technologies, keeping my fingers crossed that the third software build was the charm. And voila, an application we proud parents called the “vendor neutral archive (VNA)” was born to the medical imaging world. “Oh, you don’t need one of those,” said PACS vendors almost in unison. That is, until a few years down the road, when suddenly, virtually overnight, nearly all of those same companies miraculously had their own version of a VNA (funny how marketing works, isn’t it?).

Now here we are knocking at the door of McCormick Place in 2016, and the Acuo VNA team is thriving under the Lexmark Healthcare umbrella. A sneak peek behind the RSNA exhibit hall doors would reveal the shiny VNA label emblazoned across the booth properties of nearly every PACS and data storage vendor in sight. What these vendor communities do not realize, however, is that what we were calling a VNA back in 2007 has evolved into an advanced, service-oriented platform that today supports a new discipline known as healthcare content management (HCM). HCM provides a means not only for storing unstructured patient content, but also for dynamically linking and automatically discovering all healthcare content evidence documents across the enterprise. These capabilities combine to reveal a consolidated, patient-centric view of the continuum of care. The focus of the HCM platform is enterprise delivery of health information to the devices used most by caregivers and patients. HCM moves us far beyond department boundaries to a focus on capturing, managing and displaying any content, not just medical images.

This approach is driving the separation of PACS functionality into what I call “PACS redefined.” Healthcare technologists are beginning to wrap their minds around the idea that PACS and PACS-like core functionality is not going away, but is moving out of the departments and into the enterprise. Removing radiology-centric requirements across the other ‘ologies helps streamline workflows and disconnects the application in every department from the data. This not only enables new approaches and easier development within the DICOM standard, but also creates a more consistent, canonical data model and delivers superior performance to the desktop for clinicians and physicians.

“Why would I want to consider this?” you might ask. Simply put, to bring evidence documents into the reach of this new billion-dollar application called an EHR. Image-enabling the EHR is a primary purpose for healthcare content management, much as the television industry has moved from standard definition to HD and, ultimately, Ultra HD TV. The HCM platform allows me to dynamically link new content, including pathology reports and exams, genetic health reports and medication susceptibility reports, at an enterprise level. This information can be based on individual patient genomes, and secure access can be easily extended to key clinical stakeholders and even patients themselves.

This may all sound a little scary, but don’t fear. HCM has taken us from the department, where undiscovered and underutilized clinical content exists, to being able to centralize and securely access all patient content at the enterprise level. This also means the HCM platform is not responsible for storing all patient content, nor should it be expected to. After all, new content is constantly arriving somewhere within a healthcare enterprise. The old concept of expecting to register something before third parties know about it is a dying paradigm.

So, on goes RSNA, my perpetual celebration of Thanksgiving in the Windy City. Our setup team stays far away from all the electrical outlets. The PACS/VNA “me too” companies all continue to claim they do it all and, once again, “real software is still not available.” 

Larry Sitka

RSNA past, present and future: Part 1

Healthcare

Roll back the clock 25 years and you find a much slimmer version of me abandoning my cubicle at AT&T for a cool new vertical at 3M (yes, the tape and laser printer company) called “digital asset management and imaging services.” Before I could even unpack my lava lamp, the new boss slapped a document down in front of me with terms like “ACR/NEMA 2.0” and “DICOM 3.0 preliminary spec” splattered across the page. “Is it too late to back out?” I thought as I glanced around for an elevator.

Fortunately I came to my senses, took a big gulp of coffee and conjured up an expression of steely-eyed confidence as my new colleagues stared to see what I would do next. My mission was to take the contents of those cryptic documents and implement them in software. “Simple task,” I thought, given that my prior job included implementing other standards like token ring and ATM inside an SNA network. “Piece of cake,” I chuckled to myself. “How hard can it be?” I had implemented this type of stuff in my sleep, many times testing the software against the IEEE protocol test suites.

Well…not so fast. The DICOM standard turned out to be quite loose in definition and extremely open to interpretation, and there was no protocol test suite to be found. So interoperability wasn’t going to happen right out of the gate. Eventually two organizations – CEN over in Europe and the Mallinckrodt Institute of Radiology in St. Louis (hats off to my beloved colleague Steve Moore) – completed successful product implementations of the DICOM spec to test against. However, even those two implementations disagreed with one another down in the deep, dark recesses of the old RSNA building. From the very start, the promise of interoperability presented a huge challenge.

Each vendor, ourselves included, executed our own implementations and talked to our own software without a hitch. “Perfect,” we said, “What’s so hard about this? All seems to be working just fine and there are no arguments within our code stack when it talks to itself.” I am proud to say that to this day, I have never had an argument with myself.

The lack of interoperability and well-defined testing mechanisms drove the DICOM standard, and it, in turn, drove this thing called a PACS. “Pay for your PACS via film credits,” was the sales pitch. “If the PACS fails, just go back to film and print again.” Those PACS systems continued to evolve and, despite their extremely high expense, typically added value by delivering images to radiologist workstations and their multitude of monitors, thus contributing to better patient outcomes. Sound familiar?

Now, I’ve got a confession to make. Jumping onto a plane on Thanksgiving Day did not sit too well with my wife and one-year-old daughter (and never has over the last quarter century). In fact, if someone told me I would be doing this same routine for the next 25+ years, I would have called them cuckoo!

So on Black Friday morning, I met up with the rest of the 3M team to travel over to a building called McCormick Place. It was cold, filled with carbon monoxide and a confused mass of forklift trucks and big wooden boxes as far as the eye could see. I’m pretty certain if the Pilgrims had had this to deal with, they would have grabbed their turkey, shook hands with Squanto and Pocahontas, and pointed the Mayflower back around to where it came from.

In the midst of this chaos, the union workers hauled in these massive machines, weighing zillions of tons each, that could capture images digitally, along with probably 1,000 printers. “But where do you store all these images?” I asked myself. My question was quickly resolved when a large truck, carrying a big glass room containing a gizmo with robotic arms, made its way into the exhibit hall. “What does this thing do?” I asked. The gentleman who had just finished setting it up said, “Watch.” The robotic arms came to life and began snatching magnetic tapes off the racks that were attached to the wall. They loaded the tapes into some drive bays and moved other tapes into slots on the wall. “That’s pretty cool,” I remarked. “But what is it really doing?” “Oh, it’s just in demo mode,” he said. “It really isn’t connected to anything.” Talk about a letdown. I felt like a kid who had just had his favorite Christmas toy taken away.

So finally, after seven straight days of running demonstrations of the new software called “DICOM” for radiologists and getting put into the “penalty box” for not allowing the union electricians to plug my plug into the wall socket (sorry, by the way, I was not aware of this rule), I went home, plopped into my recliner scratching my head and asking “What in the heck did I just witness?”

What I saw in Chicago that Thanksgiving weekend, oh so long ago, was a host of different companies, all claiming to support DICOM, but none of them actually communicating with one another. And those that did claim to talk to others could only do it in “demo mode.” What I witnessed was zero interoperability unless the DICOM software was communicating with itself. Whenever I pushed for answers, I was quickly informed by now-retired colleagues (Moe Auger, Chris Bull and Dr. Michael McQuade): “Didn’t you know? RSNA stands for Real Software Not Available.” To think I gave up the Lions and the Packers game for this.

Jeremy McNeive

Live from RSNA 2016

Healthcare

Welcome to the Lexmark Healthcare RSNA16 live blog. Visit us here often during RSNA in Chicago (Nov 27 – Dec 1) for updates and news from the world's largest Radiology trade show. 

More to come later today!!

Wednesday, 10:00 am

What's the future of RSNA going to look like? One expert gives his predictions in Part 3 of RSNA past, present and future. Start by reading his insights into the past and present of the industry, then imagine what we'll all be talking about in 2020. 

Tuesday, November 29, 3:00 pm

You might think you have unstructured healthcare content under control, but chances are you don’t. Most content management deployments simply create new silos. Healthcare expert Phil Wasson has thoughts on how a true VNA can help.

Tuesday, November 29, 1:00 pm

Lexmark Healthcare is in the media spotlight at RSNA 2016! Check out these behind-the-scenes shots from interviews with Radiology Today and ITN (Imaging Technology News): 

Monday, 3:45 pm

Chris Carr, Director of Informatics at RSNA, talks about the RSNA Image Share Validation project, in which Lexmark Healthcare was one of seven vendors to earn the initial RSNA Image Share Validation seal:

Monday, 2:00 pm

Lexmark is proud to be one of seven vendors honored by RSNA and The Sequoia Project for successfully completing the RSNA Image Share Validation program, which "rigorously tests the compliance of vendors’ systems to accurately and efficiently exchange medical images."

"Vendors who earn the RSNA Image Share Validation seal are demonstrating commitment to improve access to imaging records and enabling better-informed decisions about patient care, while also improving patient safety by eliminating redundant radiology procedures, reducing operational costs and relieving the burden of responsibility from the patient," says The Sequoia Project.

We couldn't be more pleased to have recognition for the commitment to imaging access that our customers have enjoyed for years. Thanks, RSNA and The Sequoia Project!

Monday, 10:00 am

Larry Sitka's next stop on the journey into the past of RSNA begins in the year 2002 and covers the journey from department to enterprise. 

If you didn't get the chance to read part one of his series, start there!

Sunday, November 27, 6:00 pm

Lexmark joins more than 650 exhibiting companies at RSNA to welcome 51,000+ radiology professionals from around the world. Visit us at booth 3300 and be greeted by these smiling faces at the Lexmark welcome counter: 

Sunday, November 27, 2:00 pm

Droves of eager attendees are ready to go "Beyond Imaging" at this year's annual meeting and exhibition of the Radiological Society of North America (RSNA): 

Sunday, November 27, 10:00 am

Join Larry Sitka, founder of Acuo Technologies and Principal Solution Architect at Lexmark Healthcare, as he takes a journey through the past, present and future of RSNA. Today, he's looking all the way back to 1990, the year he considers the onset of standardization ... kind of. 

Grant Johnson

In the age of the customer, experience is paramount. While nearly every company in business today understands the importance of customer experience, many are still struggling to digitally transform their business to effectively win, retain and grow customers. I’ve been passionate about customers since I began my career in marketing, and customer experience has been a topic of discussion for several years, so when I read a recent Brian Solis post on customer experience (CX), it resonated and got me thinking about what has changed in how companies approach CX since we entered this decade.

Solis observes that “Customer experience (CX) is a difficult process because so many stakeholders interpret CX differently and then prioritize investments and resources accordingly.” This is consistent with a point I made in a blog more than five years ago. In Who Owns the Customer, I noted that a lack of true customer ownership happens because most companies are organized along functional lines (i.e., sales, marketing, services, IT, support, finance, etc.), and there is no cross-functional ownership approach that enables the organization to holistically manage customer relationships.

What complicates the goal of customer ownership further is that each functional area often has disparate systems and databases that gather and store customer information. Moreover, these functions do not consolidate information in a meaningful way, nor do they have the proper analytics or data integration technology to leverage key insights from disparate data sources. This fragmented approach also undermines taking advantage of modern digital platforms that gather customer feedback, from websites to social media and online communities.

Achieving a unified view of the customer is a daunting task. While at Forrester, Paul Hagen wrote, “most often, companies shopping for CRM systems are accosted with solutions that promise a 360-degree view of customers…the reality is that the customer experience is far broader than that…and so is the ecosystem of technologies required to support them.” For the few companies (whether B2C or B2B) that have been able to bring all relevant customer information together, there is still an organizational and/or structural gap that inhibits customer centricity, because in most companies there is no function truly empowered or accountable for the customer.

Companies have chosen various organizational approaches to attempt to solve this operational dysfunction. Some have appointed a Chief Customer Experience Officer or a Chief Customer Officer. These executives typically have either an organizational responsibility or an advisory role. In the former case, a “customer owner” with staffing and budgetary resources is much better equipped to drive the organizational changes required to meaningfully optimize the customer experience. Without direct control over people, budgets and systems, the advisory executive’s impact on customer experience and success is inhibited by the gravitational pull of functional imperatives – doing solely what’s right for sales, IT, finance, marketing or service – and by adherence to an “inside-out” company goal directive rather than an “outside-in” customer-centric approach.

As Solis states, “each group inadvertently contributes to a disconnected approach to CX because they’re attempting to solve one part of the customer’s journey and experience from their silo. Yet, customers don’t see departments, they see one brand.”  He defines CX this way: “it’s the sum of all engagements a customer has with your brand in every touchpoint, in each moment of truth, throughout the customer lifecycle.” I think he has it exactly right.  When I co-authored the book PowerBranding, I wrote: “every contact matters – in some way, it either enhances or diminishes your brand,” so you need to manage every customer contact, and yet very few companies actually get this concept, let alone establish operational processes to ensure customer experience is optimized everywhere.

What’s a company to do when there is no designated companywide customer owner or centralized customer experience function?  It is still possible to begin the journey toward customer centricity by aligning objectives across several functions, such as sales, marketing, and services. Someone needs to at least own the cross-functional responsibility to ensure that goals and benchmarks are established and regular measurements (such as NPS, renewal rates, satisfaction levels and other continuous tracking) are taken to record progress against the stated objectives. Since marketing owns key elements of the customer experience (e.g., communications, web, social media, customer engagement and loyalty programs) at many companies, it is a likely group to lead the effort to align functional groups and drive initiatives to integrate customer experience management. Customer ownership is a journey that has to start somewhere, and it often starts with one person raising their hand to tackle the inherent structural impediments of functional organizations vs. customer centric ones.
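
For teams starting with measurement, here is a minimal sketch of one of the metrics mentioned above, Net Promoter Score, computed the standard way (promoters score 9–10, detractors 0–6); the survey responses are invented.

  # Minimal sketch: Net Promoter Score from 0-10 survey responses.
  # NPS = % promoters (9-10) minus % detractors (0-6). Data is invented.
  def nps(scores: list[int]) -> float:
      promoters = sum(1 for s in scores if s >= 9)
      detractors = sum(1 for s in scores if s <= 6)
      return 100.0 * (promoters - detractors) / len(scores)

  responses = [10, 9, 8, 7, 9, 4, 10, 6, 9, 3]
  print(f"NPS: {nps(responses):+.0f}")  # 5 promoters, 3 detractors -> +20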

Who’s game to take this on?

Phil Wasson

Silo busting with enterprise imaging

Healthcare

A new infographic contains this startling statistic: healthcare data will reach 15 zettabytes by 2020, equivalent to four times the amount of information contained on the Internet. That’s a lot of Facebook posts about what’s for dinner. With the advent of new technologies like genomics and 3D imaging, plus the aging population in the U.S., healthcare data is going to continue to grow exponentially.

Managing all this healthcare data is especially challenging because not only does a majority of it live outside the electronic health record (EHR), but it also can’t be well managed by your traditional PACS. A 2013 IHS VNA study states that by 2017, 75% of healthcare data will be in the form of non-DICOM medical imaging assets. Think about that for a minute: 75% of your patients’ information won’t be stored in any of the systems you use most today. By non-DICOM images, they mean the photos stored on a clinician’s phone, the video clips created by specialty departments such as ophthalmology and the images created by isolated modality stations.

These non-DICOM medical images are currently trapped in isolated departmental silos. Because these images are often controlled by specialty applications, point-of-care clinicians have little to no access to them as part of their patient evaluation processes. Managing these non-DICOM images in your PACS is also not a good option. To do so, you would have to convert all non-DICOM images to DICOM, or pay your PACS vendor a hefty fee for ingesting non-DICOM images into the PACS. Plus, if you managed to get the images in there, they would add significant image volume to the inevitable data migration when you replace your PACS.
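
For a sense of what “converting non-DICOM images to DICOM” involves, here is a hedged sketch that wraps a clinician’s JPEG photo in a DICOM Secondary Capture object using the open-source pydicom library. The file names and patient demographics are placeholders, and a production conversion would need far more metadata governance than this.

  # Hypothetical sketch: wrap a JPEG photo as a DICOM Secondary Capture
  # object so image-management systems can ingest it. Requires the
  # open-source pydicom, numpy and Pillow packages; all identifiers
  # below are placeholders.
  import datetime
  import numpy as np
  from PIL import Image
  from pydicom.dataset import FileDataset, FileMetaDataset
  from pydicom.uid import ExplicitVRLittleEndian, generate_uid

  SC_STORAGE = "1.2.840.10008.5.1.4.1.1.7"  # Secondary Capture Image Storage SOP Class

  pixels = np.asarray(Image.open("wound_photo.jpg").convert("RGB"))

  meta = FileMetaDataset()
  meta.MediaStorageSOPClassUID = SC_STORAGE
  meta.MediaStorageSOPInstanceUID = generate_uid()
  meta.TransferSyntaxUID = ExplicitVRLittleEndian

  ds = FileDataset("wound_photo.dcm", {}, file_meta=meta, preamble=b"\x00" * 128)
  ds.is_little_endian, ds.is_implicit_VR = True, False
  ds.SOPClassUID = SC_STORAGE
  ds.SOPInstanceUID = meta.MediaStorageSOPInstanceUID
  ds.PatientName, ds.PatientID = "DOE^JANE", "EX-123456"   # placeholder demographics
  ds.StudyInstanceUID, ds.SeriesInstanceUID = generate_uid(), generate_uid()
  ds.Modality = "OT"  # "other" -- suitable for visible-light photos
  ds.StudyDate = datetime.date.today().strftime("%Y%m%d")

  # Describe the uncompressed RGB pixel layout so viewers can render it.
  ds.SamplesPerPixel, ds.PhotometricInterpretation = 3, "RGB"
  ds.PlanarConfiguration = 0
  ds.Rows, ds.Columns = pixels.shape[:2]
  ds.BitsAllocated = ds.BitsStored = 8
  ds.HighBit, ds.PixelRepresentation = 7, 0
  ds.PixelData = pixels.tobytes()

  ds.save_as("wound_photo.dcm")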

Despite these challenges, bringing all this diverse patient information together is vital to a healthcare organization’s ability to grow and evolve. A recent survey by CHIME found that 54% of organizations cannot exchange medical images with recipients outside their four walls. This is a problem. With the emergence of initiatives like Meaningful Use, the Affordable Care Act (ACA), value-based reimbursement and Accountable Care Organizations (ACOs), healthcare organizations must put a premium on interoperability.

If you don’t know an image exists or don’t know where it is stored, it’s going to be difficult to share it with a referring physician, another specialist for collaboration or a facility involved with that patient’s continuum of care. 

In addition, not having access to an imaging study will drive up costs, especially as value-based reimbursement models take hold and imaging departments move from being revenue centers to cost centers.

Your organization must be able to connect discrete patient information from the EHR, DICOM images from the PACS and non-DICOM images from specialty departments and deliver it all in a single view. Increasingly, organizations are turning towards an enterprise imaging strategy to bust up these image silos and reach the lofty goal of a comprehensive patient picture.

Enterprise imaging enables more informed clinical decision-making and drives down costs by allowing healthcare organizations to capture, manage and view medical images, both DICOM and non-DICOM, at the clinical point of care and within the radiology and cardiology departments. 

By consolidating imaging information throughout the enterprise into a single, standards-based repository that communicates seamlessly with all the IT systems involved, like the EHR and PACS, you can enhance patient outcomes, lower costs and ensure data security.

Early adopters of an enterprise imaging strategy are already seeing benefits:

  • Imaging Associates physicians now have access to full patient reports in just 2 hours
  • Florida Hospital radiologists are enjoying a 20% to 25% time-savings on their ultrasound reads
  • Piedmont Healthcare achieved an estimated savings of $2-$3M because they had the freedom to move away from their PACS vendor

Start preparing now for the tsunami of patient information coming your way, especially in the form of non-DICOM images. Get started by educating yourself on enterprise imaging and how early adopters are benefiting from this strategy.

Sarah Bajek

Visualize a new level of outperformance in finance

Accounting and finance

Imagine driving change that moves your business forward without disrupting critical everyday financial operations. It’s both possible and necessary in the changing landscape of business. The traditional CFO role is evolving from cost control and accounting to a focus on financial transformation and digitization. In an effort to get more out of spreadsheet data and make finance an analytic dynamo for the organization, CFOs are tracking performance (particularly in accounts payable) and addressing process efficiency from a holistic view.

The right measures

AP’s performance is under constant scrutiny. But are CFOs using the right metrics? According to a recent Ardent Partners report, AP departments are most often measured on the volume of their activity and payment metrics. In fact, 75% use metrics like number of invoices processed or number of payments made, and 63% put more weight on payment metrics focused on accuracy. While these measurements offer a good idea of the department’s efficiency, they fall short on gauging the true value (or potential value) of AP to the organization.

According to Ardent Partners, only 36% of enterprises evaluate AP around financial metrics such as rebates and early payment discounts, and even fewer (31%) evaluate the department on process, regulatory or financial compliance. Opening up the evaluation of AP to more than just the traditional tactical metrics offers opportunity for greater insight into AP’s impact across the organization.
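
To make one of those financial metrics concrete, here is a minimal sketch of early-payment discount capture, assuming standard “2/10 net 30” terms (a 2% discount if paid within 10 days); the invoice amounts are invented.

  # Hypothetical sketch: measure early-payment discount capture, one of the
  # financial metrics described above. Terms and figures are invented.
  invoices = [
      # (amount, "2/10 net 30" discount offered?, paid within 10 days?)
      (12_000.00, True, True),
      (8_500.00, True, False),
      (4_200.00, False, False),
  ]

  captured = sum(0.02 * amt for amt, offered, fast in invoices if offered and fast)
  missed = sum(0.02 * amt for amt, offered, fast in invoices if offered and not fast)

  print(f"discounts captured: ${captured:,.2f}")  # $240.00
  print(f"discounts missed:   ${missed:,.2f}")    # $170.00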

Break silos to outperform

Top performing companies are unlocking the value of AP data and extending those insights across the purchase-to-pay process chain to procurement.  In fact, these organizations are 77% more likely to leverage a complete automated procure-to-pay solution, according to Ardent Partners.

Through automated solutions, these performance champions are breaking down the barriers between purchasing and AP, making everyday operations more efficient, and turning financial data into actionable insights to outperform their peers.

Ardent Partners found that, compared with their peers, best-in-class enterprises are 89% more likely to view AP as exceptionally or very valuable from a strategic perspective. They are also more likely to leverage intelligence from AP in the following areas:

  • 1.2-times more likely to use it for forecasting, budgeting and planning
  • 2.3-times more likely to understand the impact of cash
  • 2.7-times more likely to develop better payment strategies

Are you tracking similar success metrics across your P2P cycle? Start today: read our white paper for more secrets of performance champions.

Sarah Bajek

The big impact of AP data

Accounting and finance

Gone are the days when accounts payable (AP) was seen as a necessary evil. For many organizations, this formerly marginalized back-office function is rising to the top as a strategic part of the business, thanks to the valuable intelligence gleaned from AP’s data. Intelligence from this data can support major tasks and critical decision-making, and top-performing organizations are leveraging it to evaluate and enhance performance at multiple levels throughout the organization.

According to a recent Ardent Partners report, 81% of finance teams identified their AP data as “critical” or “important” to their financial operations.

But, where is AP data making the biggest impact?

(This is where procurement should start paying attention.) 

1. Freeing up time for strategic activities

By improving the handling of invoice exceptions, AP staff are freed to focus on larger financial transformation initiatives. This also leads to improved visibility into financial data and even insight into day-to-day activities, adding further value across finance.

2. Enhancing collaboration with procurement

According to Ardent Partners, more than half of finance teams are using AP data to strengthen collaboration between AP and procurement, and 60% of respondents said procurement uses the enormous amount of AP data gathered to support supplier management work. Procurement can also harness invoice data to monitor contract compliance and support supplier rationalization undertakings (see the sketch below).
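
As a hedged example of the invoice-to-contract compliance check just described, the pandas sketch below flags invoice lines whose unit price exceeds the contracted rate; the column names and figures are invented for illustration.

  # Hypothetical sketch: flag invoice lines priced above contract rates.
  # Column names and figures are invented for illustration.
  import pandas as pd

  contracts = pd.DataFrame({
      "supplier":       ["Acme", "Acme", "Globex"],
      "item":           ["toner", "paper", "toner"],
      "contract_price": [58.00, 31.50, 61.00],
  })
  invoices = pd.DataFrame({
      "invoice_id": [9001, 9002, 9003],
      "supplier":   ["Acme", "Acme", "Globex"],
      "item":       ["toner", "paper", "toner"],
      "unit_price": [58.00, 34.00, 61.00],
  })

  merged = invoices.merge(contracts, on=["supplier", "item"], how="left")
  merged["overcharge"] = merged["unit_price"] - merged["contract_price"]
  off_contract = merged[merged["overcharge"] > 0]

  # Invoice 9002 bills paper at $34.00 against a $31.50 contract rate.
  print(off_contract[["invoice_id", "supplier", "item", "overcharge"]])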

3. Planning ahead

Forecasting, budgeting and planning is another area where AP data can add further value for an organization. And still more opportunities exist for this intelligence to be leveraged in planning payment strategies and understanding the impact of cash on an organization.

It’s time to align

It’s proven that AP data can fuel strategic decision-making – but only when shared across other areas of finance, especially procurement, as actionable intelligence. Ardent Partners notes that a little less than half of the finance teams surveyed hope to increase collaboration between AP and procurement over the next year. These pioneers recognize that linking these two parts of the purchase-to-pay process and providing data visibility across functional lines yields greater efficiency. With the help of purchase-to-pay automation technology, these organizations gain deeper insight into spend and supplier performance, and the benefits extend outside the organization to suppliers through a single linked process.

Are you leveraging your AP data to the fullest? Read the full Ardent Partners report, ePayables 2016: Eyes on the Prize, to learn more about positioning your organization for top performance.