
Paragon Consulting Partners

Healthcare IT Solutions and Results



April 13, 2019 By Laurie Lafleur

Much of the excitement in healthcare today revolves around the untapped potential of population-level ‘big data’, which can be leveraged to inform diagnostic and treatment best practices, enable earlier intervention or proactive mitigation of disease, and support the development of innovative new medical procedures.

A luminary teaching and research organization, the National Institutes of Health (NIH) is an excellent example, conducting numerous research initiatives in areas such as disease prevention and treatment, oncology, precision medicine, and neurotechnology, to name a few. Data normalization plays a significant role in supporting these initiatives by creating structured, consistent datasets that can be leveraged by data mining technologies, enabling the processing and analysis of the significant volumes of information required to perform this type of population-level research – a task that would be insurmountable if not automated.

Cleaning and Screening

At the NIH, data analysis begins with the screening and selection of research patients who meet the specific protocol requirements of various ongoing clinical studies. This involves the collection and evaluation of patients’ complete medical records – including family and medical history, EMR data, imaging records, and much more – to identify relevant genetic characteristics, demographic factors, disease profiles, and health statuses.

The NIH screens thousands of patient candidates, and as such has sophisticated methods to collate, standardize, and analyze the aforementioned data. First, patient identifiers are normalized upon ingestion and a temporary ‘pre-admit’ MRN is assigned to ensure consistency across diverse data sets and facilitate longitudinal analysis of the complete patient record by NIH systems and researchers throughout the screening process. This also keeps candidate data separate from approved, active research datasets until the patient has been officially selected – at which time the MRN is normalized once again to an active ‘admitted’ profile.
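
Conceptually, this identifier normalization can be modeled as a thin mapping layer over incoming source identifiers. The Python sketch below is purely illustrative – the MRN formats, registry structure, and function names are our own assumptions, not NIH conventions – but it captures the pre-admit-to-admitted flow described above:

```python
# Illustrative only: a toy model of 'pre-admit' -> 'admitted' MRN normalization.
# Identifier formats and data structures are assumptions, not NIH conventions.
import uuid

def assign_preadmit_mrn(registry: dict, source_ids: list) -> str:
    """Issue a temporary pre-admit MRN and map every source-system ID to it,
    enabling longitudinal analysis across diverse datasets during screening."""
    preadmit_mrn = f"PRE-{uuid.uuid4().hex[:8].upper()}"
    for sid in source_ids:
        registry[sid] = preadmit_mrn
    return preadmit_mrn

def promote_to_admitted(registry: dict, preadmit_mrn: str, admitted_mrn: str) -> None:
    """On official selection, re-normalize all linked records to the active MRN,
    moving the candidate out of the 'pre-admit' pool."""
    for sid, mrn in registry.items():
        if mrn == preadmit_mrn:
            registry[sid] = admitted_mrn

registry = {}
tmp = assign_preadmit_mrn(registry, ["HOSP_A|12345", "HOSP_B|9-88-77"])
promote_to_admitted(registry, tmp, "ADM-0042137")
print(registry)  # both source IDs now resolve to the admitted MRN
```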

The Role of Diagnostic Imaging

Imaging data is a key component of the research performed at the NIH. As such, researchers collect patients’ previous imaging from various outside facilities. Key imaging attributes on imported studies, such as study and series descriptors, are normalized according to NIH policies to enable automated data analysis and simplify the screening process for researchers by ensuring exams hang reliably within their diagnostic viewers for quick and efficient review.
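
As a concrete illustration, descriptor normalization of this kind can be scripted with the open-source pydicom library. The mapping table and canonical values below are invented for the example – real normalization dictionaries are site-specific and far larger:

```python
# A minimal sketch of series-descriptor normalization, assuming pydicom is
# installed. The mapping table is invented; real policies are site-specific.
from pydicom import dcmread

# Map known vendor/site variants onto one canonical descriptor so hanging
# protocols and automated analysis always see consistent values.
SERIES_DESC_MAP = {
    "AX T1 POST GAD": "T1 AXIAL POST-CONTRAST",
    "t1_ax_+c": "T1 AXIAL POST-CONTRAST",
}

def normalize_series_description(in_path: str, out_path: str) -> None:
    ds = dcmread(in_path)
    raw = str(ds.get("SeriesDescription", "")).strip()
    # Fall back to a simple uppercasing policy for unmapped values.
    ds.SeriesDescription = SERIES_DESC_MAP.get(raw, raw.upper())
    ds.save_as(out_path)
```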

As well, the NIH often performs additional advanced imaging exams throughout clinical research projects. To ensure this newly generated data accurately correlates with the patient’s prior imaging history, can be easily inspected by advanced data analysis technologies, and supports efficient review workflows for researchers, the NIH also enforces normalization policies at the modality level.

Because the NIH works with a large number of trainees and fellows, the aforementioned normalization of diagnostic imaging exams provides the added benefit of creating a consistent educational environment for teachers and students, supporting structured and replicable teaching workflows.

Leveraging and Developing Artificial Intelligence and Machine Learning

Artificial Intelligence (AI) and machine learning (ML) may be new entrants into mainstream clinical settings, but at the NIH they have been in use for many years already. They represent foundational technologies that play a significant role in enabling the automatic identification and comparison of key data elements across hundreds or thousands of relevant research studies – a task that would not be possible if done manually.

However, there are some data elements that cannot yet be fully analyzed by these technologies. Where bias or variability between reported values or criteria can exist, manual intervention is still required to make appropriate adjustments based on an interpretation of the context within which the values were taken. This is the tricky thing about data normalization – it is often context sensitive, and the level of reasoning required to consider the ultimate research question or goal when evaluating data relevance and normalization needs cannot yet be reliably accomplished by machines.

For this reason, the NIH continues to support the development and refinement of AI and ML algorithms, leveraging its enormous collection of clean, consistent data to build diverse training environments and collaborating with luminary AI and ML technology organizations to further the development of advanced, context-aware data analysis and clinical use cases.

Anonymization/De-identification

Another critical aspect of the research performed at the NIH is the anonymization and/or de-identification of personally identifiable information (PII). To adhere to patient privacy and human subjects research protection regulations, the NIH at times wishes to, or is required to, de-identify research data by removing or obfuscating identifiable data that could otherwise reveal a patient’s identity.

This might be done to allow NIH researchers to conduct secondary analyses without additional human subjects protection approvals (per 45 C.F.R. 46), or to share data with other research collaborators. The NIH accomplishes this through standard de-identification or coding techniques and normalization policies, with the goal of scrubbing data to remove identifiers.

However, the NIH also employs a specialized technique, known as an ‘Honest Broker’ process, that ensures a link persists between the patient’s anonymized/de-identified data and their identifiable data. This ensures researchers can follow up with patients for care-related issues, and can continue to follow outside patients for additional or future research purposes with appropriate safeguards and approvals.
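
In code, an honest-broker linkage might look something like the sketch below. The keyed-hash scheme, field names, and broker index are our own illustrative assumptions, not the NIH’s actual implementation; the point is that the research copy carries no direct identifiers, while a separately held index preserves the path back to the patient:

```python
# Hypothetical sketch of de-identification with an 'honest broker' link.
# The HMAC scheme and field names are illustrative assumptions.
import hashlib
import hmac

BROKER_KEY = b"held-only-by-the-honest-broker"  # never shared with researchers

def deidentify(record: dict, broker_index: dict) -> dict:
    """Strip direct identifiers and attach a coded research ID; the broker
    index (held under separate access controls) allows re-identification."""
    research_id = hmac.new(BROKER_KEY, record["mrn"].encode(),
                           hashlib.sha256).hexdigest()[:12]
    broker_index[research_id] = record["mrn"]
    scrubbed = {k: v for k, v in record.items()
                if k not in {"mrn", "name", "dob", "address"}}
    scrubbed["research_id"] = research_id
    return scrubbed

index = {}
print(deidentify({"mrn": "ADM-0042137", "name": "Jane Doe",
                  "dob": "1970-01-01", "address": "redacted",
                  "diagnosis": "C71.9"}, index))
```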

The Real Value of Data Normalization

From tracking tumor response to new treatment programs to developing statistical models for a population’s health characteristics and risk profiles, data curation on the scale achieved by the NIH not only requires the collection of vast amounts of scattered information for analysis, but also mechanisms to transform and normalize incoming data to create well-defined and comparable datasets. For the NIH, data normalization enables the extraction of valuable and predictive clinical insights that can be used to improve both clinical outcomes and population-wide health. 

If you enjoyed this post, subscribe to our blog to be notified when new articles are published.


Filed Under: Analytics, Data Management, Healthcare IT, Imaging, Uncategorized Tagged With: Clinical Research, data normalization, Enterprise Imaging, health data, Health IT, healthcare data, healthcare IT, HealthIT, medical imaging data, radiology data

February 4, 2019 By Laurie Lafleur

Data normalization is an essential activity whenever two or more previously independent facilities are migrated into a single, consolidated data repository, as it preserves the integrity of incoming data by avoiding broken links between patient records and problematic collisions between device and study identifiers.

Jude Mosley, Manager of Information Technology, Imaging at Sentara Healthcare knows this better than anyone. She and her team face this very challenge regularly whenever a new hospital or imaging center is integrated into the 5-hospital network in Norfolk, Virginia. 

Hey, that’s my jacket!

The first challenge is ensuring patient jackets from differing systems are reconciled correctly so that no information is lost or erroneously attached to the wrong patient. Using a master patient index (MPI), Sentara is able to match patients between the source and destination systems and update all incoming studies with the correct patient identifier upon ingestion. This results in a single, consistent set of MRNs across the Sentara network, and eliminates the future risk that an old MRN could be mistakenly used.

The next challenge is ensuring study accession numbers remain unique, and that the migration process doesn’t introduce duplicate accession numbers for different patients. To accomplish this, Sentara employed normalization policies that, for all incoming studies, prefixed the accession number with – and updated the location attribute to – a 3-character code representing the source facility. Not only did this avoid potential collisions between accession numbers, it also added a level of transparency to the migrated data that allowed Sentara to quickly and easily identify its original source.
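
Conceptually, both policies reduce to a small transformation applied to each study on ingestion: look up the destination MRN in the MPI, then stamp the accession number and location attribute with the source-facility code. The sketch below is a simplified illustration; the facility names, codes, and field names are assumed for the example:

```python
# Illustrative sketch of migration-time normalization: MPI-based MRN
# reconciliation plus accession-number prefixing. Facility names, codes,
# and field names are assumptions made for the example.
FACILITY_CODE = {"Facility A": "FCA", "Facility B": "FCB"}

def normalize_on_ingest(study: dict, mpi: dict) -> dict:
    site = FACILITY_CODE[study["source_facility"]]
    return {
        **study,
        # The MPI maps (source facility, source MRN) -> destination MRN.
        "mrn": mpi[(study["source_facility"], study["mrn"])],
        # Prefixing avoids accession collisions and reveals provenance.
        "accession": f"{site}{study['accession']}",
        "location": site,
    }

mpi = {("Facility A", "12345"): "77001"}
print(normalize_on_ingest(
    {"source_facility": "Facility A", "mrn": "12345", "accession": "ACC9001"},
    mpi))
```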

The Big Bang theory of modalities

While the migration process provides an excellent opportunity to normalize data on the fly, it is often necessary to enforce data normalization policies at the modalities themselves. At Sentara, normalization at the modality level is a well-defined process. With each new modality introduction, update, or site acquisition, Sentara ensures that every affected modality adheres to a consistent set of pre-defined policies regarding AE title and station naming conventions. When new modalities are installed or vendor updates are applied, Sentara requires the vendors to be aware of and sign off on the required data policies, ensuring they are properly applied and thoroughly tested once installation is complete.
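
A convention like this is straightforward to enforce with a small validation script run after each installation or update. The SITE_MODALITY_ROOM pattern below is hypothetical; the 16-character cap, however, is the real DICOM limit on AE titles:

```python
# Hypothetical AE title checker. The naming pattern is invented for
# illustration; the 16-character cap is DICOM's actual AE title limit.
import re

AE_PATTERN = re.compile(r"^[A-Z]{3}_[A-Z]{2}_\d{2}$")  # e.g. FCA_CT_03

def validate_ae_title(ae_title: str) -> list:
    problems = []
    if len(ae_title) > 16:
        problems.append("exceeds DICOM's 16-character AE title limit")
    if not AE_PATTERN.match(ae_title):
        problems.append("does not match the SITE_MODALITY_ROOM convention")
    return problems

for ae in ["FCA_CT_03", "oldscanner-basement"]:
    print(ae, "->", validate_ae_title(ae) or "OK")
```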

For larger-scale activities, like a new site acquisition and migration, the PACS team prepares a comprehensive list of all modalities and their new AE titles, station names, and IP addresses, and orchestrates a big-bang update process. While this is no small undertaking, through experience Sentara has refined this process to run like a well-oiled machine. Once complete, Sentara has improved the consistency and supportability of their now-larger infrastructure, and once again ensured that data arrives in a predictable and consistent manner. 

A shining example of normalcy

This case provides excellent examples of how data normalization can address the integration challenges faced by expanding health systems. Not only does it mitigate risk by avoiding data loss and collisions during the migration process, it also measurably improves data quality and reduces support costs in the future by improving the consistency and predictability of systems and data across the entire network.

Our next blog will take a more clinical focus, looking at how data normalization can be leveraged to uncover deep insights across population-wide data sources. Sound interesting? Subscribe to our blog to be notified when the next post becomes available!


Filed Under: Data Management Tagged With: data migration, data normalization, Enterprise Imaging, health data, Health IT, healthcare data, healthcare IT, HealthIT

August 16, 2018 By Laurie Lafleur

A few years ago we signed up for what promised to be an exciting adventure: Extreme Fitness Vacation in the Grand Canyon: Hike from the Rim to the River and Back in One Day! Sounds fun, right? If you think so, or if you’re just curious what this has to do with project management, read on!

This excursion was led by a gym owner, manager, personal trainer, and beefcake in his own right. The trip was seemingly well planned; our routes were outlined and our host promised to take care of all the incidentals – including a relaxing lunch at the bottom of the canyon before beginning the arduous hike back up. What could possibly go wrong? Unfortunately, lots.

To start, it turns out this particular hike (or at least the upward portion of the journey) is actually noted as one of the 10 most dangerous hikes in North America – not for the faint of heart, and certainly not for the faint-of-fitness. This little tidbit was not known to our excursion leader, who was himself something of a hiking newb, and thus there were no clear fitness or hiking-experience guidelines or requirements laid out for those who were interested in signing up. All that was required, as it turned out, was the ability to pay the less-than-modest fee. This resulted in a rag-tag group of pleasant, yet ill-equipped and ill-informed, individuals joining the excursion.

The temperature in the Canyon at that time of year (June) often exceeds 110F (or 43C for my Canadian friends), which means a few things: first, you’ll need lots of water and electrolytes (critical to muscle function) because you’re going to sweat. A lot. Second, you won’t feel hungry during the hike, but trust me, your body will be. This means you need to bring lots of light snacks to eat along the way – even when the last thing you feel like doing is eating – to ensure your energy levels are maintained to support your level of physical exertion.

To make a long story short, because of his lack of experience and knowledge of the physical and environmental demands of a Grand Canyon hike, our host made a few critical errors:

  1. He did not bring adequate or appropriate snacks for the number of people in his group (seriously, ‘Arnold’s extreme protein muscle fuel bars’ are really the last thing you want to be trying to digest under these conditions)
  2. He assumed there was water along the trail – which would have been great because it meant we wouldn’t have to carry it ourselves! However, water was only available halfway up the return trail, so for 75% of the hike there were no viable refill options for empty bottles, and thus many among us ran out of water…quickly.
  3. He failed to provide electrolytes, or educate participants on their importance – they are lost with sweat and must be replenished to ensure your leg muscles function well and long enough to haul your butt out of the canyon (as many-a-sign said, down is optional – up is mandatory)
  4. Lunch was promised at the bottom of the canyon, but he didn’t pack anything, as he expected he could purchase sandwiches at a riverside cafe. The problem? There was no such cafe, and nowhere to purchase sandwiches (unless you want to travel another mile or so off-route to the famed Phantom Ranch, and you’ve booked ahead…)

The result? Heat exhaustion, severe muscle cramping, and even delirium were experienced by some. I’m happy to report we all made it back out without serious injuries – no thanks to our leader, who himself could barely walk due to leg cramps near the end.

So what do hiking the Grand Canyon and healthcare IT project management have in common? Simple. In this example our fearless (to a fault) leader acted as the project manager. He defined the scope of the excursion, the paths we would take, and the schedule we would adhere to. He was an expert in health and fitness, so the participants trusted his judgement and his plan – and thus followed him on what turned out to be a dangerous and misguided adventure. What he wasn’t, however, was an expert on the specific challenges presented by a Grand Canyon hike, and he was therefore ill-equipped to identify and navigate the unique terrain and conditions the group encountered.

Proving that even the best-laid plans go awry, this (albeit extreme) story provides the perfect parallel for healthcare IT project management. Certified Project Management Professionals (PMPs) are intimately aware of the best practices and methodologies required to steer general project initiatives and are an important part of any project team. However, unless they also have the subject matter expertise that comes from hands-on experience in clinical and technical healthcare settings, they are not equipped with the knowledge or tools needed to craft the most efficient and effective plan or steer the project flawlessly to success.

In addition to knowing how to run a project, a PMP who is also a true subject matter expert brings immeasurable added value: they understand the needs of clinical stakeholders, how data flows and systems interact, and what pitfalls may lie ahead – allowing them to take proactive steps to help ensure their healthcare IT projects run smoothly and are delivered on time and within budget.

For more information on our health IT project management methodology download the Six Steps For Effective Health IT Project Management Infographic.

If you enjoyed this post, subscribe to our blog to be notified when we post new articles.


Filed Under: Imaging, Project Management Tagged With: Enterprise Imaging, Health IT Implementation, healthcare IT, HealthIT, imaging informatics, Program Management, Project Management

April 24, 2018 By Laurie Lafleur

‘Enterprise imaging’ is one of the most popular topics within healthcare IT circles today, and it’s no surprise why: within these two simple words lies a lot of potential. Enterprise imaging promises to tear down the silos that have traditionally existed between disparate healthcare providers, departments, and facilities. It promises to create a truly unified patient record that is centrally managed and accessible by care providers and patients alike, in real time. And it promises to do all this while reducing costs and improving operational efficiency.

Such a state would finally allow patient records to be truly portable, enabling patients to assume ownership of their medical records and empowering them with the flexibility and freedom of choice to navigate the healthcare system with greater ease. As well, care providers would be equipped with critical insights into their patients’ medical conditions and history directly at the point-of-care, allowing them to make timely and informed decisions that ultimately lead to better clinical outcomes and preventative care initiatives.

So why then is Enterprise Imaging still a topic of discussion, and not yet a widely adopted reality?

While the vision is straightforward, the path there is considerably less so. Every healthcare organization is different, every vendor is different, and it seems that most of the time there are more questions than answers. Should we go with a single vendor that offers a consolidated platform, or pursue a ‘best-of-breed’ strategy that allows us to pick and choose our favourite solutions? Should we update or replace our systems all at once, or take a more progressive approach that leverages our existing infrastructure? Should we host solutions ourselves, or consider a cloud-based, vendor-managed alternative?

Successfully deploying a centrally-managed enterprise platform that intelligently aggregates diverse imaging data from across the care continuum has broad implications, and it requires a carefully orchestrated approach to solution design that includes a thoughtful analysis of your current situation in line with your unique near- and long-term goals.

Our Six Tenets of Enterprise Imaging provides a roadmap that can methodically guide you through the process of developing a complete Enterprise Imaging strategy that is tailored to your unique needs. While it may not provide all the answers up front, it breaks down the key components for success into manageable and actionable steps that consider your workflows, resources, and budget:

  1. Governance: ensures you have the people and processes in place to make informed and timely decisions
  2. Enterprise Platform: addresses the technologies required to build a scalable and performant foundation
  3. Workflow: enables efficient task orchestration and communication between providers and systems
  4. Visualization: delivers unencumbered access to images and tools to care providers across the care continuum
  5. Exchange: allows secure sharing of critical patient information with outside provider or payer networks
  6. Analytics: unlocks the clinical and business insights hidden across your enterprise

To see the big picture, download our infographic. To learn more about each tenet, subscribe to our blog, where we will discuss the key considerations and implications of each in turn.


Filed Under: Imaging Tagged With: Enterprise Imaging, Healthcare, healthcare IT, HealthIT, imaging informatics, PACS replacement, VNA

May 22, 2017 By Jef Williams

As we slide toward RSNA, there is growing interest in Enterprise Imaging. We saw a groundswell of activity related to sessions, roundtables, vendor narratives, and provider interest at SIIM just a few months back. We are clearly in a phase of growing momentum toward achieving better outcomes – cheaper, more efficiently, more carefully, and with a long view toward future success. Including imaging as part of the patient jacket has always been top of mind for those of us who engage primarily with imaging service lines, but it is now becoming important to those in leadership for several reasons. First, it completes the work begun with EMR adoption – adding all patient information to the patient jacket in a single platform or portal. Second, the sheer cost and complexity of imaging requires adopting newer technologies and innovations to achieve better business models. Third, policy is driving change in how we are, and will be, reimbursed; sharpening our data management models within imaging requires better solutions. And finally, patient-driven care is rapidly approaching the point where it will bend the curve on business strategy and volumes.

The success of this initiative will rest largely on the comprehensiveness of the organization’s self-awareness, the empowerment of a healthy governance structure, and the willingness to learn and adapt iteratively throughout the project lifecycle. Success is no longer built on the technology platform or vendor of choice, although this is certainly a factor. Consider that few, if any, organizations are still following their original imaging roadmaps. This is due to many different forces, including mergers, acquisitions, vendor changes, policy modifications (MACRA), market changes, and technology innovation. Carefully adapting to these shifts in an unstable environment means spending enough time on strategy, goals, outcomes, and philosophy. These foundations serve as guiding principles and indicators of the ongoing success of an enterprise imaging initiative.

Finally, it is often said that everyone is approaching EI differently. Yes, this is true. But there are many things health systems are doing similarly. We have standards, and we all deal with the challenges associated with proprietary formats, proprietary tags, immature IHE profiles, integration workarounds, and supplementing solutions with peripheral technologies and workflow. There is much we can learn from each other’s experiences. This exchange of ideas is of growing value, as demonstrated by the level of involvement in the HIMSS/SIIM Enterprise Imaging Workgroup, the growth in attendance at imaging shows, and the breadth of session topics. We would do well to avoid common mistakes. While we cannot “copy and paste” someone else’s specific strategy or roadmap into our own ecosystem, there are many lessons we can learn from each other. Moving our industry away from an “us versus them” mentality toward a collaborative system of shared experiences will not only assist with greater local success, but ultimately reduce the costs and risks associated with remediating bad implementations.


Filed Under: Imaging Tagged With: Healthcare, HealthIT, RSNA16, SIIM, womenshealth

