
Paragon Consulting Partners

Healthcare IT Solutions and Results


Data Management

November 7, 2019 By Laurie Lafleur 1 Comment

Why be limited by industry standards or 5G infrastructure? Share your images with anyone, anytime using sound-wave technology that transmits data at a frequency only dogs can hear.

Imagine what we could do if we could share massive volumes of data between hospital networks in a matter of milliseconds – regardless of their legal affiliation. Not only could we liberate patients’ complete medical and imaging records, making them truly portable from one healthcare facility to another, but we could also make newer services and technologies more feasible and reliable – from telemedicine visits to robot-performed telesurgery. 

Two big obstacles have kept this coveted vision from becoming reality – trust (i.e. security, privacy, and identity management) and infrastructure (i.e. server processing power and high-speed networks). Not long ago there was a great deal of hype around blockchain as an answer to the former, which largely fell flat because of the latter. However, the emergence of 5G networks is bringing renewed promise for enabling economies of scale in the healthcare industry that were previously unimaginable.

So what is 5G anyway, and what does it mean for healthcare?

The next generation of mobile communication technology, 5G uses ultra-high-frequency ‘air interfaces’ with far greater data-carrying capacity than traditional wireless networks, promising unprecedented transmission speed and bandwidth with near-zero latency.

Quite simply, this means that 5G will allow healthcare organizations to connect more devices (think IoT), send more data – both in size and volume (think growing ageing populations and hi-res CTs), and facilitate true real-time communication between care providers and patients (think robot-performed telesurgery). 

This dam-breaking technology has the potential to truly support the flood of healthcare data that is being generated and continues to grow at an exponential rate. 5G is already rolling out in some major cities, and it is estimated to become mainstream by 2022 – the only problem is, healthcare won’t be able to realize the full benefit of this technology until we address the other elephant in the room – trust. 

Enter the Blockchain…  

Fundamentally, Blockchain provides a secure method of indexing and sharing information across a network of unaffiliated systems. As part of a distributed and collaborative ledger, each system plays an integral role in recording and validating health transactions from across the network (the existence and location of patient images, for example). All systems in a Blockchain keep a comprehensive ledger of every transaction that has occurred across the network – such as an update to an electronic health record, or the addition of a new imaging exam. When information is requested from the network, consensus across participating systems is required to confirm the accuracy of the data being shared. This decentralized data stewardship model ensures that information is always synchronized across the chain and prevents any one person or system from corrupting or taking down the network. The benefits of this model are many – a few are listed below, followed by a minimal sketch of the ledger mechanics:

  1. Reuniting the patient record: One of the biggest interoperability challenges today is patient identity management. Because healthcare providers exist in their own technology silos, patients are assigned a unique identity at each facility or provider they visit – resulting in a fragmented patient record and care experience where 1990s technologies like CDs and fax machines remain the primary method of information sharing. With Blockchain, multiple patient identifiers from disparate organizations can be tracked and connected across the network, resulting in a truly unified and transparent medical record.
  2. Achieving real economies of scale: Most healthcare organizations would argue that their technical infrastructure is expensive and often fails to deliver the promised ROI. Because Blockchain shares resources across the network, it can realize real economies of scale – leveraging compute power across the network to facilitate faster and more efficient information exchange, and leveraging network-wide volume to drive down individual transaction costs.
  3. Facilitating faster, cheaper, and more reliable cross-entity transactions: Because a Blockchain can be shared across a variety of authorized providers in a secure and standardized way, it has the potential to greatly reduce the cost and complexity of common transactions such as brokering referrals, accessing relevant medical history, communicating test results or incidental/critical findings, and managing billing and payment for network participants such as hospitals, specialists, primary care physicians, social service providers, and insurance payers. And once again, because of the shared ledger and the cryptographic protocols underlying Blockchain technology, clinical and financial transactions are guaranteed to be accurate and secure.
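To make the ledger mechanics concrete, the sketch below builds a tiny hash-linked chain of health transactions in Python. It does not represent any particular blockchain platform – the transaction fields, helper names, and the simple validation routine are assumptions for illustration – but it shows why tampering with any historical record is detectable by every participant.

```python
import hashlib
import json
from datetime import datetime, timezone

def hash_block(block: dict) -> str:
    """Deterministically hash a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def append_transaction(chain: list, transaction: dict) -> dict:
    """Append a new health transaction, linking it to the previous block's hash."""
    previous_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {
        "index": len(chain),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "transaction": transaction,        # e.g. "new imaging exam added at Facility A"
        "previous_hash": previous_hash,
    }
    block["hash"] = hash_block({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)
    return block

def chain_is_valid(chain: list) -> bool:
    """Any node can independently re-verify the full ledger."""
    for i, block in enumerate(chain):
        expected = hash_block({k: v for k, v in block.items() if k != "hash"})
        if block["hash"] != expected:
            return False
        if i > 0 and block["previous_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger: list = []
append_transaction(ledger, {"patient_ref": "MRN-1001", "event": "CT chest added", "location": "Hospital A"})
append_transaction(ledger, {"patient_ref": "MRN-1001", "event": "report finalized", "location": "Hospital B"})
print(chain_is_valid(ledger))  # True; editing any past block breaks validation
```

In a real network, each participating system would hold its own copy of this ledger and compare hashes before accepting new blocks – the consensus step described above.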


Are you ready for the next wave of digital transformation? We can help you separate truth from fiction and build a future-proof technology strategy that will best fit your organizational capabilities and needs. Contact us to set up a meeting at RSNA 2019.


Filed Under: 5G, Blockchain, Data Management, Healthcare IT

October 30, 2019 By Laurie Lafleur Leave a Comment

Beyond cloud technology, our highly secure data centres are hosted throughout the far reaches of outer space for maximum geographic coverage and performance at the speed of light.

It’s true – there are some big advantages to outsourcing your infrastructure and data management to a qualified ‘as-a-service’ provider. Cloud services have come a long way, offering highly scalable and secure infrastructure and data management solutions that can help operationalize budgets and reduce or eliminate the need for local hardware and IT services. However, there are pros and cons related to performance, flexibility, and cost that should be carefully considered when determining whether a move to the cloud is right for your organization, and if so, which models will deliver the best results and ROI:

  1. Differing service models: As if there weren’t enough acronyms in the healthcare world already, cloud adds its own collection of ‘aaS’ (as-a-service) offerings to the pile. It’s important to understand the primary service models offered by cloud vendors so you can choose the one that best fits your organizational needs. Infrastructure- and Platform-as-a-Service (IaaS, PaaS) are the more fundamental models. IaaS provides the basic virtualized infrastructure for the storage and processing of data, whereas PaaS adds operating system, middleware, and runtime environments. Both can reduce the amount of physical hardware and IT expertise required on premises, and are a great fit if your needs are focused on supporting data growth, data mining, or business continuity. Software-as-a-Service (SaaS) typically represents a more comprehensive service model, delivering subscription-based business solutions that are fully vendor hosted and managed. This fully operationalizes your IT budget and greatly reduces your in-house IT burden; however, the trade-off is typically a reduction in customization opportunities. Which model would work best for your organization depends entirely on the degree of control you wish to exercise over your own infrastructure, software configuration, and workflows.
  2. Maturity of your IT organization: The scale and maturity of your in-house infrastructure and the depth of expertise held by your IT team are two key factors to consider when thinking about a move to the cloud. Have you made significant investments in your own data centres and leading-edge infrastructure? Do your operations require sophisticated, customized IT solutions to support complex and varied clinical and operational workflows? If the answer is no, cloud could certainly be a viable option. However, if you answered yes to one or both of these questions, then a full-blown cloud solution may not be right for you. Instead, you may wish to consider a hybrid cloud solution that leverages your current investments and affords your IT team full autonomy over your system architecture and configuration, while augmenting it with additional secure, flexible storage and compute resources and replication alternatives.
  3. Growth pace and predictability: Cloud environments are more dynamic and elastic than traditional on-premises hardware infrastructure – which is one of their biggest advantages. Cloud environments can flex and scale storage and compute resources to accommodate large and unpredictable swings in data and user volumes. If your organization’s infrastructure is struggling to keep pace with your growth or usage patterns, then cloud services might be right for you. However, factors such as location, available network bandwidth, and network latency can impact the performance of applications hosted in the cloud. As well, not all vendors have optimized their software solutions to operate efficiently in a cloud-hosted environment – which can not only affect performance but can also lead to surprisingly high service fees. For instance, ‘chatty’ applications that make superfluous round trips to cloud-based servers and databases can greatly increase network service fees, while their performance can be significantly and adversely impacted by network latency (see the sketch after this list). Before boarding that rocket ship to the cloud, make sure you thoroughly vet your vendor’s technology architecture and test system performance and connectivity across representative use cases and locations to ensure it will meet or exceed your expectations.
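To illustrate the point about ‘chatty’ applications, here is a rough, hypothetical back-of-the-envelope sketch comparing a per-record request pattern with a batched one. The latency and compute figures are assumptions, not measurements from any particular vendor or network.

```python
# Illustrative only: compares a "chatty" pattern (one network round trip per record)
# with a batched pattern (one round trip for many records). The 80 ms round-trip
# time is an assumed wide-area latency, not a measured figure.

ROUND_TRIP_MS = 80          # assumed client-to-cloud latency per request
PER_RECORD_COMPUTE_MS = 2   # assumed server-side work per record

def chatty_fetch_time(num_records: int) -> float:
    """One request per record: latency is paid on every call."""
    return num_records * (ROUND_TRIP_MS + PER_RECORD_COMPUTE_MS)

def batched_fetch_time(num_records: int, batch_size: int = 500) -> float:
    """Records requested in batches: latency is paid once per batch."""
    batches = -(-num_records // batch_size)  # ceiling division
    return batches * ROUND_TRIP_MS + num_records * PER_RECORD_COMPUTE_MS

for n in (100, 1_000, 10_000):
    print(f"{n:>6} records  chatty: {chatty_fetch_time(n)/1000:7.1f} s   "
          f"batched: {batched_fetch_time(n)/1000:7.1f} s")
```

The exact numbers matter far less than the shape of the result: when every record pays a full round trip, wide-area latency quickly dominates, which is why architecture reviews and representative performance testing are worth the effort before committing to a cloud deployment.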


Are you in the market for a cloud-based imaging solution? We can help you separate truth from fiction and select a strategy, technology, and vendor that will best fit your organizational capabilities and needs. Contact us to set up a meeting at RSNA 2019.


Filed Under: Cloud, Data Management, Healthcare IT, Imaging

October 1, 2019 By Laurie Lafleur Leave a Comment

Accurately predict future events to mitigate disease, catastrophes, and natural disasters using tarot-based analytics powered by magic 8-ball technology.

It’s true, today’s clinical and business intelligence can deliver deep insights that can be used for disease mitigation and operational optimization. However, the ability of your analytics software and programs to deliver on lofty marketing promises is only as good as your underlying data strategy. Here are a few considerations to keep in mind before embarking on an analytics journey:

  1. You can’t measure what you don’t know: The first and most important step in building a successful analytics program is understanding your current-state environment, gaps, and challenges. Defining meaningful key performance indicators (KPIs) is not always simple, but it will enable you to measure improvement over time and ensure you’re tracking towards your organizational goals.
  2. Your results are only as good as the quality of your data: Getting meaningful and actionable insights requires aggregating data from many disparate and heterogeneous locations, systems, and formats. Be sure to have a flexible and scalable data model and a data normalization strategy in place, and carefully and regularly evaluate the quality, consistency, and integrity of your data to ensure accurate and consistent results over time (a brief illustration follows this list).
  3. The single biggest problem in communication is the illusion that it has taken place: Having scads of clinical and operational metrics is awesome, but is essentially useless unless you can deliver those insights to the right people, at the right time, and in a format that can be easily consumed and acted upon. This starts with having clear organizational objectives and fostering a data-driven culture where information is distilled and shared according to the communication methods that work best for each of your key stakeholder groups. 
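As a simple illustration of the normalization and quality checks mentioned in point 2, the sketch below uses pandas to standardize a small, hypothetical exam-level extract and compute a few repeatable quality metrics. The column names, values, and rules are invented for illustration.

```python
import pandas as pd

# Hypothetical exam-level extract pulled from several source systems.
exams = pd.DataFrame({
    "mrn":       ["1001", "1001 ", None, "1003"],
    "modality":  ["ct", "CT", "Mr", "MR"],
    "exam_date": ["2019-09-01", "2019-09-01", "2019-09-03", "not recorded"],
})

# Normalize formats so the same value is represented the same way everywhere.
exams["mrn"] = exams["mrn"].str.strip()
exams["modality"] = exams["modality"].str.upper()
exams["exam_date"] = pd.to_datetime(exams["exam_date"], errors="coerce")

# Simple, repeatable quality metrics to monitor over time.
quality_report = {
    "missing_mrn": int(exams["mrn"].isna().sum()),
    "unparseable_dates": int(exams["exam_date"].isna().sum()),
    "duplicate_exam_rows": int(exams.duplicated().sum()),
}
print(quality_report)
```

Tracking metrics like these on a regular schedule turns data quality from a one-time cleanup into an ongoing, measurable process.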

Are you in the market for an enterprise analytics solution? We can help you separate truth from fiction and select a strategy, technology, and vendor that will best fit your organizational capabilities and needs. Contact us to set up a meeting at RSNA 2019.


If you enjoyed this post subscribe to our blog to be notified when new articles are published.


Filed Under: Analytics, Data Management, Healthcare IT, Imaging

April 13, 2019 By Laurie Lafleur Leave a Comment

Much excitement in healthcare today revolves around the unlocked potential of population-level ‘big data’, which can be leveraged to inform diagnosis and treatment best practices, enable earlier intervention or proactive mitigation of disease, and support the development of new and innovative medical procedures and interventions. 

A luminary teaching and research organization, the National Institutes of Health (NIH) is an excellent example, conducting numerous research initiatives in areas such as disease prevention and treatment, oncology, precision medicine, and neurotechnology, to name a few. Data normalization plays a significant role in supporting these initiatives by creating structured and consistent datasets that can be leveraged by data mining technologies, which in turn enable the processing and analysis of the significant volumes of information required to perform this type of population-level research – a task that would be insurmountable if not automated.

Cleaning and Screening

At the NIH, data analysis begins with the screening and selection of research patients who meet the specific protocol requirements for various ongoing clinical studies. This involves the collection and evaluation of patients’ complete medical records – including family and medical history, EMR data, imaging records, and much more – to identify relevant genetic characteristics, demographic factors, disease profiles, and health statuses.

The NIH screens thousands of patient candidates, and as such has sophisticated methods to collate, standardize, and analyze the aforementioned data. First, patient identifiers are normalized upon ingestion and a temporary ‘pre-admit’ MRN is assigned to ensure consistency across diverse data sets and facilitate longitudinal analysis of the complete patient record by NIH systems and researchers throughout the screening process. This also ensures candidate data is kept separate from approved, active research datasets until the patient has been officially selected – at which time the MRN is normalized once again to an active ‘admitted’ profile.
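A simplified, hypothetical sketch of that identifier-normalization step is shown below: outside identifiers are mapped to a temporary ‘pre-admit’ MRN and then promoted to an ‘admitted’ MRN once a candidate is selected. The identifier formats and function names are assumptions for illustration, not the NIH’s actual scheme.

```python
import itertools

_preadmit_counter = itertools.count(1)
_admitted_counter = itertools.count(1)
_id_map: dict[tuple[str, str], str] = {}   # (source facility, source MRN) -> internal id

def assign_preadmit_mrn(source_facility: str, source_mrn: str) -> str:
    """Map an outside identifier to a single temporary pre-admit MRN."""
    key = (source_facility, source_mrn)
    if key not in _id_map:
        _id_map[key] = f"PRE-{next(_preadmit_counter):06d}"
    return _id_map[key]

def promote_to_admitted(source_facility: str, source_mrn: str) -> str:
    """Replace the pre-admit MRN with an active 'admitted' MRN once the candidate is selected."""
    key = (source_facility, source_mrn)
    _id_map[key] = f"ADM-{next(_admitted_counter):06d}"
    return _id_map[key]

# Repeated ingestion of the same outside record resolves to the same pre-admit MRN;
# cross-facility patient matching would happen upstream of this step.
print(assign_preadmit_mrn("Hospital A", "123456"))   # PRE-000001
print(assign_preadmit_mrn("Hospital A", "123456"))   # PRE-000001 (same record, same id)
print(promote_to_admitted("Hospital A", "123456"))   # ADM-000001
```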

The Role of Diagnostic Imaging

Imaging data is a key component of the research performed at the NIH. As such, researchers collect patients’ previous imaging from various outside facilities. Key imaging attributes on imported studies, such as study and series descriptors, are normalized according to NIH policies to enable automated data analysis and to simplify the screening process for researchers by ensuring exams hang reliably within their diagnostic viewers for quick and efficient review.
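As a rough sketch of what descriptor normalization can look like in practice, the example below maps free-text study descriptions onto a site-standard vocabulary using pydicom. The mapping table and target terms are hypothetical, and this is not a description of the NIH’s actual policies or tooling.

```python
# Illustrative sketch using pydicom (an assumption; any DICOM toolkit would do):
# map free-text study descriptions onto a site-standard vocabulary so hanging
# protocols and automated analysis behave predictably. Series descriptors could
# be normalized the same way.
import pydicom

STUDY_DESCRIPTION_MAP = {
    "CT CHEST W/O CONTRAST": "CT CHEST WO",
    "CHEST CT - NO CON":     "CT CHEST WO",
    "MRI BRAIN W AND WO":    "MR BRAIN W WO",
}

def normalize_study(path_in: str, path_out: str) -> None:
    """Rewrite a study's description to the standard term, if one is defined."""
    ds = pydicom.dcmread(path_in)
    raw = (ds.get("StudyDescription") or "").strip().upper()
    ds.StudyDescription = STUDY_DESCRIPTION_MAP.get(raw, raw)  # fall back to the raw value
    ds.save_as(path_out)
```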

As well, the NIH often performs additional advanced imaging exams throughout clinical research projects. To ensure this newly generated data accurately correlates with the patient’s prior imaging history, can be easily inspected by advanced data analysis technologies, and enables efficient review workflows for researchers, the NIH also enforces normalization policies at the modality level.

Because the NIH works with a large number of trainees and fellows, the aforementioned normalization of diagnostic imaging exams provides the added benefit of creating a consistent educational environment for teachers and students, supporting structured and replicable teaching workflows.

Leveraging and Developing Artificial Intelligence and Machine Learning

Artificial Intelligence (AI) and machine learning (ML) may be new entrants into mainstream clinical settings, but at the NIH they have been in use for many years and represent foundational technologies. They play a significant role in enabling the automatic identification and comparison of key data elements across hundreds or thousands of relevant research studies – a task that would not be possible if done manually.

However, there are some data elements that cannot yet be fully analyzed by these technologies. Where bias or variability between reported values or criteria can exist, manual intervention is still required to make appropriate adjustments based on interpretation of the context within which the values were captured. This is the tricky thing about data normalization – it is often context-sensitive, and the level of reasoning required to weigh the ultimate research question or goal when evaluating data relevance and normalization needs cannot yet be reliably accomplished by machines.

For this reason, the NIH continues to support the development and refinement of AI and ML algorithms by leveraging its enormous collection of clean and consistent data to build diverse training environments, and by collaborating with luminary AI and ML technology organizations to support the further development of advanced, context-aware data analysis and clinical use cases.

Anonymization/De-identification

Another critical aspect of the research performed at the NIH is the anonymization and/or de-identification of personally identifiable information (PII). To adhere to patient privacy and human subjects research protection regulations, the NIH at times wishes to, or is required to, de-identify research data by removing or obfuscating data that could otherwise reveal a patient’s identity.

This might be done to allow NIH researchers to conduct secondary analyses without additional human subjects protection approvals (per 45 C.F.R. 46), or to share data with other research collaborators. The NIH accomplishes this through standard de-identification or coding techniques and normalization policies, with the goal of scrubbing data to remove identifiers.

However, the NIH employs a specialized technique, known as an ‘Honest Broker’ process, that ensures a link persists between a patient’s anonymized/de-identified data and their identifiable data. This ensures researchers can follow up with patients on care-related issues and/or continue to follow outside patients for additional or future research purposes, with appropriate safeguards and approvals.
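A minimal sketch of this kind of coded de-identification is shown below: direct identifiers are stripped from the research record, while a keyed pseudonym preserves a link that only the honest broker can resolve. The key handling, field names, and pseudonym format are assumptions for illustration, not the NIH’s actual implementation.

```python
import hmac
import hashlib

# Held only by the honest broker, never distributed with research data (hypothetical key).
BROKER_SECRET_KEY = b"replace-with-a-securely-stored-secret"

_reidentification_table: dict[str, str] = {}   # pseudonym -> original MRN (broker-only)

IDENTIFYING_FIELDS = {"name", "mrn", "date_of_birth", "address"}

def pseudonym_for(mrn: str) -> str:
    """Deterministic pseudonym so the same patient always maps to the same code."""
    code = "SUBJ-" + hmac.new(BROKER_SECRET_KEY, mrn.encode(), hashlib.sha256).hexdigest()[:12].upper()
    _reidentification_table[code] = mrn
    return code

def deidentify(record: dict) -> dict:
    """Strip direct identifiers; keep a coded link that only the broker can resolve."""
    cleaned = {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}
    cleaned["subject_code"] = pseudonym_for(record["mrn"])
    return cleaned

record = {"mrn": "1001", "name": "Jane Doe", "date_of_birth": "1970-01-01",
          "diagnosis": "C34.1", "study_uid": "1.2.840.99.1"}
print(deidentify(record))   # identifiers removed; broker can still re-identify via subject_code
```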

The real value of data normalization

From tracking tumor response to new treatment programs to developing statistical models for a population’s health characteristics and risk profiles, data curation on the scale achieved by the NIH not only requires the collection of vast amounts of scattered information for analysis, but also mechanisms to transform and normalize incoming data to create well-defined and comparable datasets. For the NIH, data normalization enables the extraction of valuable and predictive clinical insights that can be used to improve both clinical outcomes and population-wide health. 

If you enjoyed this post subscribe to our blog to be notified when new articles are published.


Filed Under: Analytics, Data Management, Healthcare IT, Imaging, Uncategorized Tagged With: Clinical Research, data normalization, Enterprise Imaging, health data, Health IT, healthcare data, healthcare IT, HealthIT, medical imaging data, radiology data

February 4, 2019 By Laurie Lafleur Leave a Comment

Data normalization is an essential activity whenever two or more previously independent facilities are migrated into a single, consolidated data repository, as it preserves the integrity of incoming data by avoiding broken links between patient records and problematic collisions between device and study identifiers.

Jude Mosley, Manager of Information Technology, Imaging at Sentara Healthcare, knows this better than anyone. She and her team face this challenge regularly, whenever a new hospital or imaging center is integrated into the 5-hospital network in Norfolk, Virginia.

Hey, that’s my jacket!

The first challenge is ensuring patient jackets from differing systems are reconciled correctly, so that no information is lost or erroneously attached to the wrong patient. Using a master patient index (MPI), Sentara is able to match patients between the source and destination systems and update all incoming studies with the correct patient identifier upon ingestion. This results in a single, consistent set of MRNs across the Sentara network and eliminates the future risk that an old MRN could be mistakenly used.

The next challenge is ensuring study accession numbers remain unique, and that the migration process doesn’t introduce duplicate accession numbers for different patients. To accomplish this, Sentara employed normalization policies that prefixed the accession number with, and updated the location attribute to, a 3-character code representing the source facility for all incoming studies. Not only did this avoid potential collisions between accession numbers, it also added a level of transparency that allowed Sentara to quickly and easily identify each study’s original source.
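A simplified sketch of these two ingestion-time policies – MPI-based MRN reconciliation and accession/location normalization – might look like the following. The MPI contents, facility code, and field names are hypothetical placeholders rather than Sentara’s actual configuration.

```python
# Simplified sketch of the migration-time policies described above. The MPI lookup,
# facility code, and field names are hypothetical placeholders.
MASTER_PATIENT_INDEX = {
    ("LEGACY_A", "000123"): "SENT-789001",   # (source system, source MRN) -> enterprise MRN
}

SOURCE_FACILITY_CODE = "NHG"                 # 3-character code for the acquired site

def normalize_study(study: dict, source_system: str) -> dict:
    """Apply MPI reconciliation and accession/location normalization on ingestion."""
    enterprise_mrn = MASTER_PATIENT_INDEX.get((source_system, study["mrn"]))
    if enterprise_mrn is None:
        raise LookupError("No MPI match; route to a manual reconciliation work queue")
    return {
        **study,
        "mrn": enterprise_mrn,                                                      # one MRN network-wide
        "accession_number": f"{SOURCE_FACILITY_CODE}{study['accession_number']}",   # avoid collisions
        "location": SOURCE_FACILITY_CODE,                                           # traceable to the source
    }

incoming = {"mrn": "000123", "accession_number": "A556677", "modality": "CT"}
print(normalize_study(incoming, "LEGACY_A"))
```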

The Big Bang theory of modalities

While the migration process provides an excellent opportunity to normalize data on the fly, it is often necessary to enforce data normalization policies at the modalities themselves. At Sentara, normalization at the modality level is a well-defined process. With each new modality introduction, update, or site acquisition, Sentara ensures that every affected modality adheres to a consistent set of pre-defined policies governing AE title and station naming conventions. When new modalities are installed or vendor updates are applied, Sentara requires vendors to review and sign off on the required data policies, and to ensure they are properly applied and thoroughly tested once installation is complete.
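One lightweight way to enforce such conventions is to validate the modality fleet against the naming policy before go-live. The sketch below checks AE titles against a hypothetical pattern; the 16-character limit on AE titles comes from the DICOM standard, but the specific convention shown is an assumption, not Sentara’s.

```python
import re

# Hypothetical naming convention: <site code>_<modality>_<2-digit room number>.
AE_TITLE_PATTERN = re.compile(r"^[A-Z]{3}_(CT|MR|US|CR|DX|NM|MG)_\d{2}$")

def check_ae_titles(modalities: list[dict]) -> list[str]:
    """Return human-readable violations for review before go-live."""
    problems = []
    seen = set()
    for m in modalities:
        ae = m["ae_title"]
        if len(ae) > 16:
            problems.append(f"{ae}: exceeds the 16-character AE title limit")
        if not AE_TITLE_PATTERN.match(ae):
            problems.append(f"{ae}: does not match the naming convention")
        if ae in seen:
            problems.append(f"{ae}: duplicate AE title")
        seen.add(ae)
    return problems

fleet = [
    {"ae_title": "NHG_CT_01", "ip": "10.1.2.21"},
    {"ae_title": "ct-room-2", "ip": "10.1.2.22"},   # violates the convention
    {"ae_title": "NHG_CT_01", "ip": "10.1.2.23"},   # duplicate
]
print("\n".join(check_ae_titles(fleet)) or "All modalities conform")
```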

For larger-scale activities, like a new site acquisition and migration, the PACS team prepares a comprehensive list of all modalities and their new AE titles, station names, and IP addresses, and orchestrates a big-bang update process. While this is no small undertaking, through experience Sentara has refined this process to run like a well-oiled machine. Once complete, Sentara has improved the consistency and supportability of their now-larger infrastructure, and once again ensured that data arrives in a predictable and consistent manner. 

A shining example of normalcy

This case provides excellent examples of how data normalization can address the integration challenges faced by expanding health systems. Not only does it mitigate risk by avoiding data loss and collisions during the migration process, it also measurably improves data quality and reduces support costs in the future by improving the consistency and predictability of systems and data across the entire network.

Our next blog will be taking a more clinical focus, looking at how data normalization can be leveraged to uncover deep insights across population-wide data sources. Sound interesting? Subscribe to our blog to be notified when the next post becomes available!


Filed Under: Data Management Tagged With: data migration, data normalization, Enterprise Imaging, health data, Health IT, healthcare data, healthcare IT, HealthIT

