
Paragon Consulting Partners

Healthcare IT Solutions and Results


Enterprise Imaging

June 14, 2022 By Laurie Lafleur

From initiation to closeout, every project brings its own unique set of challenges. Each situation is dynamic, and the needs of a particular project are never addressed by a single, one-size-fits-all solution.

Working with clients from start to finish across several Strings by Paragon projects has offered a fresh perspective on how the implementation approach should evolve. As expected, each client and project yielded its own experiences and feedback, allowing the overall solution and delivery model to be further optimized for future initiatives.

Problem Statement (The Why)

With fiscal challenges and competing initiatives continuing to put pressure on budgets, the cost-benefit analysis of any new Enterprise Imaging purchase is more important than ever. The motto "do more with less" is the consistent tune sung by the vast majority of our clients. Understanding this, Strings by Paragon implementations ensure that every problem statement produces a measurable outcome, while opening new avenues for advanced application and workflow monitoring and operational reporting.

Our clients typically begin with a common list of gaps, some or all of which may sound familiar:

  • Application issues that are identified too late
  • Questions like “why do we need to work with the vendor to identify the root cause?”
  • Too many different vendors, none of which work together
  • An inability to access real-time or proactive reports/alerts to assist in organizational planning and troubleshooting
  • A general lack of understanding of how integrated applications work together from both a technical and workflow perspective

These gaps are exacerbated by understaffed teams juggling and prioritizing already double-booked schedules. On top of their main duties, managing support and project responsibilities is driving application administrator burnout, an ever-increasing strain on employees' mental well-being and a frequent contributor to high turnover. So, how can we combat this?

The Process of Strings Engagement (The How)

Enterprise Imaging application administrators need tools that are intuitive and practical: tools that reduce the initial learning curve and simplify ongoing use. Strings by Paragon offers immediate resolution to these common use cases and problem statements. The combined experience of technical, clinical, and operational experts creates fluid discovery and design sessions focused on the immediate success factors. The goal is for system administrators to find Strings by Paragon as useful and easy to use as their phones in the early stages. As they become more sophisticated users, we keep administrators engaged by expanding the application's functionality to proactively mitigate risks across their complex portfolio of applications. This is accomplished through:

  1. Emphasizing the importance of capturing relevant KPIs and implementing practical alerting and reporting: The main objective is to focus on KPIs that deliver immediate results and create valuable outcomes (see the sketch after this list). The implementation phase also focuses on empowering administrative users to understand their complex environments at a deeper level, leveraging Strings to become informed strategists within the organization. Understanding organizational data, and having informed system administrators, puts the power back in the organization's hands and keeps all systems performing at their highest level.
  2. Working collaboratively with client administrators to define and refine KPIs and alerts: As the core KPIs are defined and implemented, there is a significant focus on the periodic evaluation of KPIs, alerts, and their respective reporting. Organizations are constantly undergoing change, and application monitoring should be kept aligned accordingly. As such, Strings experts work with client administrators to ensure that KPIs, alerts, and reporting are tuned on a regular basis. In addition, weekly collaboration sessions are held with the client to review alerts and establish an actionable plan to identify root causes and rectify any issues. These measures ensure that optimal value is delivered to, and realized by, the client.
  3. Operationalizing performance monitoring so that it becomes a valuable and ingrained routine: The general implementation philosophy is to establish Strings by Paragon as an integral daily tool for application performance monitoring and real-time operational reporting. Years of application implementation experience have shown that if a tool is not used regularly, it loses its value and becomes another maintenance task in an already busy schedule, rather than alleviating those constraints through streamlined efficiency.
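
As a simple illustration of the kind of practical KPI alerting described in item 1, the sketch below evaluates a handful of measurements against configured thresholds and emits alerts only when a limit is crossed. The KPI names, thresholds, and severities are hypothetical examples, not the actual Strings by Paragon implementation.

```python
from dataclasses import dataclass

@dataclass
class KpiRule:
    name: str            # e.g. "study_ingest_latency_sec"
    threshold: float     # alert when the measured value exceeds this limit
    severity: str        # "warning" or "critical"

# Hypothetical starter KPIs; a real deployment would define these
# collaboratively with client administrators.
RULES = [
    KpiRule("study_ingest_latency_sec", 30.0, "warning"),
    KpiRule("hl7_queue_depth", 500, "critical"),
    KpiRule("viewer_login_time_sec", 10.0, "warning"),
]

def evaluate(measurements: dict) -> list:
    """Return human-readable alerts for any KPI above its threshold."""
    alerts = []
    for rule in RULES:
        value = measurements.get(rule.name)
        if value is not None and value > rule.threshold:
            alerts.append(f"[{rule.severity.upper()}] {rule.name}: {value} > {rule.threshold}")
    return alerts

print(evaluate({"study_ingest_latency_sec": 42.5, "hl7_queue_depth": 120}))
# -> ['[WARNING] study_ingest_latency_sec: 42.5 > 30.0']
```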

Post-Strings Results/Outcomes (The What)

To get the most from the services Strings by Paragon has to offer, customers and Paragon subject matter experts continually identify and fine-tune Key Performance Indicators (KPIs), supported by daily use of the tool and a weekly reporting cadence with each customer.

For example, as data is continually collected, analyzed, and banded, operational end-users can accurately identify where application performance is falling short of clinical workflow needs. More specifically, technical teams can better understand which services are under- or over-utilized, and from there make proactive decisions to prioritize one application over another.
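
To illustrate the banding idea, the snippet below sorts collected utilization samples into under-utilized, normal, and over-utilized bands so a technical team can see at a glance which services need attention. The service names and band boundaries are invented for illustration.

```python
def band_utilization(samples: dict, low: float = 20.0, high: float = 80.0) -> dict:
    """Classify each service's average utilization (%) into a band."""
    bands = {}
    for service, pct in samples.items():
        if pct < low:
            bands[service] = "under-utilized"
        elif pct > high:
            bands[service] = "over-utilized"
        else:
            bands[service] = "normal"
    return bands

print(band_utilization({"dicom_router": 12.0, "archive_db": 91.5, "viewer_cache": 55.0}))
# {'dicom_router': 'under-utilized', 'archive_db': 'over-utilized', 'viewer_cache': 'normal'}
```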

With that knowledge, users can proactively engage in conversations about long-term improvements. Strings helps clinical support teams see how their software applications consume the hardware resources they run on. With the data Strings delivers, management teams can show the relevant technology teams why and where measured performance is not fully supporting clinical workflow needs.

All in all, customers that have fully adopted Strings by Paragon report that they can now identify problems, understand their root causes, and resolve them before clinical workflows are negatively impacted. In addition, they can take proactive measures to improve departmental productivity and efficiency.

Interested in learning more about how Strings by Paragon delivers analytics as a managed service? Subscribe to our blog to learn more.


Filed Under: Uncategorized Tagged With: analytics, analytics-as-a-managed-service, application monitoring, Enterprise Imaging, enterprise imaging analytics, healthcare analytics, key performance indicators, performance monitoring

April 28, 2022 By Laurie Lafleur

Paragon subject matter experts have served the healthcare industry for over 30 years and bring more than 200 years of collective experience. Working with various healthcare providers and implementation methodologies, the Paragon team has identified smooth software adoption with a focus on continuous improvement as one of the most significant gaps in the industry. In response, Strings professional services were designed to break the status quo by delivering a new, more collaborative approach to software implementation, adoption, and optimization.

Overcoming Analysis Paralysis

In an Enterprise Imaging project, hundreds of Key Performance Indicators (KPIs) may be identified; however, only a handful are relevant and actionable for daily monitoring and alerting. Analytics adoption carries a known risk when an overwhelming amount of data and reporting floods the user interface (UI), especially in the early stages. Analytics solutions are only as effective as the user's ability to comprehend the data and apply it to daily alerting and reporting.

This is why seasoned Paragon experts collaborate closely with key stakeholders to identify the most relevant initial KPIs and work together to expand that list as the overall solution and team matures. By ensuring each Enterprise Imaging solution is architected correctly and the most appropriate KPIs are closely monitored, healthcare IT teams can more effectively assess application performance, data accessibility and quality, and critical operations metrics.

Avoiding Alert Fatigue

Strings by Paragon implementation experts place strong value on collectively defining KPIs. This approach creates an organizational prioritization of what is critical for monitoring and alerting and what might be more appropriate for a dashboard or regular reporting. For instance, Application Performance Monitoring (APM) can generate a broad range of alerts, yet only a small percentage may be considered critical given the organization's dataflow design. Strings experts work with the customer to establish a weighted approach so that alerting is beneficial rather than overwhelming to an administrator. One of the most significant risks with any analytics software is a flood of data and reporting that leaves administrators treating it as another daily task rather than a tool. Designing and implementing both short-term and long-term KPIs can make or break the solution in the early user adoption phase.
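
As a rough sketch of such a weighted approach, each alert type below is assigned an organizational weight, and only those at or above an agreed cutoff page an administrator; everything else flows to a dashboard or periodic report. The alert names, weights, and cutoff are invented for illustration.

```python
# Hypothetical weights agreed with the customer; higher = more critical.
ALERT_WEIGHTS = {
    "archive_unreachable": 1.0,
    "hl7_feed_stalled": 0.9,
    "disk_usage_high": 0.6,
    "slow_prefetch": 0.3,
}
PAGE_CUTOFF = 0.8  # alerts at or above this weight page an administrator

def route_alerts(alerts: list) -> tuple:
    """Split raw alerts into 'page now' vs 'report later' buckets."""
    page, report = [], []
    for alert in alerts:
        weight = ALERT_WEIGHTS.get(alert, 0.0)
        (page if weight >= PAGE_CUTOFF else report).append(alert)
    return page, report

page, report = route_alerts(["slow_prefetch", "archive_unreachable", "disk_usage_high"])
print(page)    # ['archive_unreachable']
print(report)  # ['slow_prefetch', 'disk_usage_high']
```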

Continuous Improvement Has No Finish Line!

Implementation and adoption of Strings by Paragon is a starting point, not the finish line! Healthcare organizations are overwhelmed with industry challenges, and analytics reporting solutions that create additional work with minimal returns are rarely successful. The primary objective of Strings services is to ensure that monitoring and alerting are constantly evaluated and measured to deliver the best possible outcomes for your Enterprise Imaging program. On a regular, agreed-upon cadence (e.g. daily, weekly, monthly), Strings experts take a collaborative approach to assessing each active KPI and fine-tuning it with administrators to yield maximum results. In the early stages of adopting a new analytics tool, it is critical that end-users are not intimidated by the solution but find it helpful and regularly explore its functionality. One of the service offerings is a weekly Monitoring Summary Report, generated by Strings experts, that captures all triggered alerts. The report is reviewed with the customer to perform root cause analysis and establish an issue resolution plan. The customer is engaged throughout the analysis cycle and comes away better educated about the solution and the issue resolution path. The goal is to further empower end-users to understand both their environment and Strings as an effective tool.

Building a Strong, Long-Term Relationship

Strings by Paragon takes pride in establishing long-term relationships with its customers. Whether a customer has limited technical resources or is a highly sophisticated user, the ultimate goal is to continue to innovate together. The healthcare industry is constantly changing, and data mining and analysis have become an absolute need for organizations to stay competitive. Strings experts are committed to optimizing Enterprise Imaging software and working together to innovate novel workflow solutions. Establishing a long-term partnership is the key to accomplishing both objectives. The more we know about each other, the more gaps can be identified, along with opportunities to improve.

Interested in learning more about how Strings by Paragon delivers analytics as a managed service? Subscribe to our blog to learn more.



Filed Under: Uncategorized Tagged With: analytics, analytics-as-a-managed-service, application monitoring, Enterprise Imaging, enterprise imaging analytics, healthcare analytics, key performance indicators, performance monitoring

April 13, 2019 By Laurie Lafleur

Much excitement in healthcare today revolves around the unlocked potential of population-level ‘big data’, which can be leveraged to inform diagnosis and treatment best practices, enable earlier intervention or proactive mitigation of disease, and support the development of new and innovative medical procedures and interventions. 

A luminary teaching and research organization, the National Institutes of Health (NIH) is an excellent example, conducting numerous research initiatives in disease prevention and treatment, oncology, precision medicine, and neurotechnology, to name a few. Data normalization plays a significant role in supporting these initiatives by creating structured, consistent datasets that can be leveraged by data mining technologies, enabling the processing and analysis of the significant volumes of information required for population-level research, a task that would be insurmountable if not automated.

Cleaning and Screening

At the NIH, data analysis begins with the screening and selection of research patients who meet the specific protocol requirements of various ongoing clinical studies. This involves collecting and evaluating patients' complete medical records, including family and medical history, EMR data, imaging records, and much more, to identify relevant genetic characteristics, demographic factors, disease profiles, and health statuses.

The NIH screens thousands of patient candidates, and as such has sophisticated methods to collate, standardize, and analyze this data. First, patient identifiers are normalized upon ingestion and a temporary 'pre-admit' MRN is assigned to ensure consistency across diverse data sets and facilitate longitudinal analysis of the complete patient record by NIH systems and researchers throughout the screening process. This also keeps candidate data separate from approved, active research datasets until the patient has been officially selected, at which time the MRN is normalized once again to an active 'admitted' profile.
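
A minimal sketch of the idea: incoming records from different source systems are re-keyed to a temporary pre-admit identifier, and promoted to an admitted identifier only once the candidate is selected. The identifier formats and prefixes below are invented, not the NIH's actual scheme.

```python
import itertools

_preadmit_counter = itertools.count(1)
_preadmit_index = {}  # (source_system, source_mrn) -> pre-admit ID

def assign_preadmit_mrn(source_system: str, source_mrn: str) -> str:
    """Map a source-system MRN to a consistent temporary pre-admit identifier."""
    key = (source_system, source_mrn)
    if key not in _preadmit_index:
        _preadmit_index[key] = f"PRE-{next(_preadmit_counter):07d}"
    return _preadmit_index[key]

def promote_to_admitted(preadmit_mrn: str) -> str:
    """Once a candidate is accepted, normalize to an active 'admitted' profile."""
    return preadmit_mrn.replace("PRE-", "ADM-", 1)

pid = assign_preadmit_mrn("OutsideHospitalA", "123456")
print(pid)                       # PRE-0000001
print(promote_to_admitted(pid))  # ADM-0000001
```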

The Role of Diagnostic Imaging

Imaging data is a key component of the research performed at the NIH. As such, researchers collect patients' previous imaging from various outside facilities. Key imaging attributes on imported studies, such as study and series descriptors, are normalized according to NIH policies to enable automated data analysis and to simplify the screening process for researchers by ensuring exams hang reliably within their diagnostic viewers for quick and efficient review.
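
For a concrete, if simplified, illustration, the snippet below uses the open-source pydicom library to map inconsistent series descriptions on imported studies to a single institutional vocabulary before import. The mapping table is hypothetical and would in practice reflect the receiving organization's own policies.

```python
import pydicom

# Hypothetical mapping from 'dirty' outside descriptions to institutional standards.
SERIES_MAP = {
    "AX T1 POST GAD": "T1 AXIAL POST",
    "T1_AX_+C": "T1 AXIAL POST",
    "SAG T2 FSE": "T2 SAGITTAL",
}

def normalize_series(path_in: str, path_out: str) -> None:
    """Rewrite SeriesDescription to the standard value before import."""
    ds = pydicom.dcmread(path_in)
    original = str(getattr(ds, "SeriesDescription", ""))
    ds.SeriesDescription = SERIES_MAP.get(original.strip().upper(), original)
    ds.save_as(path_out)
```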

As well, the NIH often performs additional advanced imaging exams throughout clinical research projects. To ensure this newly generated data accurately correlates with the patient's prior imaging history, can be easily inspected by advanced data analysis technologies, and enables efficient review workflows for researchers, the NIH also enforces normalization policies at the modality level.

Because the NIH works with a large number of trainees and fellows, the aforementioned normalization of diagnostic imaging exams provides the added benefit of creating a consistent educational environment for teachers and students, supporting structured and replicable teaching workflows.

Leveraging and Developing Artificial Intelligence and Machine Learning

Artificial Intelligence (AI) and machine learning (ML) may be new entrants into mainstream clinical settings, but the NIH has been using them for many years. They are foundational technologies that enable the automatic identification and comparison of key data elements across hundreds or thousands of relevant research studies, a task that would not be possible manually.

However, some data elements cannot yet be fully analyzed by these technologies. Where bias or variability between reported values or criteria can exist, manual intervention is still required to make appropriate adjustments based on the context within which the values were captured. This is the tricky thing about data normalization: it is often context sensitive, and the level of reasoning required to weigh the ultimate research question or goal when evaluating data relevance and normalization needs cannot yet be reliably accomplished by machines.

For this reason, the NIH continues to support the development and refinement of AI and ML algorithms by leveraging its enormous collection of clean, consistent data to build diverse training environments, and by collaborating with luminary AI and ML technology organizations to advance context-aware data analysis and clinical use cases.

Anonymization/De-identification

Another critical aspect of the research performed at the NIH is the anonymization and/or de-identification of personally identifiable information (PII). To adhere to patient privacy and human subjects research protection regulations, the NIH at times wishes, or is required, to de-identify research data by removing or obfuscating data that could otherwise reveal a patient's identity.

This might be done to allow NIH researchers to conduct secondary analyses without additional human subjects protections approvals (per 45 C.F.R. 46), or to share data with other research collaborators. The NIH accomplishes this through standard de-identification or coding techniques and normalization policies, with the goal of scrubbing data to remove identifiers.

However, the NIH employs a specialized technique, an 'Honest Broker' process, that ensures a link persists between a patient's anonymized/de-identified data and their identifiable data. This ensures researchers can follow up with patients for care-related issues and can continue to follow outside patients for additional or future research purposes, with appropriate safeguards and approvals.
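
The sketch below shows one way such a coded link could be kept; it is a hypothetical illustration, not the NIH's actual implementation. Data released to researchers carries only a random study code, while a separate, access-controlled table held by the honest broker maps that code back to the original identifier.

```python
import secrets

class HonestBroker:
    """Holds the only link between research codes and real identifiers."""

    def __init__(self) -> None:
        self._link = {}  # study code -> original MRN (kept under strict access control)

    def deidentify(self, mrn: str) -> str:
        """Return a random study code; the real MRN never leaves the broker."""
        code = f"SUBJ-{secrets.token_hex(4).upper()}"
        self._link[code] = mrn
        return code

    def reidentify(self, code: str) -> str:
        """Used only with appropriate safeguards and approvals, e.g. care-related follow-up."""
        return self._link[code]

broker = HonestBroker()
code = broker.deidentify("123456")
print(code)                     # e.g. SUBJ-9F3A01BC
print(broker.reidentify(code))  # 123456
```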

The real value of data normalization

From tracking tumor response to new treatment programs to developing statistical models of a population's health characteristics and risk profiles, data curation on the scale achieved by the NIH requires not only the collection of vast amounts of scattered information for analysis, but also mechanisms to transform and normalize incoming data into well-defined, comparable datasets. For the NIH, data normalization enables the extraction of valuable and predictive clinical insights that can be used to improve both clinical outcomes and population-wide health.

If you enjoyed this post subscribe to our blog to be notified when new articles are published.


Filed Under: Analytics, Data Management, Healthcare IT, Imaging, Uncategorized Tagged With: Clinical Research, data normalization, Enterprise Imaging, health data, Health IT, healthcare data, healthcare IT, HealthIT, medical imaging data, radiology data

February 4, 2019 By Laurie Lafleur

Data normalization is an essential activity whenever two or more previously independent facilities are migrated into a single, consolidated data repository, as it preserves the integrity of incoming data by avoiding broken links between patient records and problematic collisions between device and study identifiers.

Jude Mosley, Manager of Information Technology, Imaging at Sentara Healthcare knows this better than anyone. She and her team face this very challenge regularly whenever a new hospital or imaging center is integrated into the 5-hospital network in Norfolk, Virginia. 

Hey, that’s my jacket!

The first challenge is ensuring patient jackets from differing systems are reconciled correctly so that no information is lost or erroneously attached to the wrong patient. Using a master patient index (MPI), Sentara is able to match patients between the source and destination systems and update all incoming studies with the correct patient identifier upon ingestion. This results in a single, consistent set of MRNs across the Sentara network and eliminates the future risk that an old MRN could be mistakenly used.
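
As a simplified sketch of this step, incoming studies are re-stamped with the enterprise MRN returned by the master patient index before they are stored. The lookup structure, field names, and identifier values below are invented for illustration.

```python
# Hypothetical MPI lookup: (source facility, source MRN) -> enterprise MRN.
MPI = {
    ("NEW_SITE", "000123"): "ENT-7781234",
    ("NEW_SITE", "000456"): "ENT-7785678",
}

def reconcile_patient_id(study: dict) -> dict:
    """Replace the source-system MRN with the matched enterprise MRN on ingestion."""
    key = (study["source_facility"], study["patient_id"])
    enterprise_mrn = MPI.get(key)
    if enterprise_mrn is None:
        raise ValueError(f"No MPI match for {key}; route to manual reconciliation")
    return {**study, "patient_id": enterprise_mrn}

print(reconcile_patient_id({"source_facility": "NEW_SITE", "patient_id": "000123"}))
# {'source_facility': 'NEW_SITE', 'patient_id': 'ENT-7781234'}
```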

The next challenge is ensuring study accession numbers remain unique, so that the migration process doesn't introduce duplicate accession numbers for different patients. To accomplish this, Sentara employed normalization policies that prefixed the accession number and updated the location attribute with a 3-character code representing the source facility for all incoming studies. Not only did this avoid potential collisions between accession numbers, it also added a level of transparency to the migrated data, allowing Sentara to quickly and easily identify its original source.
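
A minimal illustration of that policy: each migrated study has its accession number prefixed with a source-facility code, which both prevents collisions and makes the data's origin obvious. The facility names and three-character codes below are invented, not Sentara's actual values.

```python
# Hypothetical 3-character source-facility codes.
FACILITY_CODES = {"General Hospital A": "GHA", "Imaging Center B": "ICB"}

def normalize_accession(study: dict) -> dict:
    """Prefix the accession number and stamp the location with the source code."""
    code = FACILITY_CODES[study["source_facility"]]
    return {
        **study,
        "accession_number": f"{code}{study['accession_number']}",
        "location": code,
    }

print(normalize_accession({
    "source_facility": "Imaging Center B",
    "accession_number": "A100234",
}))
# {'source_facility': 'Imaging Center B', 'accession_number': 'ICBA100234', 'location': 'ICB'}
```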

The Big Bang theory of modalities

While the migration process provides an excellent opportunity to normalize data on the fly, it is often necessary to enforce data normalization policies at the modalities themselves. At Sentara, normalization at the modality level is a well-defined process. With each new modality introduction, update, or site acquisition, Sentara ensures that every affected modality adheres to a consistent set of pre-defined policies for AE title and station naming conventions. When new modalities are installed or vendor updates are applied, Sentara requires vendors to be aware of and sign off on the required data policies, ensuring they are properly applied and thoroughly tested after installation to maintain adherence.
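
A small validation script along these lines could be run after each install or vendor update to confirm every modality still follows the agreed AE title conventions. The naming pattern, site codes, and station list below are hypothetical examples, not Sentara's actual conventions.

```python
import re

# Hypothetical convention: <3-char site code>_<modality type>_<2-digit room>, e.g. ABC_CT_01
AE_TITLE_PATTERN = re.compile(r"^[A-Z]{3}_(CT|MR|US|CR|DX|NM|XA)_\d{2}$")

def check_ae_titles(modalities: list) -> list:
    """Return the modalities whose AE titles violate the naming convention."""
    return [
        f"{m['station_name']}: non-conforming AE title '{m['ae_title']}'"
        for m in modalities
        if not AE_TITLE_PATTERN.match(m["ae_title"])
    ]

print(check_ae_titles([
    {"station_name": "CT room 1", "ae_title": "ABC_CT_01"},
    {"station_name": "Portable X-ray", "ae_title": "xray-portable"},
]))
# ["Portable X-ray: non-conforming AE title 'xray-portable'"]
```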

For larger-scale activities, like a new site acquisition and migration, the PACS team prepares a comprehensive list of all modalities and their new AE titles, station names, and IP addresses, and orchestrates a big-bang update process. While this is no small undertaking, through experience Sentara has refined this process to run like a well-oiled machine. Once complete, Sentara has improved the consistency and supportability of their now-larger infrastructure, and once again ensured that data arrives in a predictable and consistent manner. 

A shining example of normalcy

This case provides excellent examples of how data normalization can address the integration challenges faced by expanding health systems. Not only does it mitigate risk by avoiding data loss and collisions during the migration process, it also measurably improves data quality and reduces support costs in the future by improving the consistency and predictability of systems and data across the entire network.

Our next blog will take a more clinical focus, looking at how data normalization can be leveraged to uncover deep insights across population-wide data sources. Sound interesting? Subscribe to our blog to be notified when the next post becomes available!


Filed Under: Data Management Tagged With: data migration, data normalization, Enterprise Imaging, health data, Health IT, healthcare data, healthcare IT, HealthIT

January 28, 2019 By Laurie Lafleur

Normalization has become an essential activity for any data-driven organization. While it requires an investment of time and resources to define and apply the necessary policies, structures, and values, you will find it is well worth the effort. Not only will you see measurable improvements in the quality and efficiency of clinical workflow, stakeholder satisfaction, and your bottom line, you will also unlock the untapped potential of what could prove to be one of your organization's biggest assets: your data! That being said, let's take a look at the various methods that can be used to make sure your data falls in line and conforms to your new norms.

What does normal look like anyway?

The first step is identifying and defining the attributes to be normalized. This begins with a close look at your organization's key challenges and goals. Having trouble with data integrity, fragmentation, and collisions? Take a look at how unique IDs are assigned, reconciled, and managed at the facility, system, device, and patient levels. Hearing lots of complaints from your clinical team about broken and inefficient workflow? Consider the variations in procedure, exam, and diagnostic naming conventions across your enterprise. Once the troublesome attributes have been defined, key stakeholders should be engaged to define the 'gold standard' values and map them to the 'dirty' values that are plaguing your system.
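
In practice, the output of this exercise is often just a mapping table from known 'dirty' variants to the agreed gold-standard value, something as simple as the sketch below. The procedure names and standard values are hypothetical examples.

```python
# Hypothetical gold-standard mapping produced with clinical stakeholders:
# every known variant points at the single agreed value.
PROCEDURE_MAP = {
    "CHEST XRAY 2VW": "XR CHEST 2 VIEWS",
    "CXR PA/LAT": "XR CHEST 2 VIEWS",
    "CHEST PA AND LATERAL": "XR CHEST 2 VIEWS",
}

def to_gold_standard(value: str) -> str:
    """Return the normalized value, or the original if no mapping exists yet."""
    return PROCEDURE_MAP.get(value.strip().upper(), value)

print(to_gold_standard("Cxr pa/lat"))  # XR CHEST 2 VIEWS
```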

Keeping it clean on-the-fly

Once the normalized values have been defined, transient systems, such as DICOM and HL7 routers, or destination systems, such as EHR/EMRs, Picture Archiving and Communication Systems (PACS), or Vendor Neutral Archives (VNA), can often be configured to inspect data as it arrives, dynamically identifying and adjusting any inconsistencies so that what is stored and shared adheres to normalization policies. This is accomplished through a normalization or 'tag morphing' rules engine that inspects incoming data on the fly, identifies deviations from normalization policies using pattern matching, and applies the appropriate transformations based on the pre-defined set of consistent values. The result is incoming data that is clean, consistent, and reliable, regardless of its original source or format.
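
A tag-morphing rule can be as simple as an attribute name, a pattern, and a replacement. The sketch below is a toy rules engine for illustration, not any vendor's actual implementation, and the rules and values are invented.

```python
import re
from dataclasses import dataclass

@dataclass
class MorphRule:
    attribute: str    # e.g. "StudyDescription"
    pattern: str      # regex identifying a deviation from policy
    replacement: str  # normalized value to apply

RULES = [
    MorphRule("StudyDescription", r"(?i)^ct\s+abd.*pel.*", "CT ABDOMEN PELVIS W CONTRAST"),
    MorphRule("InstitutionName", r"(?i)^st\.?\s+mary.*", "ST MARYS MEDICAL CENTER"),
]

def morph(attributes: dict) -> dict:
    """Apply matching rules to incoming attributes on the fly."""
    out = dict(attributes)
    for rule in RULES:
        value = out.get(rule.attribute, "")
        if re.match(rule.pattern, value):
            out[rule.attribute] = rule.replacement
    return out

print(morph({"StudyDescription": "CT abd/pel w", "InstitutionName": "St. Mary Hospital"}))
# {'StudyDescription': 'CT ABDOMEN PELVIS W CONTRAST', 'InstitutionName': 'ST MARYS MEDICAL CENTER'}
```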

As well, it enables rapid integration of outside facilities and systems resulting from mergers or acquisitions, as new systems and modalities can be brought online quickly without requiring updates to hanging protocols, routing and retention policies, reporting systems, and so on.

Finally, it mitigates the impact of any unforeseen changes that may occur due to vendor updates at the modality, which can sometimes alter attribute values such as series descriptions. This is most common among complex multi-slice acquisition devices, and ultimately results in broken hanging protocols and frustrated radiologists.

Garbage in, garbage out

In some cases, it may be necessary to enforce data normalization policies at the modalities themselves. This is especially important if receiving systems do not provide robust tag morphing capabilities, leaving you without the ability to enforce normalization policies on the fly. Likewise, if your technologists are frequently required to manually enter or adjust data values at the modality, the resulting data sets carry added variability and potential for error that may not always be caught by even a discerning rules engine. In either case, why not take the opportunity to ensure your modalities are sending clean data from the get-go? As the old adage says: garbage in, garbage out.

When you’re going where the grass is greener

If you're considering retiring and replacing ageing systems, data migrations present an excellent opportunity to clean and normalize existing data as it moves between systems, providing immediate benefits like better access to and filtering of relevant priors, improved reading efficiency through consistent and reliable hanging protocols, and the ability to incorporate historic data into analytics, AI, and deep learning applications.

As well, it positions you to minimize the effort involved in any future system replacement or migration by simplifying the configuration of system features that rely on specific attribute values or structures to function effectively. For instance, hanging protocols are not typically transferable between vendors' systems and therefore need to be re-created whenever a PACS replacement or viewer change occurs. The consistency of normalized data facilitates rapid configuration of protocols within a new system by eliminating the complexity of configuring multiple protocols for each distinct layout, or of building the viewer-provided lexicons required to 'map' the various combinations and permutations of inconsistent protocol, study, or series information. The same holds true for other attribute-driven features including, but not limited to, reading worklists, routing or forwarding rules, information lifecycle management policies, and analytics and deep learning.

That sounds like a lot of work…

More often than not, the perceived effort of defining such practices and data structures seems overwhelming to already busy IT departments and project teams. This often results in a choice to forego data normalization activities in favour of saving time and effort, or simply due to a lack of financial and human resources.

While it may be true that data normalization is no small task, the benefits far outweigh the initial investment, and many organizations are now realizing the strategic value of data normalization initiatives. The heaviest lift, by far, is gathering and analyzing existing data and distilling it into normalized data structures and values, a one-time effort that yields immediate and recurring dividends by creating an actionable data repository that supports clinical and business continuous improvement initiatives.

By now you might be thinking, “this sounds good in theory, but it’s pretty anecdotal. Where’s the real-life evidence that data normalization is worth the effort?”

Our next posts will include real-world examples of how some luminary healthcare organizations have leveraged data normalization to achieve a variety of measurable benefits. Subscribe to our blog to be notified when the next post becomes available!


Filed Under: Analytics, Data Management, Healthcare IT, Workflow Tagged With: data migration, data normalization, Enterprise Imaging, health data, Health IT, healthcare data, healthcare IT

