How will outcomes be tracked and measured?

Once you have established outcomes and metrics for Whole Systems Integrated Care, the next step is to build a practical mechanism for measuring and tracking them. While outcomes can be qualitative, they will need to be measured and tracked quantitatively. This requires effective data collection, data consolidation, data visualisation and management structures to review and act on the metrics measured. Many of the practical issues around informatics are discussed in Chapter 11: What informatics functionality will we need?

Where possible, outcome measurement systems should be built early. This establishes a baseline against which future programmes can be compared, and makes providers more comfortable being measured against metrics that are already familiar to them.

WHAT WILL ENABLE EFFECTIVE DATA COLLECTION?

Effective data collection makes use of:

  • A small number of important metrics. Carefully selected metrics can encourage positive behaviours across a broad range of areas that you do not need to measure directly. A smaller set also reduces the burden of data collection, making it easier to review the data quickly and frequently.
  • Metrics that constrain a provider’s model of care as little as possible. Insofar as metrics need to capture process, they should avoid prescribing a particular way of working.
  • A mixture of old and innovative metrics. Old metrics are important because they provide a basis for comparison over time and are more trusted by both commissioners and providers. Innovative metrics give commissioners new ways to reward providers for delivering the best outcomes for service users.
  • Metrics which are unambiguous. The data will be collected by a large number of providers in different settings, so, as far as possible, metrics should not depend on local norms or conventions. A precise, shared specification for each metric helps here; a minimal sketch follows this list.
  • Metrics which are easy to collect. Where metrics are onerous, completion rates and data accuracy fall. There is therefore a trade-off between metrics that capture the real issues but are complicated, and metrics which allow reliable data collection.
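
As an illustration only (the metric name, data fields and target below are hypothetical rather than drawn from this guidance), one way to keep a metric unambiguous across providers is to publish a precise, machine-readable specification alongside the plain-English definition. A minimal Python sketch:

    # Illustrative sketch only: each metric is defined with an explicit numerator,
    # denominator and inclusion rule so every provider collects and reports it the same way.
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class MetricSpec:
        name: str                            # short, unambiguous label used on reports
        description: str                     # plain-English definition shared with providers
        numerator: Callable[[Dict], bool]    # rule: does this record count towards the numerator?
        denominator: Callable[[Dict], bool]  # rule: does this record belong in the denominator?
        target: float                        # agreed target, expressed as a proportion

    def compute(metric: MetricSpec, records: List[Dict]) -> float:
        """Return the metric value for a list of patient-level records."""
        in_denominator = [r for r in records if metric.denominator(r)]
        if not in_denominator:
            return 0.0
        in_numerator = [r for r in in_denominator if metric.numerator(r)]
        return len(in_numerator) / len(in_denominator)

    # Hypothetical metric: proportion of people aged 75+ with a care plan reviewed in the last year.
    care_plan_completion = MetricSpec(
        name="care_plan_completion_75plus",
        description="Proportion of registered patients aged 75+ with a care plan reviewed in the last 12 months",
        numerator=lambda r: r["age"] >= 75 and r["care_plan_reviewed_within_12m"],
        denominator=lambda r: r["age"] >= 75,
        target=0.90,
    )

    sample = [
        {"age": 80, "care_plan_reviewed_within_12m": True},
        {"age": 77, "care_plan_reviewed_within_12m": False},
        {"age": 60, "care_plan_reviewed_within_12m": False},
    ]
    print(f"{care_plan_completion.name}: {compute(care_plan_completion, sample):.0%}")  # 50%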

HOW CAN WE CREATE A DASHBOARD?

Quick, real-time review of metrics and outcomes requires the creation of a visualisation dashboard, which allows providers and commissioners to identify areas where outcomes do not match aspirations. The exhibit below summarises what a dashboard is and what it is not.

Best-practice dashboards typically do the following (a minimal illustrative sketch follows the list):

  • Place the most important metrics side-by-side on a single page.
  • Present key metrics from each important aspect, spanning quality of life, quality of care, financial sustainability, professional experience and operational performance.
  • Use colour and visuals to quickly draw attention to the areas that need it.
  • Are updated frequently, ideally automatically, and reviewed regularly.
  • Allow users to drill down into further detail on areas that trigger concern.
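
As a minimal sketch only (the metrics, values, targets and thresholds below are invented for illustration and do not come from any real dashboard), a one-page summary can place key metrics from each domain side by side and use a simple red/amber/green status to draw attention to the areas that need drilling into:

    # Illustrative sketch only: key metrics from several domains shown side by side,
    # with a simple red/amber/green status drawing attention to areas of concern.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class DashboardMetric:
        domain: str       # e.g. quality of care, financial sustainability
        name: str
        value: float      # latest reported value, as a proportion
        target: float     # agreed target, as a proportion

        def status(self) -> str:
            """RAG rating: green at or above target, amber within 5 points, red below that."""
            if self.value >= self.target:
                return "GREEN"
            if self.value >= self.target - 0.05:
                return "AMBER"
            return "RED"

    def render(metrics: List[DashboardMetric]) -> None:
        """Print a one-page summary; RED rows are the areas to drill into."""
        print(f"{'Domain':<26}{'Metric':<36}{'Value':>8}{'Target':>8}  Status")
        for m in metrics:
            print(f"{m.domain:<26}{m.name:<36}{m.value:>8.0%}{m.target:>8.0%}  {m.status()}")

    # Hypothetical values spanning the domains listed above.
    render([
        DashboardMetric("Quality of life", "Feel supported to manage condition", 0.78, 0.80),
        DashboardMetric("Quality of care", "Care plan completion (75+)", 0.62, 0.90),
        DashboardMetric("Financial sustainability", "Spend within capitated budget", 0.97, 1.00),
        DashboardMetric("Professional experience", "Staff would recommend the service", 0.85, 0.80),
        DashboardMetric("Operational performance", "Discharge summaries within 48h", 0.91, 0.95),
    ])

In practice the same logic would sit behind a visual tool that refreshes automatically from the consolidated data and lets users drill down into the records behind any red or amber metric.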

EXAMPLE DASHBOARD: THE ARKANSAS HEALTH CARE QUALITY DASHBOARD

The Agency for Healthcare Research and Quality (AHRQ), part of the US Department of Health and Human Services, maintains a dashboard that allows different states to rate their performance on different outcome measures.

An example of the outcomes measured is shown in the exhibit below, and a more detailed list of the hundreds of metrics used to derive these outcomes can be found at nhqrnet.ahrq.gov.