
Full article title Timely delivery of laboratory efficiency information, Part I: Developing an interactive turnaround time dashboard at a high-volume laboratory
Journal African Journal of Laboratory Medicine
Author(s) Cassim, Naseem; Tepper, Manfred E.; Coetzee, Lindi M.; Glencross, Deborah K.
Author affiliation(s) National Health Laboratory Service, University of the Witwatersrand
Primary contact Email: naseem dot cassim at wits dot ac dot za
Year published 2020
Volume and issue 9(2)
Article # a947
DOI 10.4102/ajlm.v9i2.947
ISSN 2225-2010
Distribution license Creative Commons Attribution 4.0 International License
Website https://ajlmonline.org/index.php/ajlm/article/view/947/1482
Download https://ajlmonline.org/index.php/ajlm/article/download/947/1479 (PDF)


Background: Mean turnaround time (TAT) reporting for testing laboratories in a national network is typically static, is not immediately available for meaningful corrective action, and does not allow for test-by-test or site-by-site interrogation of individual laboratory performance.

Objective: The aim of this study was to develop an easy-to-use, visual dashboard to report interactive graphical TAT data to provide a weekly snapshot of TAT efficiency.

Methods: An interactive dashboard was developed by staff from the National Priority Programme and Central Data Warehouse of the National Health Laboratory Service in Johannesburg, South Africa, during 2018. Steps required to develop the dashboard were summarized in a flowchart. To illustrate the dashboard, one week of data from a busy laboratory for a specific set of tests was analyzed using annual performance plan TAT cutoffs. Data were extracted and prepared to deliver an aggregate extract, with statistical measures provided, including test volumes, global percentage of tests that were within TAT cutoffs, and percentile statistics.

Results: Nine steps were used to develop the dashboard iteratively, with continuous feedback for each step. The data warehouse environment conformed and stored laboratory information system (LIS) data in two formats: (1) fact and (2) dimension. Queries were developed to generate an aggregate TAT data extract to create the dashboard. The dashboard successfully delivered weekly TAT reports.

Conclusion: Implementation of a TAT dashboard can successfully enable the delivery of near real-time information and provide a weekly snapshot of efficiency in the form of TAT performance to identify and quantitate bottlenecks in service delivery.

Keywords: turnaround time, laboratory efficiency, interactive dashboard, indicators, performance assessment


Turnaround time (TAT) is an important performance indicator for a laboratory service. It refers to the time from first registration in a laboratory to a result released in the laboratory information system (LIS).[1] Historically, within the National Health Laboratory Service (NHLS) of South Africa, TAT reporting was provided in annual and quarterly static management reports generated by the LIS, TrakCare[2], which also provided ad hoc reporting for use at the laboratory level. These reports are printed to provide a snapshot of TAT reporting and are suited for staff working at the laboratory level. At a national level, the corporate data warehouse (CDW) of the NHLS collated global TAT data from over 266 testing laboratories based on predetermined, annual performance plan cutoffs. National TAT cutoffs are set by expert committees of different pathology disciplines, with final confirmation from senior management before implementation. These cutoffs are set with provisions for all levels of the service laboratory: from low-volume laboratories with limited test repertoires to high-volume testing laboratories with extensive test offerings, including specialized testing, such as viral load testing. However, a large percentage of NHLS laboratories have 24-hour service and have emergency units in the hospitals in which they are housed; such laboratories have locally stricter TAT cutoffs for emergency and other local tests than are reflected in the national cutoffs for all samples.

Historically, the NHLS CDW TAT reports generated were static and reported only the mean TAT. Turnaround time data have a positive skewness, that is, a long tail to the right, meaning that the mean will be greater than the median. This implies that the previously reported TAT data[1][3], based on the mean TAT, both obscure good performance and conceal poor efficiency. Further, neither the current LIS nor CDW reports enable detailed analysis of the information or drilling down to laboratory- or test-level data for additional information about TAT efficiencies. Data presented at the first conference of the African Society for Laboratory Medicine in Cape Town, South Africa, in 2012, reported daily laboratory test volumes and mean TAT for authorized results, stratified by individual laboratories[4], providing a snapshot of performance. This enabled review of CD4 laboratory efficiency for a national program and provided important insights into laboratory operations.
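The masking effect of mean-based reporting on right-skewed TAT data can be illustrated with a small sketch (the TAT values here are hypothetical, chosen only to show the effect):

```python
import statistics

# Hypothetical TAT values in hours: most samples are fast, but a few
# outliers form the long right tail typical of TAT distributions.
tat_hours = [2, 2, 3, 3, 3, 4, 4, 5, 6, 48]

mean_tat = statistics.mean(tat_hours)      # pulled upward by the tail
median_tat = statistics.median(tat_hours)  # robust to the outliers

print(f"mean={mean_tat:.1f} h, median={median_tat:.1f} h")
# The mean (8.0 h) far exceeds the median (3.5 h): reporting the mean
# alone overstates the typical delay while hiding the outliers that
# caused it.
```

A single 48-hour outlier more than doubles the mean, which is why percentile-based measures are preferred for skewed TAT data.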

A recent evaluation of TAT for HIV programs reported a methodology to further categorize laboratory TAT performance using three additional measures[3]: (1) median TAT, (2) 75th percentile TAT (tail size), and (3) percentage within cutoff TAT. These data were graphically presented using a scatter plot of percentage of samples within the TAT cutoff (x-axis) against the TAT 75th percentile (y-axis), categorized into four quadrants of performance to help identify the level of laboratory performance in a national program. This approach made it easier to identify both good performers and outliers in the same analysis.[3] The report was generated in Excel and included the raw monthly data, a scatter plot and a bar graph, and a summary table listing, for all laboratories per business region, the 75th percentile, the percentage within cutoff values, and the TAT component 75th percentile values. This report was primarily distributed to managers of all HIV-related operational programs (that is, CD4, viral load and tuberculosis testing, and early infant diagnosis) for review and intervention, and was not shared across the network of testing laboratories.
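The four-quadrant categorization can be sketched as a simple classification over the two plotted measures. The reference values below are illustrative assumptions, not the published cutoffs, which vary by test and program:

```python
def quadrant(pct_within_cutoff: float, p75_tat: float,
             pct_ref: float = 85.0, p75_ref: float = 8.0) -> str:
    """Classify a laboratory's TAT performance into one of four quadrants.

    pct_ref and p75_ref are illustrative reference lines (the actual
    cutoffs depend on the test and program). High percentage-within-cutoff
    and a short tail (low 75th percentile) is the desirable quadrant.
    """
    good_pct = pct_within_cutoff >= pct_ref
    short_tail = p75_tat <= p75_ref
    if good_pct and short_tail:
        return "good performer"
    if good_pct:
        return "meets cutoff, long tail"
    if short_tail:
        return "short tail, below cutoff"
    return "poor performer"

print(quadrant(92.0, 5.0))   # good performer
print(quadrant(70.0, 20.0))  # poor performer
```

Plotting each laboratory at (percentage within cutoff, 75th percentile) and drawing the two reference lines reproduces the quadrant view described above.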

Senior management in Gauteng, South Africa, expressed a need for a TAT monitoring tool that would enable them to better manage their laboratories and identify sites with poor TAT performance. Given the static nature of the historic TAT reporting in the organization, an interactive system that offered information to enable review of performance, including outliers (tail size assessment), while confirming that sites were meeting cutoffs, would be a useful tool to enable business and laboratory managers alike to monitor their efficiency via TAT performance in real time. The concepts already developed and in use as Excel reports for the HIV programs were the starting point for developing a reporting dashboard for use across multiple disciplines and tests done throughout the network of 241 testing laboratories of the NHLS in South Africa.

The aim of this study was to develop an easy-to-use information system, in a dashboard format. This would enable weekly reporting of TAT data as a snapshot of performance. To achieve this, a number of changes to current TAT reporting had to be addressed. These changes included (1) moving from program-specific, single-test TAT reporting previously used[3] to a specific set of high-volume tests, (2) adopting TAT measures reported by Coetzee et al.[3], (3) identifying dashboard software to use, and (4) identifying the target users.[3] The specific set of tests (or "basket" of commonly requested tests) should be representative across the primary pathology disciplines.

This article sets out to describe the process followed to develop the TAT dashboard—using available software—that could provide a weekly summary of national, business unit, and laboratory-level TAT performance for a basket of tests. For the purposes of this article, data from a single participating tertiary laboratory were used to illustrate the data distribution, as it represents an example of a testing facility that performed all tests reviewed in the prescribed national basket. This was done to show the respective levels of drilling functionality of the dashboard and to illustrate the interactive properties, while demonstrating how the dashboard can be used to assess performance and identify outliers for intervention.


Ethical considerations

Ethics clearance for this study was obtained from the University of the Witwatersrand (M1706108). Only anonymized laboratory data, containing no patient identifiers, were used for the study.

Study design

A retrospective descriptive study design was used to analyze and report laboratory TAT data for a specific set of tests (Table 1).

Tab1 Cassim AfricanJLabMed2020 9-2.jpg

Table 1. Sample of a turnaround time dashboard table that lists the outcomes for the basket of tests, South Africa, 2018

Steps to developing a turnaround time dashboard

The various steps required to develop the dashboard are summarized in a flowchart (Figure 1).

Fig1 Cassim AfricanJLabMed2020 9-2.jpg

Fig. 1. Flowchart depicting all steps required to develop a turnaround time dashboard, South Africa, 2018

Sample population and turnaround time definition

Using convenience sampling, data were selected from among the tests performed at a single busy academic laboratory in Gauteng for one week during 2018. Aside from global TAT reporting, the dashboard should adopt the three TAT measures reported by Coetzee et al.[3] These correspond to the pre-analytical, analytical, and post-analytical components of TAT, namely: (1) the time from first registration at the source laboratory to registration of the referral at the testing laboratory (LAB-to-LAB TAT), (2) the time from registration at the testing laboratory to results being populated by the LIS interface (TESTING TAT), and (3) the time from result population by the LIS interface to manual review and authorization by senior laboratory staff (REVIEW TAT).
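The three component measures reduce to differences between LIS timestamps. A minimal sketch (the parameter names are assumptions for illustration, not the actual LIS schema):

```python
from datetime import datetime


def tat_components(registered_source: datetime,
                   registered_testing: datetime,
                   result_populated: datetime,
                   reviewed: datetime) -> dict:
    """Return the three TAT components in hours for one sample.

    Parameter names are illustrative; the actual LIS fields differ.
    LAB_TO_LAB: source registration -> testing-laboratory registration
    TESTING:    testing registration -> result populated by LIS interface
    REVIEW:     result populated -> manual review and authorization
    """
    hours = lambda delta: delta.total_seconds() / 3600
    return {
        "LAB_TO_LAB": hours(registered_testing - registered_source),
        "TESTING": hours(result_populated - registered_testing),
        "REVIEW": hours(reviewed - result_populated),
    }


print(tat_components(datetime(2018, 9, 3, 8, 0),
                     datetime(2018, 9, 3, 20, 0),
                     datetime(2018, 9, 4, 2, 0),
                     datetime(2018, 9, 4, 3, 0)))
# {'LAB_TO_LAB': 12.0, 'TESTING': 6.0, 'REVIEW': 1.0}
```

For a locally collected sample the two registration timestamps coincide, so LAB_TO_LAB is zero—the same interpretation given for non-referred samples later in this article.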

Test basket development and inclusion and exclusion criteria

Focus group meetings were arranged with local area and business managers to define a test basket for the dashboard. The principles adopted were as follows: (1) measure a limited number of tests with a focus on the tests with the highest volumes of tests performed; (2) measure data for the indicator analyte (as a proxy) for specific panel tests (for example, the creatinine test was used as an indicator for assessment of urea and electrolyte test performance); (3) use the annual performance plan TAT cutoffs; and (4) deliver dashboard files via email (due to bandwidth constraints). All samples within this organization test basket were included in the example analysis and included the most commonly requested tests selected from hematology, coagulation, HIV-tuberculosis, and chemistry (Table 1). A mapping table was developed to identify the LIS test sets and items to be reported. For each test, the TAT cutoff was also stipulated. The mapping table was used to guide the data extract.
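The mapping table can be represented as a simple lookup from LIS test code to reporting name, discipline, and TAT cutoff. The codes, names, and cutoff values below are illustrative examples, not the actual NHLS basket:

```python
# Illustrative mapping of LIS test codes to reporting details and annual
# performance plan TAT cutoffs (hours). Entries are examples only, not
# the actual NHLS test basket or cutoffs.
TEST_BASKET = {
    "CREAT": {"name": "Creatinine", "discipline": "Chemistry", "cutoff_h": 24},
    "FBC":   {"name": "Full blood count", "discipline": "Haematology", "cutoff_h": 24},
    "HIVVL": {"name": "HIV viral load", "discipline": "Virology", "cutoff_h": 96},
}


def cutoff_for(test_code: str) -> int:
    """Look up the stipulated TAT cutoff for a test code in the basket."""
    return TEST_BASKET[test_code]["cutoff_h"]


print(cutoff_for("CREAT"))  # 24
```

A table of this shape both restricts the extract to the basket (only listed codes are reported) and attaches the cutoff needed for the percentage-within-cutoff calculation.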

Data extraction

For the purposes of demonstrating how the data were manipulated to create the dashboard, data were extracted for the week of September 2–8, 2018 from the CDW from four data sources: (1) the Operational Data Store that contained the original LIS data (Figure 2), (2) the "CDW fact" that reported test volumes, (3) the test method dimension[5] (which provides details on the test such as a unique identifier, discipline, test method code and name, and national number from the CDW), and (4) the TAT cutoff dimension (which captures annual performance plan cutoffs) (Figure 1).[6] Using an outer join, data from these four data sources were prepared as a temporary detailed table. This first temporary table limited data to the test basket, added the TAT cutoffs, and provided information using the laboratory hierarchy (region, business unit, and laboratory). Because this table would be too large to use for the dashboard, and assuming email delivery of the final report, two additional steps were used to create a smaller aggregate dataset. The mean, standard deviation, 75th percentile, and percentage within TAT cutoff were added. All TAT data were reported in hours. The final temporary table was exported as a Microsoft Excel (Redmond, Washington, United States)[7] worksheet and imported into the MicroStrategy Desktop analytics tool (Tysons Corner, Virginia, United States).[8] After the data were imported, the respective dashboard sheets were developed to include relevant TAT information for all levels of management.
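The aggregation step—collapsing the detailed table into per-test volumes, mean, standard deviation, 75th percentile, and percentage within cutoff—can be sketched as follows. The actual extract ran as queries against the CDW; the record layout here is an assumption for illustration:

```python
import statistics


def aggregate_tat(records, cutoffs):
    """Aggregate per-test TAT statistics from a detailed extract.

    records: iterable of (test_code, tat_hours) pairs (illustrative
             layout, not the CDW schema).
    cutoffs: dict mapping test_code -> stipulated TAT cutoff in hours.
    """
    by_test = {}
    for test, tat in records:
        by_test.setdefault(test, []).append(tat)

    summary = {}
    for test, tats in by_test.items():
        tats.sort()
        cutoff = cutoffs[test]
        summary[test] = {
            "volume": len(tats),
            "mean": statistics.mean(tats),
            "sd": statistics.stdev(tats) if len(tats) > 1 else 0.0,
            # statistics.quantiles with n=4 returns the three quartiles;
            # index 2 is the 75th percentile (tail-size measure).
            "p75": statistics.quantiles(tats, n=4)[2],
            "pct_within_cutoff": 100 * sum(t <= cutoff for t in tats) / len(tats),
        }
    return summary
```

The resulting aggregate is orders of magnitude smaller than the sample-level table, which is what makes the ≤ 6 MB email delivery constraint achievable.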

Fig2 Cassim AfricanJLabMed2020 9-2.jpg

Fig. 2. Visual representation of data preparation steps to transform and move raw turnaround time data, South Africa, 2018. Data were moved from the Operational Data Store (the laboratory information system) to a production server, which consists of facts and dimensions. The production server structures the data by assigning sets of data to specific target areas to facilitate final reporting in dashboard format.

Criteria for an effective dashboard

For any dashboard to be effective, it needs to meet a number of key requirements: it should (1) be visually engaging and make the displayed TAT data easy to view and understand; (2) enable dynamic drilling down from a bird’s-eye view to a local perspective, that is, from the national or provincial level down to the laboratory level per test; (3) provide a report on a weekly basis for a TAT snapshot view; and (4) highlight TAT outliers for laboratory managers to follow up and direct corrective action. From a more technical perspective, the dashboard also had to include additional features: (1) conditional formatting to highlight good, average, and poor performance; (2) provision of various reporting formats such as bubble charts, tables, and bar charts; (3) the ability to import data from a variety of formats; and (4) the ability to send the weekly dashboard data file via email in a small file format (≤ 6 MB).

Data analysis and visual dashboard display

The dashboard displays (sheets) developed were as follows: (1) a bubble chart reporting the percentage within TAT cutoffs and 75th percentiles, (2) a table (see Table 1) displaying the bubble chart data, and (3) the 75th percentile for each phase of the component TAT reported by the test method. A bubble chart dashboard sheet was created to include: (1) the 75th percentile TAT (y-axis), (2) the percentage within TAT cutoff (x-axis), (3) the test volumes (size by and color by), and (4) the test method (color by and break by). The region codes and laboratory names were added to the dashboard as filters (radio buttons and search box display styles). An 85% within TAT cutoff reference line was added to aid identification of specific tests and associated laboratories with TAT that were outside of the TAT cutoffs. The data used to generate the bubble chart were also reported as a table in a separate sheet. The table listed the test name, total number of tests, TAT cutoff, the percentage within TAT cutoff, and 75th percentile TAT for the basket of tests reported on.

The table uses "stop highlighting" to denote the different percentage-within-TAT-cutoff bands as follows: (1) 85% or higher in green, (2) 75%–84% in orange, and (3) under 75% in red. A lower percentage within TAT cutoff and a higher 75th percentile indicate an increased risk that any given laboratory is not adequately delivering patient reports that will enable timely clinical intervention.
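The stop-highlighting rule maps directly onto a small function (a sketch of the three bands described above):

```python
def stop_light(pct_within_cutoff: float) -> str:
    """Map a percentage-within-cutoff value to its highlight colour.

    Bands: >= 85% green, 75%-84% orange, < 75% red.
    """
    if pct_within_cutoff >= 85:
        return "green"
    if pct_within_cutoff >= 75:
        return "orange"
    return "red"


print(stop_light(83.8))  # orange (the rapid plasma reagin example)
```

In the dashboard itself these bands are implemented as conditional formatting on the table sheet rather than code, but the thresholds are the same.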

A component TAT sheet was created as a clustered horizontal bar chart to display the component TAT with (1) test method name (y-axis) and (2) component 75th percentile TAT (LAB-LAB TAT, TESTING TAT and REVIEW TAT), differentiated by color. The testing laboratory name was added to enable refining and filtering data down to the laboratory level.


The successfully developed dashboard enabled delivery of weekly TAT data. Data from 45,599 reported samples for the week of September 2–8, 2018 were utilized to demonstrate the dashboard development described here. The 75th percentile and the percentage of tests within stipulated cutoff for each test in the basket were visualized on the dashboard landing screen. This allowed the user to view data by test at the national, provincial and laboratory levels to visually identify outlying tests. The dashboard contained three individual sheets: the bubble chart, the TAT table, and the component TAT sheets.

Figure 3 shows typical weekly TAT data presentation outcomes as a bubble chart dashboard. In this example data set, only one test method, rapid plasma reagin (syphilis), failed to meet the 85% TAT cutoff and is reported as a small grey dot on the bubble chart dashboard. For this test, the reported percentage within cutoff was 83.8%, falling in the 75%–84% category highlighted in orange (see Table 1). A cluster of tests in the bottom right reported 90% or higher within TAT cutoff with a 75th percentile TAT of eight hours or less. Only one test reported a percentage within TAT cutoff between 85% and 89%: total cholesterol (red dot). Higher test volumes were reported for the HIV viral load (n = 19,055) and creatinine (n = 8,857) tests.

Fig3 Cassim AfricanJLabMed2020 9-2.jpg

Fig. 3. An example of the MicroStrategy Desktop bubble dashboard chart used to report total turnaround time data for one week at an example site, South Africa, 2018. The percentage within cutoff turnaround time is reported on the x-axis with the 75th percentile turnaround time on the y-axis. The bubble size indicates test volumes. Reference lines were added at 85% within stipulated turnaround time cutoff on the x-axis. Each test within the test basket is colour coded with the key provided on the right. Outlying tests are immediately visible.

For the TAT table, results for the bubble chart are summarized (see Table 1) per test. At the 75th percentile TAT, no test exceeded the cutoff TAT. A 100% within cutoff TAT was reported for three tests: activated partial thromboplastin time, full blood count, and platelet count. Similarly, six tests reported a percentage within cutoff TAT between 95% and 99%.

The dashboard also reports component TAT in hours (Figure 4), including (1) LAB-TO-LAB, (2) TESTING, and (3) REVIEW times, with the tail size in hours for the distribution of each component TAT. In any given laboratory, some samples tested are local (from the immediately adjacent hospital), while other samples are referred for testing from nearby hospitals where these tests are not available. As such, a zero LAB-TO-LAB component indicates that the samples were not referred but are samples collected and tested locally. For referred samples included in the example data set (see Figure 4, CD4 antiretrovirals, D-Dimer, and HIV viral load, among others), the LAB-TO-LAB component TAT 75th percentile represents the inter-laboratory referral time, ranging in this instance from 12 to 23 hours (Figure 4). In the testing phase, TAT ranged from 0.25 to 63 hours (where 63 hours represented a single test, the rapid plasma reagin, syphilis, that was regarded locally as an outlier; see Table 1 for detail). The 75th percentile review TAT was two hours or less across all tests.

Fig4 Cassim AfricanJLabMed2020 9-2.jpg

Fig. 4. MicroStrategy Desktop dashboard bar chart used to report the component turnaround time data for an example site for the week of September 2–8, 2018, South Africa. The components reported are the LAB-LAB (inter-laboratory referral time), TESTING (time from registration to testing), and REVIEW (time from testing to review) turnaround times.


Access to information in an interactive dashboard format has previously enabled retrieval of health data for immediate clinical use in the NHLS in South Africa.[3] A similar approach has been applied and demonstrated in this work for TAT data. The dashboard described here provides an interactive, weekly snapshot of TAT performance, together with information about TAT distribution and tail size (outlier) assessment[1][3], to laboratory managers at varying levels across the NHLS, to enable timely intervention where poor service delivery is identified.

The dashboard comprises a few basic parameters that act together to provide information about TAT. Date stamping of samples in the LIS is a prerequisite to provide the basic information necessary to detail the TAT linked to any given sample. Together with the relevant sample identification data logged, these data are transferred to a central database for careful curation. Later, TAT data extraction is performed using standard data query tools. In the instance of a wider network of laboratories operating within the same organization, such as the South African NHLS, LIS data is stored using a decentralized architecture. Aggregate data, in the format described above, can then be collated and used to develop national TAT dashboards.

The dashboard described in this study simplifies presentation of complex data by enabling visualization of any given laboratory’s efficiency. For the purpose of this study and to demonstrate the effectiveness and simple format of the dashboard developed, data from a single busy laboratory were used to illustrate the different outputs of the dashboard (graphs and table). The example data used here reveal how the dashboard can be used to identify tests that are not meeting national (or local) cutoff criteria. In the example presented, rapid plasma reagin (syphilis) testing was noted as an outlier, as it did not meet the organization-stipulated 85% within cutoff TAT. The summary table (example shown in Table 1) also provides a spreadsheet format table of the relevant tests either meeting, or failing to meet, the national cutoff criteria. The additional information on TAT component analysis further assists management to identify those areas of laboratory testing, within the respective pre-analytical, analytical, and post-analytical components, that may need investigation for improvement.

The dashboard was successfully rolled out to all NHLS testing laboratories; weekly data are currently received by these laboratories for review. The dashboard development included a drill-down function into the performance of a particular test to see results by testing laboratory or business unit. The addition of tail size measures[1][3] has also enabled managers to identify less efficient areas of their laboratory services with outlying performance, as seen in the example case described in this article (rapid plasma reagin – syphilis), which would otherwise have been missed using conventional (mean TAT) reporting alone. In addition, to enable practical sample-by-sample audit, individual samples that did not meet cutoff criteria were identified for follow-up in an additional summary table sheet (added at the request of laboratory managers to enable better intervention; not shown). Use of the dashboard has also led to laboratory process changes with improved individual component TAT. For example, post-analytical TAT improvements included implementation of an "auto review" feature.[9][10] With respect to analytical delays identified, testing delays could be correlated with instrument breakdown logs from the laboratories or instrument suppliers to identify reasons for prolonged testing TAT.[9] The impact of dashboard usage on improving TAT is described in detail in the companion article in this issue.[9]

Risk management teaches that not all errors can be predicted.[11] However, it is only through active review of quality processes that delays, errors, and problems can be detected earlier to enable corrective action. Thus, critical to managing risk is the continuous and ongoing evaluation and assessment of procedures and processes to ensure that the same errors are not repeated. Here, human capital is key to the sustainability and success of any dashboard implementation. Noble et al.[11] reported that only the persistence and interest of laboratory personnel in maintaining quality can ensure error detection (and correction back to quality) that is fast and sustainable.

One of the fundamental lessons learned from the development of the dashboard described here is that providing tools to assess TAT performance does not in itself imply corrective action or improvement. The dashboard is merely a tool that enables managers to effectively and efficiently ensure procedural excellence. Nkengasong and Birx[12] also suggest that in order for innovation to be adopted and sustained, innovation and performance enablers should both energize and incentivize laboratories across four pillars: implementation, measurement, reward, and improvement. A culture of diligence and willingness on the part of managers to meaningfully use the information provided in the dashboard is thus important to enable consequential changes at the laboratory level. Political will and strong senior leadership are also needed to make systems, such as those introduced with the dashboard described here, both functional and sustainable.[12] This can be done by appropriately recognizing and rewarding laboratories and personnel who use the tools provided.

Pre-analytical errors should not be underestimated, as they can increase both testing errors and TAT.[12] In the example laboratory performance reported here, all four referred test TAT outcomes were compromised due to pre-analytical delays. Documenting these delays and acting to reduce pre-analytical time, including travel time and time spent in receiving centers prior to sample registration, can help streamline services.

Another outcome reported by managers using the dashboard was that the information could be documented week by week to provide objective evidence motivating for the additional resources required to achieve TAT cutoffs, for example additional sample collection schedules, increased testing capacity, and auto-review and authorization modules.[9]

It is important in the context of a resource-poor setting to highlight that the dashboard described here was developed without specific funding, relying only on the collaborative effort of NHLS staff (the authors) with data management or MicroStrategy skills. Data is routinely transferred from the LIS to the CDW, where it is collated and carefully curated for downstream research and operational needs. Initial analyses were undertaken using CDW-extracted data analyzed in MS Excel to create simple charts plotting the 75th percentile and median TAT, by laboratory, for annualized or quarterly aggregated TAT data. Thereafter, analyses were extended to create week-by-week practical and usable worksheets so that individual laboratories could view current data. Using MicroStrategy, a freely available software program, a dashboard was developed to enable automatic presentation of the data in a visible, interactive format (with the snapshot aggregate data file emailed to users weekly) to facilitate automated, more immediate access to current TAT data. Future planning includes providing live data in the dashboard, facilitated by extending local bandwidth capacity, and immediate real-time analysis of data within the CDW itself.


This article outlines the database management and methods used for the development of a dashboard that enables presentation of weekly TAT data to relevant business and laboratory managers, as part of the overall quality management portfolio of the organization. This novel approach ensures the delivery of quality, timely pathology reporting by the South African NHLS and, ultimately, better patient care. Training on the use of the dashboard is essential to ensure that users are competent. Users need to understand both the principles applied in the dashboard and the functionality embedded in it. Political will and leadership are vital to ensure that deficiencies identified by the dashboard lead to better quality and more efficient and timely laboratory services.

As African laboratories move toward increasing the number of centers that prepare for or achieve accreditation[12], it is vital that laboratories are aware of the commitment needed to continually monitor, evaluate, and re-assess their status quo. Such commitment will ensure that the quality of the laboratory services they offer shows improvement over time. It is therefore important to consider what is required to achieve and maintain the quality of testing to avoid costly pitfalls[13] and inaccurate or delayed result reporting. In this regard, although much of the focus of quality management is placed on quality of tests themselves, time management in a laboratory is as crucial as assuring the quality of the tests performed. Without timely delivery of patient results, appropriate and meaningful clinical management of patients cannot be accomplished.


The data presented in this study focus on the within-laboratory network TAT and did not record or assess delays outside the laboratory capture net. Pre-analytical TAT referred to in this work denotes the time taken to transport a sample from a receiving laboratory to a testing laboratory. Ideally, sample tracking systems that relay tracking data to the central data warehouse, linked to discrete samples, will enable total end-to-end service assessment of TAT.

Lessons from the field

The dashboard subsequently developed has been extended to the top 22 highest-volume tests performed across the organization but does not report data for disciplines such as microbiology or anatomical pathology, or more specialized units such as cytogenetics or immunology. Plans are underway to broaden the test basket and to additionally include critical tests such as cardiac troponin levels, shown in other work (not reported here) to have TAT that currently falls beyond meaningful clinical impact.

The data presented provide only a weekly snapshot. As technology permits, it is important to extend this dashboard at the data warehouse level using business intelligence analytics tools that enable real-time reporting. It is envisaged that laboratories could use large in-laboratory screens to track real-time progress for immediate response and corrective action where required. Alternatively, remote management could be facilitated by displaying live TAT performance on specially developed mobile devices.

The dashboard data reported here do not distinguish between different levels of service (i.e., tertiary versus primary and secondary hospitals) or different levels of patient care (intensive care unit, STAT lab, trauma departments); data are aggregated and compared to the national cutoff for each test. However, individual laboratories have established locally relevant TAT cutoffs for emergency and routine contexts depending on the level of care (primary versus tertiary).
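
The per-test metrics plotted in the dashboard's bubble chart (percentage of results within the national cutoff on the x-axis, 75th-percentile TAT on the y-axis, bubble size as test volume) can be sketched in plain Python. The test names, cutoff values, and TAT observations below are illustrative assumptions, not NHLS data.

```python
from statistics import quantiles

# Hypothetical weekly TAT records (hours) per test; the names and national
# cutoffs here are illustrative only, not the actual NHLS values.
tat_records = {
    "CD4": {"cutoff_hours": 48, "tat_hours": [12, 20, 30, 45, 50, 61, 18, 26]},
    "Creatinine": {"cutoff_hours": 24, "tat_hours": [4, 6, 8, 10, 30, 5, 7]},
}

def weekly_summary(records):
    """Per-test volume, % of results within cutoff, and 75th-percentile TAT."""
    summary = {}
    for test, data in records.items():
        tats = data["tat_hours"]
        within = sum(1 for t in tats if t <= data["cutoff_hours"])
        # quantiles(..., n=4) returns the three quartiles; index 2 is Q3 (75th percentile).
        q3 = quantiles(tats, n=4)[2]
        summary[test] = {
            "volume": len(tats),
            "pct_within_cutoff": round(100 * within / len(tats), 1),
            "p75_tat_hours": q3,
        }
    return summary

print(weekly_summary(tat_records))
```

Each summary row then maps directly onto one bubble, and the 85%-within-cutoff reference line used in the dashboard becomes a simple threshold check on `pct_within_cutoff`.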


Acknowledgements
The authors thank area, business, and laboratory managers in Gauteng for their participation in the pilot project. The test basket used in the development of this dashboard was determined in consultation with this group. The authors also thank Mr. Bahule Motlonye of the National Health Laboratory Service for his input during the pilot phase of the dashboard development undertaken during 2017.

Author contributions

D.K.G. supervised the study by providing leadership and oversight as the project leader. N.C., M.E.T. and L.M.C. designed the study, developed the methodology and conducted the research. M.E.T. developed the data systems to deliver a dashboard. M.E.T., N.C. and L.M.C. conducted the data analysis. D.K.G. reviewed the data, provided editorial comments and technical input. All authors contributed to the manuscript development.


Disclaimer
The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of any affiliated agency of the authors.

Competing interests

The authors declare no conflict of interest.


References
  1. Hawkins, R.C. (2007). "Laboratory turnaround time". The Clinical Biochemist Reviews 28 (4): 179–94. PMC 2282400. PMID 18392122. http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=PMC2282400. 
  2. "TrakCare Lab Enterprise". InterSystems. https://www.intersystems.com/products/trakcare/trakcare-lab-enterprise. Retrieved 03 December 2018. 
  3. Coetzee, L.-M.; Cassim, N.; Glencross, D.K. (2018). "Using laboratory data to categorise CD4 laboratory turn-around-time performance across a national programme". African Journal of Laboratory Medicine 7 (1): 665. doi:10.4102/ajlm.v7i1.665. PMC 6111574. PMID 30167387. http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=PMC6111574. 
  4. Drury, S.; Coetzee, L.-M.; Cassim, N. et al. (2012). "Using Central Data Warehouse (CDW) Reports for Monitoring CD4 Laboratory Workload and Related Turn-Around-Time (TAT)". Proceedings of the First International Conference of the African Society for Laboratory Medicine. doi:10.13140/2.1.1277.7922. 
  5. Chapple, M. (2018). "Facts vs Dimensions Tables in a Database". Lifewire. https://www.lifewire.com/facts-vs-dimensions-1019646. Retrieved 03 December 2018. 
  6. Carmona, S.; Macleod, W. (November 2016). "Development of paediatric, VL and CD4 dashboards and results for action reports" (PDF). Right To Care. http://www.righttocare.org/wp-content/uploads/2016/11/Programme.pdf. Retrieved 03 December 2018. 
  7. "Apps and services". Microsoft. https://www.microsoft.com/en-za/microsoft-365/products-apps-services. Retrieved 03 December 2018. 
  8. "Download MicroStrategy Desktop". MicroStrategy. https://www.microstrategy.com/en/get-started/desktop. Retrieved 03 December 2018. 
  9. Cassim, N.; Coetzee, L.-M.; Tepper, M.E. et al. (2020). "Timely delivery of laboratory efficiency information, Part II: Assessing the impact of a turn-around time dashboard at a high-volume laboratory". African Journal of Laboratory Medicine 9 (2): a948. doi:10.4102/ajlm.v9i2.948. PMC 7203269. PMID 32391245. http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=PMC7203269. 
  10. Krasowski, M.D.; Davis, S.R.; Drees, D. et al. (2014). "Autoverification in a core clinical chemistry laboratory at an academic medical center". Journal of Pathology Informatics 5 (1): 13. doi:10.4103/2153-3539.129450. PMC 4023033. PMID 24843824. http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=PMC4023033. 
  11. Noble, M.A.; Martin, R.; Ndihokubwayo, J.-B. (2014). "Making great strides in medical laboratory quality". African Journal of Laboratory Medicine 3 (2): 256. doi:10.4102/ajlm.v3i2.256. PMC 5637788. PMID 29043199. http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=PMC5637788. 
  12. "Quality matters in strengthening global laboratory medicine". African Journal of Laboratory Medicine 3 (2): 239. 2014. doi:10.4102/ajlm.v3i2.239. PMC 4956090. PMID 27453824. http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=PMC4956090. 
  13. "Building resources to meet evolving laboratory medicine challenges in Africa". African Journal of Laboratory Medicine 7 (1): 915. 2018. doi:10.4102/ajlm.v7i1.915. PMC 6296021. PMID 30568895. http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=PMC6296021. 


Notes
This presentation is faithful to the original, with only a few minor changes to presentation. Grammar was cleaned up for smoother reading. In some cases important information was missing from the references, and that information was added. The URL to Carmona and Macleod's Development of paediatric, VL and CD4 dashboards and results for action reports is dead, and an archived or replacement version could not be found on the internet. The original reference #11 looks like a repeat of #9 and was not included for this version.