Contributing lab leader: Busadee Pratumvinit, MD
Patient-based real-time quality control: What does it mean for the lab
Quality control in laboratories is defined by the World Health Organization as “a set of procedures to continuously assess laboratory work and the emergent results”.1 The process is essential to ensure that equipment is performing as it should, and that test results are accurate and consistent.2,3
As reliable results are so important to safeguard patient health, traditional internal quality-control strategies have come under recent scrutiny due to limitations such as the lack of proven commutability of materials with patient samples, low frequency of testing, and cost.3 Developments in the field of patient-based real-time quality control could overcome some of these challenges.
Professor Busadee Pratumvinit is Deputy Head of the Department of Clinical Pathology at Siriraj Hospital, where around seven million clinical chemistry tests are performed each year.4 The hospital recently introduced patient-based real-time quality control procedures into its laboratory, and Professor Pratumvinit provided an overview of the technique at Roche Experience Days (RED) 2024.
Article highlights:
- Quality control in laboratories is crucial to ensure diagnostic tests are accurate and reliable.
- Patient-based real-time quality control is a method that utilizes real patient samples to monitor assay performance.
- Implementation of the patient-based real-time quality control method can be complex but provides key benefits to complement traditional quality control methods.

“By definition, patient-based real-time quality control techniques use data from patient samples to detect analytical errors,” explains Professor Pratumvinit. This differs from traditional methods, which use externally sourced control materials to test equipment. Patient-based real-time quality control can be complex to set up, as it requires detailed knowledge of the characteristics of the laboratory’s patient population, such as age, sex, and clinical service, and of how these may change over the day, week, or year.5
However, because the data come from the patients the laboratory actually serves, a high degree of customization is possible, allowing very sensitive detection of changes in analyzer results.5 Statistical parameters specific to each laboratory, such as the population mean, median, or standard deviation of patient results, are measured on each analytical platform.6 As long as the laboratory’s patient population does not change, any shift in these parameters, such as the population mean or standard deviation, reflects a change in assay performance.
The advantages of the technique are that patient samples are inherently commutable, it is inexpensive to run once set up, and it enables continuous monitoring.6 Professor Pratumvinit notes a study from one laboratory that introduced patient-based real-time quality control and saw a 75-85% reduction in the traditional QC material required, along with a 50% reduction in repeat analyses.7,8
Professor Pratumvinit identifies three steps for implementing the method: system setting, optimization, and alarm protocol.
The first step of system setting involves understanding the analytes. Professor Pratumvinit explains, “We have to know the distribution of our analytes in each day, or whether there is seasonal variation or any differences between specific clinics. We have to extract the data from at least one year to visualize the population in order to use it for statistical manipulation later.”
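As a sketch of this data-characterization step, the snippet below summarizes patient results by month to reveal any seasonal variation. The records, analyte, and values are hypothetical; in practice the data would be extracted from the laboratory information system.

```python
from statistics import mean, median, stdev

def monthly_summary(records):
    """Summarize (month, result) pairs to look for seasonal variation.

    records: iterable of (month 1-12, numeric result) tuples, e.g. one
    year of hypothetical sodium results (mmol/L) pulled from the LIS.
    Returns per-month count, mean, median, and standard deviation.
    """
    by_month = {}
    for month, value in records:
        by_month.setdefault(month, []).append(value)
    return {
        m: {"n": len(v),
            "mean": round(mean(v), 2),
            "median": median(v),
            "sd": round(stdev(v), 2)}
        for m, v in sorted(by_month.items()) if len(v) > 1
    }

# Hypothetical mini-dataset: two sodium results in January, two in February.
summary = monthly_summary([(1, 140.0), (1, 142.0), (2, 139.0), (2, 141.0)])
print(summary[1]["mean"])  # 141.0
```

A real implementation would also stratify by clinic, day of week, and time of day, since those population shifts are exactly what the quoted step is probing for.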
This also involves setting exclusion criteria by reviewing the available samples and deciding which to analyze. For example, those using nonparametric materials, or extreme patient results lying outside expected limits, are excluded from the calculation.
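A minimal illustration of one such exclusion rule is truncation: discarding extreme patient results before they enter the calculation. The limits below are hypothetical values for sodium (mmol/L), not recommended settings.

```python
# Hypothetical truncation limits for sodium (mmol/L) -- illustrative only.
TRUNCATION_LOW = 120.0
TRUNCATION_HIGH = 160.0

def apply_exclusions(results):
    """Keep only patient results inside the truncation limits,
    so extreme values do not distort the monitored statistics."""
    return [r for r in results if TRUNCATION_LOW <= r <= TRUNCATION_HIGH]

raw = [138.0, 141.0, 95.0, 139.5, 182.0, 140.2]
clean = apply_exclusions(raw)
print(clean)  # [138.0, 141.0, 139.5, 140.2]
```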
The next step is selecting the calculation algorithm for monitoring results. Here, consideration should be given to whether to use the mean or median, a weighted or exponentially weighted moving average, and batch or continuous mode. Laboratories also need to decide on the block size used in the system. “The block size means, for each dot that we put on the graph, how many patient results are in there? If the block size is too small, it can lead to faster error detection, but it might reduce the sensitivity and specificity of the results,” says Professor Pratumvinit. Defining the control limit comes next, so that if a result falls outside the set limit, laboratory staff are alerted to the issue.
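To make these choices concrete, here is a minimal sketch of one such algorithm: a continuous-mode exponentially weighted moving average (EWMA) over patient results. The weighting factor, target, and control limits are hypothetical illustrations, not recommended values.

```python
def ewma_monitor(results, lam=0.1, target=140.0, lower=138.5, upper=141.5):
    """Continuous-mode EWMA monitor over patient results.

    lam: EWMA weight given to each new result (hypothetical choice).
    target: expected population mean used to seed the average.
    lower/upper: hypothetical control limits.
    Returns the indices at which the EWMA breached a control limit.
    """
    flags = []
    ewma = target
    for i, value in enumerate(results):
        # Each new patient result nudges the running average.
        ewma = lam * value + (1 - lam) * ewma
        if not (lower <= ewma <= upper):
            flags.append(i)  # out-of-control point: staff should be alerted
    return flags

# In-control results keep the EWMA at target; a sustained shift to ~150
# pushes it past the upper limit within a couple of results.
print(ewma_monitor([140.0] * 30))             # []
print(ewma_monitor([140.0] * 30 + [150.0] * 30)[0])  # 31
```

A batch-mode equivalent would instead plot the mean of each completed block of, say, 20 results: a larger block smooths noise but delays detection, mirroring the block-size trade-off described above.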
The process of system setting can be quite complex, so where laboratories are unsure where to start, Professor Pratumvinit suggests using standard measures in line with the recommendations of the International Federation of Clinical Chemistry and Laboratory Medicine.8
Once the system has been set up, the next stage in the process is optimization. “The objective of optimization is to reduce false rejection and increase the detection rate,” says Professor Pratumvinit. “At first, when we put the settings in the software, it may cause high false rejection rates, and we will have to troubleshoot it unnecessarily, so we need to optimize it.” This can be done by trial and error, adjusting settings one at a time, such as the truncation limit or block size, to see how the system responds. However, this can be slow, and web-based programs are now available that can simulate the data and help laboratories find their optimal settings.
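The simulation idea can itself be sketched in code. The snippet below estimates the false-rejection rate of a simple block-mean monitor on simulated in-control data for a given block size; every parameter here is a hypothetical placeholder, and a laboratory would run such a simulation across candidate settings before committing to one.

```python
import random

def simulate_false_rejection(block_size, mean=140.0, sd=2.0,
                             limit_sds=3.0, n_blocks=500, seed=1):
    """Estimate the false-rejection rate of a block-mean monitor.

    Simulates in-control patient results (hypothetical Gaussian model:
    mean 140, SD 2) and counts how often a block mean falls outside
    control limits set at limit_sds standard errors from the mean.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    se = sd / block_size ** 0.5            # SD of a block mean
    lower = mean - limit_sds * se
    upper = mean + limit_sds * se
    rejections = 0
    for _ in range(n_blocks):
        block = [rng.gauss(mean, sd) for _ in range(block_size)]
        if not (lower <= sum(block) / block_size <= upper):
            rejections += 1
    return rejections / n_blocks

# Compare candidate block sizes on purely in-control data.
for size in (10, 20, 50):
    print(size, simulate_false_rejection(size))
```

Pairing this with a second simulation that injects a bias into the data would give the corresponding detection rate, so the two objectives, fewer false rejections and faster error detection, can be balanced.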
The final step in the process is to ensure that any error detected by patient-based real-time quality control is acknowledged and acted upon. When an error is flagged, the first step is to stop using the analyzer that raised the alarm and move the samples to a different analyzer. Laboratory staff must then determine whether an analytical error actually occurred before troubleshooting based on their findings.
Due in part to the complexity of setting up and validating patient-based real-time quality control techniques, widespread uptake has been slow.9 However, Professor Pratumvinit believes laboratories should consider patient-based real-time quality control as it “can augment and complement existing QC programs.” In her own laboratory, patient-based real-time quality control has helped bolster traditional QC and enabled analytical errors to be detected more quickly. Professor Pratumvinit concludes, “Overall it reduced time for error detection, reduced the cost of repeat testing, and eased staff workload.”
To hear more from Professor Pratumvinit about Siriraj Hospital’s experience with patient-based real-time quality control techniques in the lab, click here to watch the full presentation at RED 2024.