Risks of Environmental Monitoring: Microbes aren’t your only threat

Published: 11-May-2020

Microgenetics has released a new whitepaper explaining the risks of environmental monitoring when using a legacy system

In the highly regulated world of pharmaceutical manufacturing, environmental monitoring is essential for patient safety. However, the process is not without risks of its own; in labs using paper and Excel or other legacy systems, even your best efforts can leave your lab open to risk, with consequences ranging from heavy fines to something as priceless as a patient's life.

The process

If you are still using legacy systems, your monitoring process may look something like figure 1. If this is the case, each stage presents different challenges. Let's start at stage 3, where your plates have been moved from your cleanroom to the lab and your data will be transcribed - often onto a paper record.

If you are familiar with the ALCOA+ principles of data integrity - that data should be attributable, legible, contemporaneous, original and accurate, and then complete, consistent, enduring and available for the + - then you might have already spotted the issue. By transcribing onto paper, your record is not original, as it will have to be transcribed again into your system. The record is also open to human error, risking inaccurate data, and if handwriting is poor or rushed, it may not be legible. Paper can easily be lost or misplaced and isn't a permanent record. You can see where we are going with this - even at this early stage, the integrity of your data can already be questioned.
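To make this concrete, here is a minimal sketch (in Python, with invented field names - it is not taken from the whitepaper or any particular system) of the attributes an electronic record can capture at the point of entry, which a paper note and later re-transcription tend to lose:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical sketch only - field names are invented for illustration.
@dataclass(frozen=True)          # frozen: the original entry cannot be silently edited
class MonitoringRecord:
    sample_id: str               # unique identifier for the plate (Original, Consistent)
    recorded_by: str             # who made the entry (Attributable)
    recorded_at: datetime        # captured at the moment of entry (Contemporaneous)
    location: str                # cleanroom or sample point
    observation: str             # the result as first captured (Accurate, Legible)

def new_record(sample_id: str, recorded_by: str, location: str, observation: str) -> MonitoringRecord:
    # The timestamp is taken when the record is created, not added during a later transcription.
    return MonitoringRecord(sample_id, recorded_by, datetime.now(timezone.utc), location, observation)
```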

Once your data is recorded, your plates will be incubated and then read. At this stage there are two main risks: firstly, a counting risk, especially where growths are small, non-contrasting or occluded. This is particularly problematic if the count is near a threshold; for example, mis-counting a single CFU in a grade A environment could mean a limit breach is missed. Secondly, there is the risk of limit confusion.

An under-pressure person trying to recall the settle plate alert limit for a particular room can make mistakes - especially at the end of a long shift. The good news is that some of this risk can be reduced by a second check, which catches discrepancies, and by photographing plates to give an enduring audit trail.
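As a sketch of how limit confusion can be designed out (the grades, limit values and function names below are placeholders for illustration, not regulatory figures), limits are looked up per location grade instead of being recalled from memory, and a second analyst's count can be compared automatically:

```python
# Illustrative only: settle plate limits keyed by cleanroom grade. The figures are
# placeholders - real limits come from your own EM programme and the applicable
# GMP requirements.
SETTLE_PLATE_LIMITS = {
    "Grade B": {"alert": 3, "action": 5},
    "Grade C": {"alert": 25, "action": 50},
}

def classify_count(grade: str, cfu_count: int) -> str:
    """Return 'action', 'alert' or 'pass' by looking the limit up rather than recalling it."""
    limits = SETTLE_PLATE_LIMITS[grade]
    if cfu_count >= limits["action"]:
        return "action"
    if cfu_count >= limits["alert"]:
        return "alert"
    return "pass"

def second_check(grade: str, first_count: int, second_count: int) -> bool:
    """Compare two analysts' counts by their classification, flagging any disagreement."""
    return classify_count(grade, first_count) == classify_count(grade, second_count)
```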

Once you have counted the colonies, you will need to record the data, which carries the same ALCOA+ risks as the first transcription stage, with the additional risk of association errors if a count is recorded against the wrong sample record. This highlights the problem of not using the original record and unique identifiers, which would otherwise make these multiple transcription stages unnecessary.
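A minimal sketch of that idea, assuming each plate carries a unique identifier (the IDs and data structure below are invented for illustration): results are only accepted against a registered sample, so a count cannot silently attach to the wrong record.

```python
# Hypothetical illustration of association by unique identifier. A count can only
# be recorded against a sample that was registered when the plate was placed.
registered_samples: dict[str, dict] = {
    "PLT-000123": {"location": "Filling room 1", "count": None},
}

def record_count(sample_id: str, cfu_count: int) -> None:
    if sample_id not in registered_samples:
        # Refuse rather than guess - this is the mis-association risk described above.
        raise KeyError(f"Unknown sample ID {sample_id!r}; check the plate label.")
    registered_samples[sample_id]["count"] = cfu_count

record_count("PLT-000123", 2)   # accepted
# record_count("PLT-000124", 2) # would raise KeyError - the wrong record is never updated
```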

Once you have recorded your data, you may then need to carry out an investigation. Investigations should be thorough and identify the root cause of contamination promptly. However, this can be a challenge for legacy labs, which lack easy access to the correct information or the tools to analyse it. As a result, investigations may be delayed or lead to inadequate action; at the extreme, the underlying event(s) may have passed and conditions changed by the time a conclusion is reached, missing the opportunity for proactive control.

Once you have investigated, your findings will need to be transcribed from your paper records into Excel or a similar system. You might think the biggest danger here is unmotivated staff, given the monotonous nature of data entry, but if you consider the ALCOA+ principles, you will see that at this stage every principle is broken; for example, the data is not attributable to the individual who first recorded it, nor does it correspond with the date it was first recorded, so it is not contemporaneous.

Multiple transcription stages decrease the accuracy of data, and in Excel a true audit trail cannot be created.
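For contrast, the sketch below shows roughly what a true audit trail captures (a hypothetical example, not a description of any particular product): each change is appended with the who, what, when and previous value, which a plain spreadsheet does not record by default.

```python
from datetime import datetime, timezone

# Hedged sketch of an append-only audit trail: every change is a new entry
# recording who changed what, when, and the previous value - nothing is overwritten.
audit_trail: list[dict] = []

def log_change(sample_id: str, field: str, old_value, new_value, user: str) -> None:
    audit_trail.append({
        "sample_id": sample_id,
        "field": field,
        "old_value": old_value,
        "new_value": new_value,
        "changed_by": user,
        "changed_at": datetime.now(timezone.utc).isoformat(),
    })

log_change("PLT-000123", "count", 2, 3, "second.checker")
```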

Once you have transcribed your data, you will then analyse the results in the trending stage. This relies on staff being able to identify and understand trends and being competent with Excel or whichever system is used - not many microbiologists are data scientists too! The tool itself also has limitations, for example in predicting future trends, leaving the lab reactive rather than proactive. And if a trend is identified, it can be difficult to dive below the surface to identify the contributing risk factors.
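To give a flavour of what even basic trending involves, here is a minimal sketch (the counts and the simple rising-points rule are invented for illustration, not a validated trending method):

```python
# Illustrative trend check on weekly CFU counts for one location.
weekly_counts = [0, 1, 0, 2, 3, 4]

def rising_trend(counts: list[int], points: int = 3) -> bool:
    """Flag a basic adverse trend: the last `points` results each higher than the one before."""
    if len(counts) < points + 1:
        return False
    recent = counts[-(points + 1):]
    return all(later > earlier for earlier, later in zip(recent, recent[1:]))

print(rising_trend(weekly_counts))  # True - the last three results each increased
```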

The consequences

When you consider the risks discussed above, you may feel it is hard to trust the data. But what are the consequences? Internally, QC labs are sometimes seen by the wider business as adding little value. Limited insight and responsiveness to business needs, caused by information not being readily available, contribute to this view, resulting in pressure from senior executives to improve performance. Many Quality professionals consider economic performance a goal; however, for labs using legacy systems with multiple time-consuming transcription stages, this will prove a difficult goal to achieve.

Externally, the consequences may be more severe. With the number of warning letters and statements of non-compliance with GMP citing data integrity alone on the rise, labs face increasing external scrutiny, heavy fines and, in some circumstances, temporary suspension or even removal of their manufacturing licence. Ultimately, though, the consequences stretch far beyond the impact on the business; by being unable to identify breaches and react to trends, patient safety is put at risk. In extreme circumstances, contamination could result in a patient becoming seriously ill, or even dying. Minimising errors and risks is therefore of the utmost importance.

If you are wondering what you can do to reduce these risks, or would like further detail, click here to read the full whitepaper.
