Laboratory operators understand that the quality of the research they conduct depends, at least in part, on the standard of the materials, methods and processes they follow. Even the most robustly conceived research project can be fatally undermined by a lack of attention to quality control during the execution of the study.
This will not come as a revelation to those in charge of controlled environments, most of whom will have invested significantly in the best available personnel and equipment to support their research. Even so, oversights can still happen, and a small overlooked variable can have a major impact on the outcome and replicability of a study.
Water purity is one of the most important of these variables, and labs need to be diligent about ensuring that the water they use in their experiments and analyses meets their required purity specifications, or else they run the risk of skewed results, damaged samples and additional costs. Only by investing in the right analytical tools can you ensure that the lab-grade water you are using meets the needs of a controlled environment.
The need for reliable water purification
Water is one of the most ubiquitous reagents used in modern laboratory processes, with the vast majority of experiments and research projects carried out in the life science sector relying on a supply of purified water to serve multiple roles and functions.
Lab-grade water acts as a universal solvent for a wide range of substances, as well as the basis for a whole host of biochemical reactions that can only take place in aqueous settings. It is also used to clean key pieces of glassware and other equipment that must be kept free of impurities and contaminants, so that residues do not influence research results.
Laboratory water can be categorised into the following four types:
Tap water, also known as raw water or potable water, is the most basic form: it has not been treated or purified for laboratory use, and is therefore likely to contain a wide range of contaminants and impurities, from dissolved ions, salts and minerals to stray microorganisms and organic compounds. Tap water is not used directly in experiments; it serves only as the feed water for producing laboratory-grade water.
Type III water, also known as primary-grade or RO water, is purified using carbon filtration and reverse osmosis (RO), which remove up to 99% of the contaminants in the feed water. It is generally used for autoclave feeds, glassware cleaning and media preparation, or to feed Type I lab water systems.
Type II water, known as deionised or general laboratory-grade water, is refined further with a combination of RO and ion exchange to a resistivity of 1-15 MΩ·cm at 25 °C. This is pure enough for applications such as solution preparation, microbiological analysis, electrochemistry and general spectrophotometry.
Type I water, also known as ultra-clean or ultrapure water, offers the highest purity, with a resistivity of 18.2 MΩ·cm at 25 °C, achieved through ultrafiltration and UV treatment to eliminate bacteria and organic contaminants. This ultrapure water is needed for highly sensitive applications such as cell and tissue culture, high performance liquid chromatography (HPLC), mass spectrometry and molecular biology.
Identifying and using the right type of water at the correct level of purity is essential for laboratories to deliver high-quality, reliable results. The methods used to filter and purify water supplies used for these highly specific applications also need to be carefully analysed, quality-checked and verified to make sure that the required standards are being met.
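As a simple illustration of how such a purity check might be automated, the Python sketch below assigns one of the grades described above based on a resistivity reading. The function name and thresholds are assumptions drawn from the figures quoted in this article rather than a standard implementation; real acceptance criteria should come from your own lab's specifications (for example ASTM D1193 or ISO 3696).

```python
# Minimal sketch: classify a purified-water sample by its resistivity reading.
# The thresholds follow the figures quoted above (18.2 MΩ·cm for Type I,
# 1-15 MΩ·cm for Type II); anything below is treated as Type III/feed water
# purely for illustration.

def classify_water(resistivity_mohm_cm: float) -> str:
    """Return an indicative purity grade for a resistivity reading at 25 °C."""
    if resistivity_mohm_cm >= 18.0:
        return "Type I (ultrapure)"
    if resistivity_mohm_cm >= 1.0:
        return "Type II (general laboratory grade)"
    return "Type III / feed water - not suitable for sensitive applications"


if __name__ == "__main__":
    for reading in (18.2, 5.0, 0.3):
        print(f"{reading:5.1f} MΩ·cm -> {classify_water(reading)}")
```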
Water impurity impacts
Failing to maintain the proper water purity standards within a controlled laboratory environment can have serious consequences, with the potential to significantly undermine the quality of your results or render an entire experiment worthless. It can also impose significant costs on laboratory operators over time.
Here are the main reasons why it is so important to make sure you are using properly purified water in your laboratory processes:
Poor water quality will skew your data - laboratory experiments and analytical processes are often highly dependent on measuring biological and chemical processes to a precise degree of sensitivity, including detecting impurities in a sample at trace and ultra-trace concentrations. If the water used in the process introduces additional impurities, this could dramatically skew the accuracy of the final results, leading to misleading conclusions and results that cannot be reproduced.
Impure water can damage your samples and equipment - the introduction of additional chemicals, organic agents and other contaminants to your laboratory processes can cause real damage. You could unwittingly end up contaminating a cell culture, fouling media or interfering with microbial growth, while chemicals found within the water could also interfere with or cause damage to highly sensitive lab equipment, such as chromatographs and spectrophotometers. As such, using the wrong type of water can have a lasting impact on your lab's performance.
Having to repeat processes due to impure water is expensive - even if significant damage is avoided, repeating experimental work because of water impurities is costly. Not only does it consume unnecessary time and energy, but media must be replaced at a faster rate and equipment wears down more quickly; it can also generate additional hazardous waste, making this an environmental issue as well as a matter of cost-effectiveness.
Given the importance of the research being conducted by the laboratory sector, the delays that this can cause will have real-world consequences. Problems with water impurity can slow down essential research into climate change or delay the development of lifesaving drugs. After a year in which the importance of efficient medical research processes has come into the global spotlight like never before, it has never been more crucial to ensure that all labs are following best practices when it comes to water purity.
The right analytical tools
A wide range of tools and methods is available for purifying water for use in controlled environments, from traditional distillation to microfiltration and ultrafiltration, UV irradiation, deionisation and reverse osmosis. All of these processes have been proven effective - but it is still necessary to perform quality checks to confirm that the water has been purified to the required standard.
It is therefore essential for laboratories to also invest in quality assurance methods that can analyse the composition of the water they use in a way that is efficient, reliable and authoritative. This is where total organic carbon (TOC) analysers come into play: they are among the best-established systems for screening water samples from multiple sources and provide an excellent non-specific indicator of water quality.
By assessing carbon concentrations within a water sample, TOC analysers can detect chemical impurities, organic substances and other contaminants down to trace levels, allowing users to verify the cleaning efficiency and prevent the use of insufficiently purified water in key lab processes. TOC analysers can be configured for a wide range of desired sensitivity levels, with the most sensitive tools suitable for analysing ultrapure process water with very low carbon concentrations.
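To illustrate how TOC readings might be screened before water is released for use, the sketch below flags any sampling point whose reported TOC exceeds a configured limit. The sample points, readings and 50 ppb limit are hypothetical placeholders rather than values from any particular analyser or standard; in practice the acceptance limit would be set by the application and the relevant internal or pharmacopoeial specification.

```python
# Minimal sketch: screen TOC readings (in ppb) from several sampling points
# against an acceptance limit before the water is released for use.
# The limit and sample data below are hypothetical placeholders.

from dataclasses import dataclass


@dataclass
class TocReading:
    sample_point: str
    toc_ppb: float


def screen_samples(readings: list[TocReading], limit_ppb: float) -> list[str]:
    """Return the sample points whose TOC exceeds the acceptance limit."""
    return [r.sample_point for r in readings if r.toc_ppb > limit_ppb]


if __name__ == "__main__":
    batch = [
        TocReading("polisher outlet", 2.1),
        TocReading("bench tap A", 48.0),
        TocReading("storage tank", 310.0),
    ]
    failures = screen_samples(batch, limit_ppb=50.0)
    if failures:
        print("Re-purify before use:", ", ".join(failures))
    else:
        print("All sample points within the TOC limit.")
```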
TOC analysers have been widely used by laboratories for many years, and their performance and efficiency for these purposes have been verified in countless real-world applications. Moreover, their development and design continue to evolve, with the latest systems offering highly automated operation and the ability to analyse dozens of samples at a time with little manual input or maintenance required.
By making full use of TOC analysis methods, laboratories will be able to guarantee that they are using only the purest water samples in their high-sensitivity chemical analyses. This, in turn, will help them improve the accuracy and efficiency of their research - and, ultimately, deliver key research breakthroughs for the public good. In this respect, water purity may seem like a small variable, measured in minuscule increments, but by getting this right, laboratories can make a big difference.