Endotoxins, those fever-inducing and sometimes life-threatening compounds, have haunted the pharmaceutical industry since their discovery more than a century ago.
These potent substances, lipopolysaccharides (LPS) from the outer membranes of Gram-negative bacteria, are notorious for triggering severe reactions such as fever and endotoxic shock when introduced into the human body. Their presence in drugs, medical devices, or laboratory consumables demands thorough screening to ensure safety.
Over the years, testing methods have evolved dramatically, moving from crude animal-based tests to cutting-edge synthetic alternatives that promise a future with less reliance on animals and more sustainability.
Endotoxin testing, in its earliest days, looked quite different from today’s sophisticated methods. Back in 1912, scientists E.C. Hort and W.J. Penfold pioneered the first approach, using live rabbits to monitor the onset of fever. It wasn’t until roughly a decade later, thanks to the work of Florence Seibert, that these fevers were definitively linked to bacterial contamination.
By 1942, this basic idea had evolved into a more formalised technique—the Rabbit Pyrogen Test (RPT).
In this test, rabbits were injected with sterile solutions, and their temperatures were measured at timed intervals to detect potential pyrogens. This test dominated the field for over four decades, becoming the gold standard for assessing the safety of intravenous fluids and medical devices.
The 1960s brought a breakthrough that changed the game forever.