
Category: Applied and Industrial Microbiology; Clinical Microbiology
Biological Safety: Principles and Practices is now available on Wiley.com. Members, use the code ASM20 at checkout to receive your 20% discount.
Now in its fifth edition, Biological Safety: Principles and Practices (Feb. 2017) remains the most comprehensive biosafety reference. A resource for biosafety professionals, instructors, and those who work with pathogenic agents in any capacity, Biological Safety is also a critical reference for laboratory managers and those responsible for managing biohazards in a range of settings.
Hardcover, 725 pages, full-color illustrations, index.
“The 1990s have been marked by a renewed recognition that our human species is still locked in a Darwinian struggle with our microbial and viral predators.” Although this unreferenced quotation was made by Nobel Laureate Joshua Lederberg, as he was discussing the acquired immunodeficiency syndrome (AIDS) and multidrug-resistant Mycobacterium tuberculosis epidemics that emerged in the early 1990s, his comment could also apply to almost any infectious disease process that has occurred since the recognition of the germ theory of disease in the late 1880s. For as we journey through the 21st century, and despite the advances of modern medicine and the continual development of new vaccines and anti-infective therapeutic agents, the human species continues to battle microbial predators in this Darwinian struggle for survival.
Laboratory animals have played a major role in advancing biomedical research and will continue to be important for identifying fundamental mechanisms of disease and exploring the efficacy and safety of novel therapies. The health status of these animals can have a direct impact on the validity and value of the research results, as well as on the health and safety of those who work with them. Husbandry practices are established in reputable laboratories to protect both the animals and the personnel that work with them.
Cross-kingdom or interkingdom pathogenic microorganisms are now recognized more commonly (1, 2). An increasing number of organisms, and occasionally even the same strains of an organism, can colonize and/or infect both plants and humans. Pathogenic microorganisms with cross-kingdom host ranges may have been overlooked for several reasons. Taxonomic name changes may have resulted in misidentification, and different names may have been used to describe the same microorganism (2, 3). A primary reason for this failure to recognize nontraditional pathogenic microorganisms may be that, until recently, little was known about the ability of some plant pathogens to grow at the temperatures of the body's extremities and external surfaces, which may be as low as 33.2°C (4). Most fungi have an upper thermal growth tolerance at or below 37°C (5). Fungi associated with human disease are speculated to have originated from asymptomatic or diseased plants (6). The innate body temperature of mammals is therefore a potent nonspecific defense against most fungi (5) and bacteria. As to the classification of cross-kingdom microorganisms, Hubalek (7) suggested that emerging human infectious diseases could be grouped into those transmissible between humans (anthroponoses), those transmissible from animals to humans (zoonoses), and those transmissible to humans from the environment (sapronoses). However, Hubalek defined sapronoses as diseases having an environmental reservoir (organic matter, soil, and plants), whereas other treatises define sapronoses as diseases whose source is only an abiotic substrate (nonliving environment). Hubalek (7) did not address diseases of humans transmissible from plants; the term “phytoses” has been used in presentations by the Centers for Disease Control and Prevention (CDC) (R. V. Tauxe, personal communication), although “phytonoses” would be more consistent with Hubalek's classification system (7).
Publications on laboratory-associated infections (LAIs) provide critical information for prevention strategies. The review of actual case studies illustrates the importance of adhering to biosafety protocols and may trigger changes in laboratory procedures. Singh has stated that it is time for a centralized system for reporting, analyzing, and communicating “lessons learned” about LAIs to be developed (1). Surveys on some subsets of laboratory workers, and case reports on individual LAIs, have been published; however, without a centralized system, it is impossible to assess the true incidence of LAIs. In addition, the underreporting of such infections is widely acknowledged due to fear of reprisal and the stigma associated with such events (2).
Biological risk assessment is a challenging process because the variables cannot always be measured quantitatively and subjective judgments must often be made. There is a complex interaction between agents, activities, and people in a constantly changing environment. Work with biohazardous agents, or with materials suspected of containing such agents, must be assessed for the risk it poses to the individual, the community, and the environment. A biohazardous agent is an infectious agent or other substance produced by a living organism that causes disease in another living organism. Whether the work is performed at a research, clinical, teaching, or large-scale production facility, a risk assessment should be performed to provide the information needed to eliminate risk or reduce it to an acceptable level. The assessment of risk needs to be carried out by knowledgeable people using professional judgment and common sense. By using valid information about the specific agent and taking into account any additional risks posed by the specific procedures and equipment, the evaluator should be able to identify the most appropriate work practices, personal protective equipment, and facilities to protect people and the environment. A risk assessment should be done before work begins and should be repeated when changes are to be made in agents, practices, employees, or facilities. The risk assessment for work with biohazardous agents must take into account not only the agent but also the host and environment. This chapter focuses on agent- and activity-based risk assessments for general work not involving Select Agents. Host factors are addressed briefly, but they are more appropriately covered by occupational medicine.
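The agent- and activity-based logic described above can be sketched in code. The following is a minimal, hypothetical illustration only: the function name, the numeric risk-group input, and the one-level escalation rule are assumptions chosen for illustration, not a rule from this book, and no such lookup can substitute for a full assessment by knowledgeable people using professional judgment.

```python
# Hypothetical sketch of combining an agent-based factor (risk group 1-4)
# with an activity-based factor (aerosol-generating potential of the
# planned procedures) into a suggested containment level for planning.
# Real assessments weigh many more variables: host susceptibility,
# agent concentration and volume, routes of exposure, and facilities.

def suggest_containment(risk_group: int, aerosol_potential: str) -> int:
    """Return a suggested containment level for preliminary planning.

    risk_group: agent risk group, 1 (lowest hazard) through 4 (highest).
    aerosol_potential: "low" or "high" for the planned procedures.
    """
    if risk_group not in (1, 2, 3, 4):
        raise ValueError("risk group must be 1, 2, 3, or 4")
    if aerosol_potential not in ("low", "high"):
        raise ValueError("aerosol potential must be 'low' or 'high'")
    # Start from the agent-based hazard, then escalate one level when
    # the activity has high aerosol-generating potential (assumption).
    level = risk_group
    if aerosol_potential == "high" and level < 4:
        level += 1
    return level
```

The point of the sketch is the structure of the reasoning, repeated whenever agents, practices, employees, or facilities change: the agent sets a baseline, and the specific procedures can raise the required protections above it.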
Parasitic diseases have public health and clinical importance throughout the world, not just in developing countries. Some parasitic diseases are endemic globally (e.g., toxoplasmosis and cryptosporidiosis), and even those that are endemic primarily in developing countries or in the tropics and subtropics are receiving increasing attention in developed countries, in part because of their importance in returning travelers and immigrants. As clinical interest in and laboratory research about parasitic diseases increase and as the numbers of infected patients increase, so do the numbers of persons working in settings in which they could be exposed to parasites.
The number of fungal species is conservatively estimated to be 1.5 million, and at least 98,000 have been described formally (1). Although more than 300 of these are documented as causing disease in humans, only about 100 are encountered regularly as pathogens of humans. Virulence among these fungi varies, as do the entry portals through which they cause disease in the host and the manner in which they subsequently could spread. These various differences provide a convenient basis for broadly categorizing the mycoses, and they also help in delineating biosafety measures needed for the safe handling and storage of the fungi involved.
Bacterial endotoxins are heat-stable components of the Gram-negative outer membrane that are released after a bacterial cell lyses and are also continuously shed by viable bacteria. Endotoxins cause systemic inflammatory responses and are associated with sepsis as well as chronic inflammation (1). The outer membrane of Gram-negative bacteria is covered up to 75% by lipopolysaccharides (LPSs); the remainder consists of proteins that primarily serve as channels for the entry and exit of molecules or as structures that mediate interactions with the environment (2). An LPS is composed of three parts—lipid A, core polysaccharide, and a glycan, typically O antigen. Lipid A is the ligand that stimulates inflammation by binding to receptors on the host cell surface as well as intracellular membranes and cytoplasm (3). Many bacterial pathogens, such as Francisella, Yersinia, Brucella, and Coxiella, synthesize an alternative form of lipid A that is less stimulatory in the mammalian host and significantly dampens the early host inflammatory responses, thus facilitating the establishment of disease.
This chapter discusses the unique hazards of handling biosamples that contain viruses and the illnesses that can result from occupational exposure and infection. Basic concepts of virology and clinical syndromes of viral infections are reviewed, with specific attention dedicated to viral pathogens associated with a risk of work-acquired infections. The last section addresses laboratory safety challenges presented by newer biotechnologies. Armed with the knowledge presented in this chapter, the laboratorian will be able to define a risk management matrix based on the risk of a virus in a specimen from a patient manifesting a specific illness, the types of procedures used in the laboratory, and the potential outcome of a laboratory-acquired infection (LAI).
Accompanying the development of molecular biological tools for identifying gene sequences and functions has been the development of novel gene transfer vectors for shuttling gene sequences between different organisms. As illustrated in Fig. 1, a search of the PubMed database reveals a large number of publications describing the application of frequently used recombinant viruses. The use of these vector systems has continued to increase significantly since the publication of the 4th edition of this book in 2006. This increase has been driven by several factors, including the proliferation of gene therapy protocols (Fig. 2), the increased recognition of the general utility of viral vectors as gene transfer agents for elucidating and studying gene function, and the increased commercial availability of the reagents required to produce these viral vectors. Additionally, our understanding of the molecular biology of a wide variety of virus families has increased to the point where novel chimeric viruses, combining the unique properties of two or more viruses, are being constructed routinely. Due to their unique nature, these viruses pose special challenges in risk assessment.
Biological toxins are poisonous by-products of microorganisms, plants, and animals that produce adverse clinical effects in humans, animals, or plants. A toxin is defined as “a poisonous substance that is a specific product of the metabolic activities of a living organism and is usually very unstable, notably toxic when introduced into the tissues, and typically capable of inducing antibody formation” (Merriam-Webster OnLine, http://www.merriam-webster.com/dictionary/toxin). Biological toxins include metabolites of living organisms, degradation products of dead organisms, and materials rendered toxic by the metabolic activity of microorganisms. Some toxins can also be produced by bacterial or fungal fermentation, the use of recombinant DNA technology, or chemical synthesis of low-molecular-weight toxins. Because they exert their adverse health effects through intoxication, the toxic effect is analogous to chemical poisoning rather than to a traditional biological infection.
Molecular agents are some of the most challenging types of agents encountered in biosafety in terms of assessing risk and determining the appropriate containment levels. Molecular agents are often found on the cutting edge of science. As such, there are many unknowns, and they do not fit into discrete risk group categories. This chapter discusses some of the major categories of molecular agents that may be encountered in biological safety. Various types of nucleic acids are described, including recombinant, nonrecombinant, oncogenic, and pathogenic DNA. Synthetic, naked, and free nucleic acids are also discussed. Gene transfer techniques are mentioned, and a review of RNA technologies is provided, including a discussion of different types of RNA interference, such as small interfering RNA (siRNA) and microRNA (miRNA). Innovative molecular tools for genome editing and cancer immunotherapy are described, namely zinc finger nucleases, transcription activator-like effector nucleases (TALENs), meganucleases, clustered regularly interspaced short palindromic repeats (CRISPRs), chimeric antigen receptors (CARs), and engineered T-cell receptors (TCRs). Nanotechnology and how this new field relates to biosafety are discussed. Finally, a review of biosafety issues related to prion diseases is given. For each of the topics, a general description of the technology is provided and biosafety considerations for risk assessment and containment are presented.
For some pathogenic microorganisms, the airborne route is the predominant means of transmission to humans. These agents, which may be transmitted from humans, animals, and the environment, i.e., soil and water, include certain pathogenic viruses, bacteria, and fungi. Although certain species of fungi and mycobacteria share the airborne route of transmission, they are very different in substantive elements of transmission, including their natural habitats and reservoirs. It is only with a clear knowledge of these differences that the laboratorian can adequately perform a risk assessment and implement appropriate safety protocols to abate those hazards in the laboratory.
Animal cells have been used in biotechnology since the early 1950s. The Salk polio vaccine, licensed in 1954, was the first product produced on animal cells as a substrate, and for many years the only products produced using animal cells were viral vaccines. Primary animal cells were used for many years for vaccine production and are still used in certain cases. These vaccines have generally proven to be acceptable and safe, but there are notable exceptions that have directed manufacturers and regulatory bodies to be very cautious in their assessment of new cell substrates. The earliest cell lines used to manufacture biological products were human diploid fibroblast finite cell lines. WI-38 and MRC-5, two of the best-known examples, have been used in the manufacture of a number of licensed products (Table 1). The early use of continuous cell lines (CCLs) for the manufacture of biological products is represented by the manufacture of foot-and-mouth disease vaccine in the Syrian hamster cell line BHK, the production of interferon from the B-lymphoblastoid cell line Namalwa, and the introduction of monoclonal antibodies from hybridoma cells in the 1990s (for a general reference, see reference 1).
Allergy to laboratory animals is a significant occupational hazard and among the most common conditions affecting the health of workers involved in the care and use of research animals. At least 90,000 workers in the United States have direct contact with animals in research or industrial facilities, although some sources estimate 40,000 to 125,000 (1, 2). Workers who are in regular contact with furred animals often develop sensitivity to these animals. This sensitivity accounts for the high prevalence of laboratory animal allergy (LAA) in animal workers, and cross-sectional studies have estimated that as many as 44% of individuals working with laboratory animals report work-related symptoms (3, 4). Veterinarians are also at risk, with similar rates of symptomatic allergy (5). Of these symptomatic workers, up to 25% may eventually develop occupational asthma that persists even after the exposure ceases (6). This high prevalence rate has major medical and economic implications. When employees develop LAA, it often results in significant morbidity, at times necessitating a change in occupation. In addition, it may lead to reduced productivity, increased workloads for others, and increased health and worker's compensation costs for the employer. With increased awareness, surveillance, and monitoring, studies have documented a decline in occupational asthma over the last 25 years (7), but recent work suggests continued vigilance is necessary (8). Familiarity with LAA, including its clinical characteristics, etiology, pathophysiology, treatment, and preventative measures, can be vital in reducing the economic and physical impact of this important occupational hazard.
World circumstances have changed the small, simple biocontainment facilities of the past into larger, more complex facilities that require difficult design decisions. The design of biomedical research laboratories, particularly biocontainment laboratories, is an exercise in making choices, often between competing ideas and needs. There is no single way to design any laboratory; this chapter therefore provides both laboratory users and designers with relevant information to assist in making choices appropriate to the needs of specific projects. If the architect and engineers make decisions without local input and informed consent, it is unlikely that the completed laboratory will be satisfactory. However, if the potential users become an active, integral part of the process and an experienced design team is engaged, the facility will likely meet current needs and future requirements. Competent professional assistance is a necessity in this design process.
Primary barriers are both techniques and equipment that guard against the release of biological material; they may also be referred to as primary containment. In general, they provide a physical barrier between the worker and/or the environment and the hazardous material. Primary barriers range from a basic laboratory coat to a biological safety cabinet (BSC). This chapter addresses some of the more common primary containment devices and personal protective equipment (PPE) and a variety of equipment-associated hazards. Other chapters cover respiratory protection, work practices, and BSCs, which are more specific examples of primary containment.
Deadly disease outbreaks have become a frequent occurrence throughout the world, with infectious agents such as human immunodeficiency virus (HIV), hepatitis B virus, severe acute respiratory syndrome-related coronavirus (SARS-CoV), Middle East respiratory syndrome coronavirus (MERS-CoV), Nipah virus of Malaysia, Hendra virus of Australia, hantavirus in the United States, and most recently, the Ebola virus in Africa (1). Additionally, new laboratory techniques have become reliant on the use of infectious agents for common procedures. Lentivirus, adenovirus, vaccinia virus, Escherichia coli, and human cancer cells are frequently found in research laboratories worldwide. With the increased exposure to these biological agents comes a greater risk of developing a laboratory-associated infection (LAI). Previous studies have documented 5,527 LAIs from 1930 to 2004, with 204 of these resulting in death (2–4). A recent study of all LAIs from 1976 to 2010 found that 197 cases due to exposure specifically to recombinant DNA-based materials were reported to the National Institutes of Health (NIH) (5). Unfortunately, most LAIs (82%) cannot be traced to a single incident to determine the cause of exposure (3, 5–7). Although good sterile and aseptic techniques are critical, virtually every activity in the laboratory gives rise to aerosols (8–10). Aerosols containing infectious agents, compounded by contact spread (11), could create an epidemic before any symptoms present. This underscores the need for protection from these agents, such as the use of proper aseptic technique, personal protective equipment (PPE), and appropriate primary barriers.
The recent emergence and reemergence of arthropod-borne viruses (arboviruses) such as chikungunya and Zika viruses, which are transmitted by mosquitoes, highlights the need to increase the capacity to conduct research on these pathogens and the vectors involved in their transmission cycles. Over the past two decades, several arboviruses have been introduced into the United States, underscoring the need for more facilities and researchers to study these viruses. The introduction of West Nile virus (WNV) into the United States in 1999 (1–4) revealed a lack of suitably trained entomologists/virologists who could conduct the critical surveillance operations and fieldwork essential for effective targeting of vector control programs. It also exposed the erosion of training and educational material for entomology related to public health (5). Following the introduction of WNV, the capacity for surveillance and research increased in some areas; however, continued support for these programs is needed to quickly control new introductions of diseases transmitted by arthropods.
The control of microbial aerosols is the major driver in the design of microbiological containment laboratories. The provision of a negative-pressure laboratory area with a high-efficiency particulate air (HEPA)-filtered exhaust ventilation system is intended to prevent the escape of infectious aerosols from the facility. The use of directional airflow within open-fronted safety cabinetry is designed to prevent the release of aerosols from the working area of the cabinets. Class III safety cabinets and isolator systems provide physical barriers between the operator and the activity while maintaining negative pressure and high airflows, with HEPA filtration to prevent the release of aerosols. As a last resort, respiratory protection is used to prevent the exposed worker from inhaling the infectious agent. Yet the average microbiologist may have only a limited understanding of the processes that generate aerosols in the laboratory and little knowledge of how effective the preventative equipment and processes are.
Respiratory protection is used when workplace air is unsuitable for breathing due to lack of oxygen or unsafe levels of contaminants. Respirators are designated as a last resort or temporary control measure to help reduce contaminant exposures in the workplace to acceptable levels or provide sufficient oxygen for breathing. In accordance with the industrial hygiene hierarchy of controls, available engineering and administrative controls should be implemented before considering personal respiratory protection as a control measure. When necessary, only respirators certified by the National Institute for Occupational Safety and Health (NIOSH) should be used in the United States. A full respiratory protection program administered by a trained individual as specified by the Occupational Safety and Health Administration (OSHA) must accompany any use of respirators in the workplace. A respiratory protection program is necessary to ensure the safe and proper use of respirators and to help prevent misuse, injury, or death among respirator users. Important components of a program include written standard operating procedures (SOPs), medical evaluation, user training, respirator maintenance procedures, and properly fitting the respirator to the user. The program must have a designated and knowledgeable administrator, preferably someone trained in a field of occupational health and safety.
A broad range of infectious agents can be found in the blood at different stages of infection in humans. Most agents are present at high levels only for a brief period (i.e., the septicemic phase), are rarely transmitted by blood, and therefore are not usually categorized as “blood-borne” pathogens. Some agents, particularly viruses that induce a latent phase or long-term carrier state, can be transmitted to other humans through blood or body fluid contact. The three most common examples of viruses that establish long-term carrier states, frequently as asymptomatic infections, are human immunodeficiency virus type 1 (HIV-1), hepatitis B virus (HBV), and hepatitis C virus (HCV). Occupational infections with these blood-borne pathogens have been documented globally and can occur when blood or body fluids containing these agents are transferred directly to the worker, e.g., through needlestick exposures to contaminated needles or through blood or body fluid contact with mucous membranes or nonintact skin. The study of how these infections occur provides insight into the risks associated with other blood-borne pathogens that may be present in a carrier state in the blood, such as Plasmodium (malaria), West Nile virus, Treponema pallidum (syphilis), or viral hemorrhagic fever viruses (e.g., Ebola).
To protect laboratory workers, the general public, and the environment, laboratories use a combination of work practices and engineering controls, including decontamination strategies for work surfaces, items, and spaces within the laboratory, to mitigate the risk of releasing infectious agents. “Decontamination” is a general term that usually refers to a process that makes an item safe to handle, or a space safe to occupy, and can include processes ranging from simple cleaning with soap and water to sterilization. This chapter discusses the factors necessary for environmentally mediated transmission of infection to occur and methods for decontamination (which include cleaning, disinfection, and sterilization). Emphasis is placed on general approaches to decontamination practices rather than on detailed protocols and methods. The principles of sterilization and disinfection are discussed and compared in the context of the decontamination procedures used in laboratories.
Laboratory workers who ship or transport dangerous goods, in general, and diagnostic specimens and infectious substances, in particular, by a commercial land or air carrier are required to follow a complex and often confusing set of national and international regulations and requirements. The purpose of these regulations and requirements is to protect the public, emergency responders, laboratory workers, and personnel involved in the transportation industry from accidental exposure to the contents of the packages (1–3).
The past few decades have seen a consistent evolution of approaches to safety and security risk management across a diversity of industries. A U.S. National Academies of Sciences panel reviewing safety culture in academic chemistry laboratories (1) summarized, from the safety science literature, three “epochs” that arose in response to accidents: (i) technology, (ii) systems, and (iii) culture.
The purpose of an occupational medicine program is to promote a safe and healthy workplace through the provision of work-related medical services. In a biomedical research setting that involves biohazardous materials, those services should include a preplacement medical evaluation, job-specific counseling and immunizations, and a practical plan for responding to suspected exposures to workplace health hazards and caring for work-related injuries. Before discussing these core elements of an occupational medical program, a review of the prerequisites for these services is in order.
A biosafety program consists of many components, which are determined by the type of research being conducted at the institution as well as by current regulations and guidelines. A biosafety program may include oversight of blood-borne pathogens, research involving infectious materials or recombinant or synthetic nucleic acid molecules (rDNA), biosafety cabinet (BSC) certifications, and/or high-containment laboratories and select agents. Many institutions will not have all elements, but a means is needed to evaluate the success of each component present.
Training prepares people to behave, and it is behavior that connects plans to desired outcomes. The four phases of biological risk mitigation are (i) risk identification, (ii) risk assessment, (iii) risk management, and (iv) risk communication. During the risk identification process, both the agent and the processes of working with the agent are reviewed to determine risk, which is then assessed and managed primarily through the development of standard operating procedures (SOPs). However, the greatest, and most often overlooked, risk is the people who interact with agents on a daily basis. How these individuals perceive laboratory risks—their experiences, educational levels, comfort, and skills—and the culture of the organization in which they work influence safety attitudes and behaviors. SOPs provide the process to achieve the desired outcome, and training ensures consistency of behavior among many individuals with vast differences in education and experience.
Biological agents have been documented as instruments of warfare and terror (bioterrorism) to produce fear and harm in vulnerable and susceptible populations for thousands of years. The ultimate goal for those using these agents was to inflict harm upon selected individuals or the general human population as well as upon animals and plants (1, 2). The Federal Bureau of Investigation (FBI) defines terrorism as the “unlawful use of force against persons or property to intimidate or coerce a government, the civilian population, or any segment thereof, in the furtherance of political or social objectives” (3). Basically, bioterrorism is a form of biological warfare. Biological warfare is the intentional use of etiologic agents, such as viruses, bacteria, fungi, or toxins derived from living organisms, to produce death or disease in humans, animals, or plants (4). An etiologic agent is “a viable microorganism or its toxin that causes or may cause human disease, and includes those agents listed in 42 C.F.R. 72.3 of the U.S. Department of Health and Human Services (DHHS) regulations and any material of biologic origin that poses a degree of hazard similar to those organisms” (5). A toxin, also included as an etiologic agent, is defined as “toxic material of biologic origin that has been isolated from the parent organism. The toxic material of plants, animals, or microorganisms” (6). Potential agents that could be used in a bioterrorist event include those causing the diseases anthrax (Bacillus anthracis), plague (Yersinia pestis), tularemia (Francisella tularensis), the equine encephalitides (Venezuelan equine encephalitis and eastern equine encephalitis), hemorrhagic fever viruses (arenaviruses, filoviruses, flaviviruses, and bunyaviruses), and variola virus (smallpox).
Some of the toxins that could be used in a bioterrorism event include botulinum toxin from Clostridium botulinum; ricin toxin from the castor bean Ricinus communis; the trichothecene mycotoxins from Fusarium, Myrothecium, Trichoderma, Stachybotrys, and other filamentous fungi; staphylococcal enterotoxins from Staphylococcus aureus; and the toxins from marine organisms such as dinoflagellates, shellfish, and blue-green algae. The list of potential etiologic agents is quite extensive (7). However, the list of agents that could cause mass casualties by the aerosol route of exposure is considerably smaller (8–17).
Any discussion of biological safety within undergraduate basic science and clinical teaching laboratories should be prefaced by several important caveats. First, the foundation of any science laboratory experience should include a solid understanding of safety. Practicing safety teaches responsibility and respect for life and property. Regardless of unique institutional conditions and oversight agencies, adherence to common safety and biosafety practices demonstrates a good-faith effort to students, parents, faculty, administrators, and even accreditors, that academic laboratory users are valued. Additionally, most safety practices were developed in response to documented need and thus serve to mitigate hazards. At its core, the teaching laboratory should be an environment where students are challenged by the science of microbiology, not by fear of infection. Furthermore, as the science of microbiology evolves, time-tested biosafety practices continue as a microbiology legacy, passed down to subsequent generations of microbiologists as résumé staples.
Pharmaceutical companies that employ pathogenic microorganisms to produce vaccines and sometimes pharmaceuticals must establish a broad range of biosafety practices to ensure the safety of their employees as well as their products. At the drug discovery stage, especially during the search for candidates from natural sources, these safety practices must allow the research laboratories to cultivate myriad microorganisms, many of which are initially unknown. During scale-up, the biosafety practices employed should be in harmony with international guidelines to ensure that the manufacturing process and product may be implemented and sold, respectively, in other countries. Because the biosafety concerns experienced in pharmaceutical research laboratories are quite similar to those discussed in earlier chapters, they will not be repeated here. Instead, this chapter briefly addresses the biosafety challenges commonly experienced in cultivating recombinant and pathogenic microbes. The use of mammalian cells for the production of therapeutic proteins and viruses will also be addressed.
The notion of scale-up or large-scale processing of microorganisms is currently associated with recombinant DNA (rDNA) technology, but in fact it has been common practice for many years. Microorganisms have been scaled up for the manufacture of foods and beverages for centuries. In the past hundred years, the large-scale production of antibiotics, vaccines, and biological products has become commonplace. The relative numbers of laboratory-acquired infections from the production environment are extremely low, approximately 3.4% of the total numbers documented (1). Part of the reason for these low numbers may be the reduction in virulence of the cultured organism, but they are most likely attributable to the extensive use of primary and secondary containment barriers, i.e., containment equipment and facilities, which are generally required to maintain the integrity of the product.
Similar to human clinical microbiology laboratories, the work performed in veterinary diagnostic laboratories poses inherent risk to laboratory workers. According to the 2008 American Veterinary Medical Association “One Health Initiative” task force report (www.avma.org/onehealth), 60% of infectious diseases in humans are due to multihost pathogens that move across species lines (1). Over the last 30 years, 75% of emerging human pathogens (e.g., West Nile fever, avian influenza, Lyme disease) have been zoonotic (transmitted between animals and humans) (2). Thus, veterinary diagnostic laboratorians are at risk for laboratory-acquired infections (LAIs) from multihost pathogens.
The food and agriculture industry in some countries is often a concentrated, highly accessible, vertically integrated, global, and complex system that relies on a sophisticated agricultural infrastructure. These characteristics make some agricultural systems very productive and efficient; however, these same qualities make this industry inherently vulnerable to foreign/transboundary animal, emerging, and zoonotic disease outbreaks that could threaten the stability of the economy, food security, and the nation’s public health. Thus, there is a continuing need for basic and applied research in agricultural biosafety and biosecurity to be adequately supported, so that the agricultural system remains productive, economical, and, most of all, safe (food security).
Biosafety evaluations of plant research require the assessment and analysis of the plants alone as well as of the biological organisms associated with the plants, either naturally or introduced in planned experiments. Hence, the term “plant” refers to both the plant and its associated biological organisms. Plant research discussed in this chapter is conducted in specialized facilities that allow for plant growth and manipulation, collectively referred to here as containment facilities. Such facilities may be greenhouses, growth chambers, and modified laboratories that serve as places for growing plants under controlled conditions. Some of these facilities are specialized to isolate plants from biotic risks in the environment or to control fluctuations in abiotic or environmental factors. Although a particular plant can exist in an environment with wide variability in ambient temperature, light, nutrition, and other essential growth components, environmental conditions must be controlled to ensure scientific reproducibility. It is generally accepted that reducing variability, in this case by controlling environmental conditions, results in better scientific predictability. Furthermore, these actions enable other researchers to reproduce the experiments.
Discussions of biosafety in the laboratory setting can be found in numerous texts, including other chapters in this book. Several published guidelines exist, with the Biosafety in Microbiological and Biomedical Laboratories (BMBL) (1) text serving as the general authority. Although a wealth of guidelines is available for biosafety in the laboratory, only a handful of references address biosafety considerations for individuals performing fieldwork, i.e., work done outside a laboratory with materials containing or potentially containing infectious agents. Moreover, no formal text describes structured risk assessment strategies for fieldwork that either incidentally or intentionally involves contact with zoonotic agents that have pathogenic potential. This text is meant to serve as a reference for existing guidelines, as well as a tool that can be used to help determine what risk levels exist for a planned field activity and how those risks may be mitigated. Protection of fieldworkers should be the prime focus of both the supervisor and the workers themselves.
The risk of exposure to infectious agents exists in every clinical laboratory; however, the goal of every clinical laboratory should be to minimize that risk and conduct its activities as safely as possible. To achieve this, a strong culture of biosafety must be in place. The biosafety culture depends on the opinions, beliefs, views, and feelings of all the laboratory staff. When there is a strong culture of biosafety, then every employee accepts responsibility and is accountable to maintain biosafety practices that protect both the employee and coworkers. The major question is how to achieve a strong culture of biosafety. Management has the responsibility to build and sustain that culture. This chapter will explain the required components.
The number of biosafety level 4 (BSL4) maximum-containment laboratories (facilities) worldwide has increased significantly. In the early 1980s, only two such laboratories existed in North America, one at the Centers for Disease Control and Prevention (CDC) in Atlanta, GA, and the other at the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID), Fort Detrick, MD. By early 2005, there were at least six operational BSL4-capable laboratories in the United States and over a dozen worldwide (1). As of September 2011, there are at least 13 operational or planned BSL4 facilities within the United States (see also http://fas.org/programs/bio/research.html#USBSL4). Canada also has operational BSL4 laboratories in Winnipeg, Manitoba, for the study of both human and animal disease agents. Worldwide there are at least 27 operational BSL4 facilities (2).