
Category: Applied and Industrial Microbiology; Food Microbiology
Growing public concern over the safety of our food supply has fueled the research and development of new methods to detect foodborne pathogens as quickly and as early as possible. This reader-friendly reference examines the latest proven methods for rapid detection of foodborne pathogens. Organized by food commodity, this unique book enables readers to choose the most effective and efficient method, assemble the necessary resources, and implement the method seamlessly, avoiding common pitfalls.
Rapid Detection, Characterization, and Enumeration of Foodborne Pathogens is organized into seven sections. The first two sections review the latest laboratory technologies designed to accelerate test results, and explain the issues that labs need to consider in order to effectively implement rapid detection methods. The next four sections are organized according to commodity and food production lines, enabling readers to easily find the best pathogen detection methods for their needs. For each food line, the book sets forth the rapid methods that can detect important target pathogens. The final section looks to the future, detailing research needs and emerging areas of rapid detection of foodborne pathogens.
More than 85 experts from research centers worldwide provide guidelines for faster, user-friendly, and cost-effective foodborne pathogen detection. Their advice is based not only on a thorough review of the current literature but also on their own first-hand laboratory experience. As a result, readers can confidently turn to this book to minimize the risk of pathogen-contaminated foods reaching consumers.
Hardcover, 443 pages, illustrations, index.
This chapter covers the basic arguments for why individuals are increasingly turning to rapid methods to meet microbiological testing needs in all facets of food production, including regulatory compliance and food-related outbreaks. Rapid methods are defined in the chapter as alternative microbiological testing methods that are able to provide reliable test results in a shorter time than conventional culture-based methods. Rapid methods offer the promise of improving the risk assessment and modeling process by increasing the amount of data and information available for use in developing risk assessments and by reducing the uncertainty associated with the data that are used. Conventional methods for microbial testing of foods, by contrast, often involve timescales and volumes for information generation that are incompatible with responsive decision making.
To overcome sensitivity and specificity issues, current confirmatory methods for foodborne pathogen detection generally require an initial, time-consuming growth step in culture media, followed by isolation on solid media, biochemical identification, and molecular or serological confirmation. Rapid-detection-based technologies can reduce the time and labor involved in screening food products for the presence of pathogens. Many of the rapid tests can be completed within 24 h, with high throughput, thereby reducing the labor involved in the testing process. These assays can be broadly grouped into three categories: immunologically based methods, nucleic acid-based assays, and biosensors. This review focuses on methods to isolate and detect pathogens in food samples. The presence of pathogens in air and the airborne transmission of infections are intriguing phenomena that, although subject to never-ending debate, play prominent epidemiological roles in animal husbandry and in the transmission of zoonotic microorganisms from the primary sources of infection, i.e., animals. The detection of microorganisms in air has traditionally been accomplished by sampling airborne particles and then analyzing the samples by a wide variety of detection methods. Principles of air sampling include solid and liquid impaction, filter-based sampling, and electrostatic precipitation.
Analysis of food samples by cultural and enrichment techniques remains an integral part of the examination of food for the presence or enumeration of foodborne pathogens. Enrichment methods have afforded the ability to detect as few as one cell per 500 g of food. In addition, enrichment techniques aid in the recovery of injured bacteria, which are often present in foods in a stressed condition because some foods may lack the optimal nutrients essential for bacterial growth, the food may present an environment not conducive to growth of the microorganisms, or the bacteria may be damaged during processing of the food. In this regard, the enrichment process provides an optimal growth medium and a period of time that injured or stressed microorganisms can use to repair cellular damage, allowing resuscitation of the bacterial cells. Additionally, enrichment processes provide an opportunity for target bacterial cells to proliferate while the growth of competing background microflora is suppressed. This chapter describes many differential and selective media that have been developed in response to bacterial pathogens capable of causing illness in humans; these media exploit unique phenotypic characteristics of the target microorganisms to support detection platforms based on cultural approaches. The selective media described include enterohemorrhagic Escherichia coli (EHEC) media, Listeria media, Salmonella media, Campylobacter media, Vibrio media, and enumeration media. Several groups have investigated the use of a single broth to simultaneously enrich the concentrations of several bacterial species, followed by downstream detection.
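The leverage that enrichment provides comes from exponential growth. The following minimal sketch, with an assumed doubling time and an assumed (hypothetical) threshold of roughly 10^4 cells/ml for downstream molecular detection, illustrates how quickly even a single recovered cell can be amplified to detectable numbers:

```python
import math

def enrichment_time_hours(initial_cells, target_cells, doubling_time_min):
    """Hours of exponential growth needed to go from initial_cells to
    target_cells, assuming a constant doubling time and no lag phase."""
    doublings = math.log2(target_cells / initial_cells)
    return doublings * doubling_time_min / 60.0

# Illustrative values only: one cell recovered from a 500-g sample,
# grown toward ~10^4 cells/ml with an assumed 30-min doubling time.
print(f"{enrichment_time_hours(1, 1e4, 30):.1f} h")  # ~6.6 h under these assumptions
```

In practice, lag phase, injury repair, and competition from background microflora lengthen this considerably, which is why overnight enrichment remains typical.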
Microbial communities in food and food production establishments have highly variable structures shaped by various extrinsic factors. The information gained by large-scale microbial community analysis not only can deepen the understanding of food microbiology in general but also can lead to improvement of food production systems for increased quality and safety and extension of shelf life. This chapter reviews the most recent methods applied to microbial communities and presents examples of some cutting-edge technologies. Molecular methods have been developing rapidly in recent years, both for specific detection of single species and for screening assays that allow the species composition of a given food sample to be unraveled. Examples of these methods are 16S rRNA clone analysis, fingerprinting methods such as terminal restriction fragment length polymorphism (t-RFLP) and denaturing gradient gel electrophoresis (DGGE), tag-encoded pyrosequencing, single-nucleotide primer extension (SNuPE), and microarrays. Flow cytometry is also addressed in the chapter; this technique is based on single-cell analysis, whereby a cell suspension is focused into a narrow fluid stream and analyzed by laser technology. High-throughput analysis of microbial populations in food products and food processing environments has revealed a higher complexity in the microbial world than previously expected. The new approaches provide opportunities for further understanding of the microbial developments that are initiated during food production and storage.
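Whatever the profiling technology, community analyses ultimately reduce to taxon abundance tables, which are then summarized with diversity measures. As an illustration (the read counts below are hypothetical, not data from the chapter), the Shannon index H' = -Σ p_i ln p_i is one common single-number summary of community complexity:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxon proportions."""
    total = sum(counts)
    proportions = (c / total for c in counts if c > 0)
    return -sum(p * math.log(p) for p in proportions)

# Hypothetical OTU read counts for two dairy samples.
raw_milk = [520, 130, 90, 45, 10, 5]
pasteurized_milk = [780, 15, 5]
print(f"raw milk:         H' = {shannon_index(raw_milk):.2f}")
print(f"pasteurized milk: H' = {shannon_index(pasteurized_milk):.2f}")
```

The more even, species-rich sample yields the higher index, matching the intuition that processing steps such as pasteurization flatten community structure.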
As our understanding of pathogens has expanded and the world’s food supply has become more interconnected, characterization and subtyping of foodborne pathogens have become critically important in monitoring specific foods, performing surveillance for foodborne pathogens, investigating foodborne outbreaks, and understanding the virulence properties of particular lineages or strains. This chapter provides an overview of the molecular subtyping methods most commonly employed, in particular focusing on those methods providing fast and high-throughput analysis of strains. The methods include pulsed-field gel electrophoresis (PFGE), multilocus sequence typing (MLST), multilocus variable-number tandem repeat (VNTR) analysis, DNA microarrays, and mass spectrometry. The heterogeneity targeted by these methods falls into three general categories. First, single-nucleotide polymorphisms (SNPs) are single-base changes in one sequence in comparison to another, related sequence. SNPs typically arise through sporadic mutation, so they represent a means of measuring random genetic drift; a large number of SNPs between two sequences may indicate a more distant genetic relationship. Second, some regions of the genome are more likely to change than others; these provide a means of differentiating closely related strains. Examples include variable-number tandem repeats and highly polymorphic genes, such as some surface structure and virulence genes. Finally, differences in the presence or absence of specific loci can be found between both distantly and closely related strains. The presence of additional sequences in a strain can result from the insertion of a bacteriophage in the chromosome, acquisition of a plasmid, or other horizontal transfer events.
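The SNP-counting logic is simple enough to show directly. The toy sketch below (hypothetical aligned fragments, not data from the chapter) counts single-base differences between aligned sequences; more differences suggest a more distant relationship:

```python
def snp_distance(seq_a, seq_b):
    """Count single-nucleotide differences between two aligned sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    return sum(a != b for a, b in zip(seq_a, seq_b))

# Strain B differs from strain A at 1 site, strain C at 4 sites,
# so C is likely the more distant relative of A.
strain_a = "ATGGCTAACGTTAGCCTA"
strain_b = "ATGGCTAACGATAGCCTA"
strain_c = "ATGACTTACGATAGACTA"
print(snp_distance(strain_a, strain_b), snp_distance(strain_a, strain_c))  # 1 4
```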
Consequences of poorly collected, incorrectly transported, and inadequately prepared samples include failure to recover and identify the foodborne pathogens or toxins present and misidentification of potential hazards or sources of contamination. Investigations should also extend beyond the immediate preparation of the implicated food to the processing chain environment from farm to fork, including sources of ingredients, processing and storage environments, and transportation. This chapter focuses on the foodborne illness outbreak scenario, and much of it is applicable to validation procedures relevant to hazard analysis and critical control point (HACCP) systems or to situations in which a customer complaint is received. In terms of food safety microbiology, a sample may be considered a portion of food, surface, or air that is representative of a larger matrix. Sample collection methods include contact plates, destructive samples (food/excision), food rinsates, cotton swabs, premoistened sponges, wipes, sterile tongue depressors, and air sampling vacuums (agar plate/filter). When examining a food processing plant, sampling should initially be based on the HACCP plan. The chapter highlights the main problem with sampling surfaces: the swab method may remove only a fraction of the microbial flora from a carcass compared to an excision sample. It is critical that professionals involved in public health have well-planned strategies to deal efficiently with foodborne outbreak scenarios, including rapid sampling techniques, sampling plans, rapid detection, and molecular subtyping protocols for the major foodborne pathogens.
This chapter describes the assessment of the microbiological safety of foods and the assessment of microbiological quality. Even with the widespread implementation of preventive strategies such as hazard analysis and critical control point (HACCP) and related food safety management strategies, there are still many situations in which food safety assurance relies mainly on microbiological testing. End product testing might be needed in some circumstances, for example, when there are no critical control points in a process (e.g., raw or minimally processed ready-to-eat foods) or when the history of a product is unknown. Equally, food safety objectives and performance objectives may nominate microbiological criteria to be met. The chapter presents basic concepts of sampling plan nomenclature, design, and interpretation to complement the rapid microbiological detection, identification, and enumeration technologies. Additionally, it presents recently introduced concepts concerning the numerical interpretation of microbiological presence/absence testing and a discussion of sample compositing.
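To make the sampling plan concepts concrete: in the usual two-class attributes nomenclature, a lot is sampled n times and accepted if no more than c sample units exceed the limit m. The operating characteristic of such a plan follows directly from the binomial distribution. The sketch below is a generic illustration; the plan parameters are invented, not taken from the chapter:

```python
from math import comb

def p_accept(n, c, p_defective):
    """Probability that a two-class attributes plan (n sample units,
    at most c allowed to exceed the limit m) accepts a lot in which a
    fraction p_defective of units would exceed that limit."""
    return sum(comb(n, k) * p_defective**k * (1 - p_defective)**(n - k)
               for k in range(c + 1))

# Hypothetical zero-tolerance plan: n = 5, c = 0.
for p in (0.01, 0.05, 0.10, 0.20):
    print(f"fraction defective {p:.2f} -> P(accept) = {p_accept(5, 0, p):.3f}")
```

Even at 10% defective units this plan accepts the lot about 59% of the time, which is one reason end product testing alone cannot assure safety and must be complemented by preventive measures.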
Sample preparation methods aim at separating the target cells or virus particles from the surrounding food matrix. Downstream methods, such as cell disruption, analyte purification, and detection via, for example, real-time PCR, have to be considered individually when designing or choosing a suitable preparation method. Sample treatment is defined as the preanalytical step in the method protocol; it is also necessary for reducing the sample volume while maintaining, as far as possible, the initial target number. This chapter first describes optimal performance characteristics of sample preparation. Next, the chapter covers practical solutions for physical, biochemical, and biological separation methods. It then discusses chemical and enzymatic digestion of the food sample matrix for preseparation. Analyte extraction is the subsequent step following sample preparation; it is necessary to obtain access to the molecules that are the targets of molecular detection assays such as real-time PCR. In order to gain access to the analyte molecules, nucleic acids, or proteins, the integrity of the target cells has to be destroyed in most cases. Since analyte extraction as part of the detection chain in food pathogen detection is similar to its application in basic research, only a brief summary is given.
Microbiological food safety is achieved by strict adherence to Good Manufacturing Practices and implementation of adequate control measures in the production process to prevent the entry, survival, and/or growth of pathogens. The application of rapid methods can be a great advantage in responding quickly to outbreaks, rejecting contaminated raw materials before they enter the production chain, and rapidly identifying process failures so that corrective measures can be taken. This chapter provides an overview of criteria that can be useful in judging the suitability of a method. The criteria include reliability, validation, acceptance by public health authorities and trade partners, time to result, costs, laboratory space, sample throughput, shelf life of reagents, competency level and workflow compatibility, and supply chain confidence. The chapter provides some general examples of the relative importance of these criteria for different applications, but it is up to the user to define which criteria are really essential and which are only desirable. Once the selection criteria are established, the next challenge is to find reliable information on how candidate methods score with respect to the requirements.
Molecular methods, including nucleic acid-based techniques such as PCR and antibody-based techniques such as enzyme-linked immunosorbent assay (ELISA), are alternative methods for routine analysis of foodborne pathogens. Inclusion of proper analytical controls is an absolute prerequisite for successful implementation of molecular methods for diagnostic purposes. This chapter describes the selection and inclusion of analytical controls and their use, interpretation, applications, and limitations, using PCR and ELISA techniques as examples. It also touches upon the inclusion of controls associated with biosensors and lab-on-a-chip devices. Negative and positive controls are generally included in most molecular methods and are used to verify that the assay is able to detect the target and to rule out the risk of cross-contamination during the analysis step. The chapter describes the specific controls needed for PCR and antibody-based methods. Another important type of control is the process control, whose purpose is to verify that all steps in the detection process work as intended. These steps can include sample treatment and nucleic acid extraction, together with amplification and detection. There are two types of process controls: the positive process control and the negative process control. In addition, the positive process control can be combined with an internal amplification control (IAC) by addition of a suitable strain or DNA at the beginning of the analysis chain. Finally, the chapter discusses quantification controls used in real-time PCR and antibody-based methods.
The need for the food industry to rapidly assess the microbiological quality of raw materials and finished products and the microbiological status of manufacturing procedures has led to the development and refinement of alternative methods that are faster and easier to perform than the corresponding culture-based methods. This chapter outlines general validation procedures for qualitative and quantitative methods. The validation of qualitative and quantitative methods comprises two phases. Phase A, ‘‘Comparative study by an expert laboratory of the alternative method against a reference method,’’ is sometimes referred to as in-house validation by an expert laboratory. Phase B, ‘‘Collaborative study of the alternative method,’’ is carried out in a ring trial organized by the same expert laboratory as in Phase A. The selection of food categories and types used within the validation will depend on the type or group of microorganisms and the scope of the validation. Validation can be carried out for a restricted number of food categories, e.g., meat products and milk and dairy products, in which case only those categories need to be studied. In addition to food categories, feed and environmental samples and samples from feces and primary production can be included.
Microbiological data on foodborne pathogens used in research for risk assessment or risk mitigation should be accurate and reliable to the specified level of the underlying measurement performance characterization. This chapter reviews standard statistical approaches to performance characterization for detection and enumeration, with emphasis on validity aspects. It describes the most important statistical measures of accuracy specified by the International Organization for Standardization (ISO 16140) for validation of qualitative and quantitative alternative methods. The chapter also briefly describes performance characterization with regard to precision (interlaboratory reproducibility). Issues related to the limit of detection (LOD), sometimes referred to as analytical sensitivity, are relevant for both classical and alternative methods for detection and enumeration. Data generated using methods with an LOD include nondetects, i.e., false-negative results, and are referred to as censored data, because only values that are greater than the LOD can be reliably observed. Ignoring the censoring problem, as well as substituting nondetects with an arbitrary value selected from a range between zero and the LOD, leads to biases. New technologies have emerged in response to the need for fast, cost-efficient, and high-throughput devices for the detection and classification of foodborne microorganisms. The chapter finally reviews some applications of statistical methods and provides references for further reading.
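The substitution bias mentioned above is easy to demonstrate by simulation. The sketch below (all parameter values hypothetical) compares the naive LOD/2 substitution with a censored maximum-likelihood fit, in which each nondetect contributes P(X < LOD) to the likelihood:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(seed=1)

# Hypothetical data: true log10 concentrations ~ Normal(mu=1.0, sd=0.8),
# with an LOD of 10 CFU/g, i.e., 1.0 on the log10 scale.
mu_true, sd_true, lod = 1.0, 0.8, 1.0
x = rng.normal(mu_true, sd_true, 500)
detected = x >= lod

# Naive approach: substitute nondetects with LOD/2 (log10(5) ~ 0.70).
x_sub = np.where(detected, x, np.log10(5.0))
print(f"substitution estimate of mean: {x_sub.mean():.2f} (true {mu_true})")

# Censored MLE: each nondetect contributes P(X < LOD) to the likelihood.
def negloglik(params):
    mu, log_sd = params
    sd = np.exp(log_sd)  # parameterized so sd stays positive
    ll = norm.logpdf(x[detected], mu, sd).sum()
    ll += (~detected).sum() * norm.logcdf(lod, mu, sd)
    return -ll

fit = minimize(negloglik, x0=[0.0, 0.0])
print(f"censored MLE of mean: {fit.x[0]:.2f} (true {mu_true})")
```

The substitution estimate is biased (here upward), while the censored MLE recovers the true mean; the direction and size of the bias depend on the censoring fraction and the substituted value.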
The contribution of various foods and food categories to the occurrence of foodborne cases of human salmonellosis varies between countries, depending on the prevalence of different Salmonella serovars in food animals and in their various food production chains, as well as on consumption habits and food preparation practices. Most contaminated eggs carry Salmonella on the shell surface only, and therefore eggs are usually sanitized by a variety of methods and agents. Meat is another important source of foodborne salmonellosis, with poultry and pork implicated more often than beef and lamb. Methods to sequester target pathogenic bacteria from interfering food components and to concentrate them in small volumes are needed to enable the efficient application of rapid detection and identification methods. For rapid detection of Salmonella in foods, including meat and eggs, three basic analytical principles are applied in practice: modified traditional culture methods, immunological methods, and nucleic acid-based methods.
Foodborne bacteria transmitted to humans via contaminated pork have a major impact, especially in developed countries. Among these bacteria, Yersinia enterocolitica has received one of the highest risk rankings for pork consumers, and consumption of pork has been associated with Y. enterocolitica infections in epidemiological studies. Raw pork products have been widely investigated because of the link between Y. enterocolitica and pigs. Serological analysis can be used to estimate the prevalence of Y. enterocolitica in pig herds. The method has an advantage further down the meat chain, since infection of pigs during transport and in the lairage will not interfere with the results; even cross-contamination during slaughtering and dressing will not affect the result. Not only blood but also muscle fluid can be used as the basis for the analysis. Real-time PCR is a powerful advancement of the basic PCR technique. A future challenge for sample preparation is to design PCR protocols that integrate DNA extraction and amplification in an automated manner. Additionally, high-throughput technologies such as DNA microarrays, mass spectrometry-coupled PCR, and whole-genome sequencing platforms will strengthen the portfolio of molecular diagnostic tools for pathogenic Yersinia spp.
Campylobacter infection has become one of the most important zoonoses worldwide. A low prevalence of Campylobacter is generally found in beef and pork at retail, although these may still be sources of infection. Given the high prevalence of poultry-associated infections, this chapter mainly focuses on rapid methods for detection of Campylobacter in this particular production chain, describing the routes of transmission and sampling at the different levels as well as intervention strategies. The chapter covers the introduction, infection dynamics, and sampling of Campylobacter throughout the poultry production chain, from farm to consumer level. It also describes culture-based, immunological, and molecular methods for rapid detection, characterization, and enumeration of Campylobacter. Rapid methods can generally also be more sensitive and specific than culture-based methods, and other advantages include a high potential for automation and the ability to detect viable but nonculturable (VBNC) cells. The strength of rapid methods lies in their ability to screen large numbers of samples and identify the negative ones, allowing resources to be focused on confirming and culturing presumptive positive samples to produce isolates for further characterization. The choice of a rapid method will always depend on the information requested and will be influenced by the relevant matrix and the expected level of contamination.
Shiga toxin-producing Escherichia coli (STEC) strains are important foodborne pathogens responsible for a number of human gastrointestinal diseases, including watery or bloody diarrhea and hemorrhagic colitis (HC). In a proportion of individuals, mainly children and the elderly, these symptoms may be complicated by neurological and renal sequelae, including hemolytic-uremic syndrome (HUS). Most outbreaks and sporadic cases of HC and HUS have been attributed to STEC strains of serotype O157:H7. However, especially in continental Europe, STEC strains belonging to serotypes O26:H11/H-, O91:H21/H-, O103:H2, O111:H-, O113:H21, O121:H19, O128:H2/H-, and O145:H28/H- are increasingly reported as causes of HC and HUS. These STEC strains are collectively termed non-O157. Clinical laboratories historically screened only for serogroup O157, leading to possible underreporting of non-O157 STEC-associated disease. The main virulence factor of STEC is the production of Shiga toxin 1 (Stx1) and/or Stx2 or its variants. Because the minimal infectious dose of STEC is very low, qualitative procedures with enrichment steps are usually established. Owing to significant differences between methods for detection of STEC O157 and non-O157, these two groups are discussed separately. Cultural methods for detection of STEC O157:H7 in food samples basically comprise a combination of an enrichment step and a serogroup-specific concentration step, with plating on selective agar and/or chromogenic solid media, followed by confirmation of presumptive positive colonies by biochemical and serological testing as well as by molecular methods. Detection methods for non-O157 STEC include culture methods, enzyme-linked immunosorbent assay (ELISA), and PCR.
Food safety concerns and a rapidly changing global market for animal feed and feed ingredients increase the need and urgency for development of sensitive, accurate, precise, fast, and economically feasible testing methods for multiple mycotoxins and Salmonella. Rapid methods are entering the market and being developed at an astounding rate, as evidenced by the volume of recent reviews. Traditional methods are still widely used, but kits for rapid detection are gaining validation status from standardizing organizations. Many issues remain to be resolved, including very significant challenges related to representative sample collection to ensure that no false negatives occur. Sampling plans will need to be globally harmonized for consistency but must be balanced with equity concerns regarding economic impact and food security, in the sense of an adequate and safe food supply for developing countries. Sample processing and analysis still present limitations for detection of multiple mycotoxins, masked mycotoxins, and metabolically injured (viable but nonculturable) Salmonella.
The realization that listeriosis was a foodborne disease was largely due to the involvement of Listeria monocytogenes-contaminated dairy products in the earliest recognized foodborne outbreaks. Thus, dairy sample matrices are routinely included in developing and validating rapid methods for L. monocytogenes in all food categories. Dairy samples represent a full range of physical characteristics, from fluid (milk) through semisolid (yogurt) to soft solid (butter, cream, some cheeses) and hard solid (other cheeses). Environmental sampling can be physically simple but may be complicated by the degree of attachment of cells to surfaces and their sequestration in biofilm matrices. Sampling of the production and processing environment can be a useful tool for identifying and preventing the presence of L. monocytogenes in dairy products and also for validating and verifying the correct functioning of procedures based on hazard analysis and critical control point (HACCP) principles and good hygiene practice. Proper sample preparation is critical for rapid methods of Listeria detection and enumeration. This chapter presents guidelines for the various general types of dairy samples. For detection of L. monocytogenes, "rapid" can be interpreted as a reduction in the number of days to obtain a result; the process may still take a relatively long time, i.e., days rather than hours. These rapid methods include (i) agar-based methods, including chromogenic agars; (ii) immunoassay-based methods; (iii) concentration methods; and (iv) molecular methods.
Bacillus cereus is a gram-positive, spore-forming, motile, aerobic rod that also grows well anaerobically. It is a common soil saprophyte and is easily spread to many types of foods, especially those of plant origin, but it is also frequently isolated from dairy products, meat, and eggs. Members of the B. cereus group are a special problem for the dairy industry and are frequently found in pasteurized milk and milk-derived products, such as milk powder and infant formulas. Besides a general overview of novel methods for detection and enumeration of B. cereus, this chapter focuses on recently developed methods for the detection and quantification of the B. cereus toxins. It briefly discusses novel methods to determine the toxigenic potential of a strain and presents the current methods for subtyping of B. cereus group organisms. It also provides an overview of the different possibilities for detection and enumeration of B. cereus in food samples. Several DNA extraction methods, including commercially available DNA isolation kits, have been tested for their suitability for extracting B. cereus DNA from milk and milk products as well as from other foodstuffs during the development of molecular detection systems for emetic B. cereus. An overview of the detection limits for DNA prepared by different extraction methods and subjected to standard PCR and real-time PCR is given.
Staphylococcus aureus is an important pathogen causing a variety of diseases. Some strains are able to produce staphylococcal enterotoxins (SEs), causing food intoxications. Two different aspects are of importance concerning the presence of S. aureus in the dairy chain: (i) clinical and subclinical infections of cows, which not only pose a health risk via shedding of S. aureus in milk but also are an economic factor in dairy farming; and (ii) transmission into the dairy chain and production of enterotoxins leading to human food poisoning. There are five major classical SE types causing food poisoning, i.e., SEA, SEB, SEC, SED, and SEE, as well as several newly identified SEs. Rapid detection and quantification methods could serve as valuable tools for early indication of process failures and as part of hazard analysis and critical control point concepts. The trend in S. aureus enterotoxin detection and quantification is driven by the desire to reduce the time from sampling to identification and confirmation in the laboratory. Commercially available rapid methods (e.g., classical enzyme-linked immunosorbent assays (ELISAs)) can detect approximately 1.0 ng of toxin per g of food. A further recent development is a real-time immunoquantitative PCR method for the detection of SEs. For further characterization of enterotoxigenic S. aureus, a number of techniques have been developed recently, including, for example, DNA microarrays and subtyping methods. Characterization and subtyping methods facilitate tracking and tracing of the contamination and transmission routes of S. aureus in the dairy chain.
Cronobacter species are occasional contaminants of powdered infant formula (PIF); consequently, this particular food type presents a significant health risk to vulnerable neonates. Microbiological criteria are designed to control the concentration and prevalence of bacteria in foods. These criteria specify the microorganism of concern, the analytical method for its detection/quantification, a sampling plan defining the number of samples to be taken and the size of the analytical unit, the microbiological limits deemed appropriate at a specified point in the food chain, and the number of analytical units that should conform to those limits. Generally, rapid methods are understood to mean the application of a molecular or instrument-based technique to enable pathogen detection in hours rather than days. In common with Listeria monocytogenes, rapid detection of Cronobacter species can be interpreted to mean improving conventional plating methods, with or without molecular methods for confirmation. Molecular subtyping methods have been used to fingerprint clinical Enterobacteriaceae isolates associated with nosocomial infections and foodborne outbreaks. These protocols help extend our understanding of microbial ecology and epidemiology, and they are regarded as useful tools for monitoring foodborne disease.
Fresh produce is today viewed as highly vulnerable to pathogen contamination: by virtue of its cultivation, handling, and consumption practices, it is prone to contamination and can therefore become a vehicle for widespread foodborne outbreaks. This chapter highlights the challenges associated with monitoring for pathogens in fresh produce. For both food- and waterborne pathogens, food hygiene guidelines go hand in hand with agricultural water guidelines; probably the single most important source of contamination for fresh produce is water, either irrigation water or water used during postharvest processing. The chapter discusses some contemporary approaches by which certain regions in the United States are attempting to reduce the potential for outbreaks from fresh produce. It includes a description of the key pathogens that have been associated with produce-related outbreaks and provides an overview of some of the indicator organisms that can be used to screen for the presence of fecal contamination.
Viruses such as hepatitis A virus (HAV), noroviruses (NoV), sapoviruses, enteroviruses, astroviruses, adenoviruses, rotaviruses, and hepatitis E virus have all been implicated in food- and/or waterborne outbreaks of illness. This chapter deals with NoV and HAV detection in bivalve mollusks, soft fruits, and water. For bivalve mollusks, fecal indicators are measured either in the shellfish themselves or in their growing waters. Any discussion of virus detection procedures raises the recurrent issue of whether infectious particles or merely physical particles are being detected; whenever possible, infectivity assays coupled with identification methods are preferred for direct assessment of human health risk. Nucleic acid amplification techniques are currently the most widely used methods for detection of viruses in food and water. They also enable investigators to gather information on the virus genotypes occurring in the environment and in food products, providing epidemiological information that is particularly important for the implementation and follow-up of vaccination programs in the human population.
The foodborne protozoan parasites of particular concern with respect to public health include Cryptosporidium spp., Giardia duodenalis, Cyclospora cayetanensis, and Toxoplasma gondii. This chapter outlines the general biology, mechanisms of transmission, and prevalence of infection of Cryptosporidium spp., G. duodenalis, C. cayetanensis, and T. gondii, and focuses on their prevalence in foods, reported foodborne outbreaks, and the methods available for their detection, characterization, and control.
Food service can be defined as comprising those entities responsible for meals prepared outside the home. Included in this definition are restaurants, schools, catering companies, hospital cafeterias, and commercial or institutional kitchens operating within other venues. Food service distributors typically supply foods in bulk to these entities. Consequently, these institutions need to understand and implement appropriate microbiological specifications, such as sampling plans, rapid testing, and interpretation of data. In food service operations, as in the rest of the food industry, the question ‘‘How many samples should I require be taken and tested?’’ is frequently asked and is particularly important in the context of establishing supplier microbiological criteria. Related to that question are deeper considerations, including the following: (i) is the target organism likely to be evenly distributed in my food sample? and (ii) if the organism is evenly distributed, does that mean one sample is enough? Another critical concern relates to interpretation of results, particularly as they relate to retesting, and addresses the questions ‘‘Is retesting an appropriate means to verify the accuracy of my test data? And, if so, when is it appropriate?’’ All of these issues are addressed in this chapter.
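For the ‘‘how many samples’’ question, a first-order answer follows from the binomial model, under the strong assumptions of independent, randomly drawn samples and a test with perfect sensitivity (both are simplifying assumptions, not claims from the chapter):

```python
import math

def detection_probability(n, prevalence):
    """P(at least one positive result) from n independent samples when each
    sample is contaminated with probability `prevalence`; assumes a test
    with perfect sensitivity."""
    return 1 - (1 - prevalence) ** n

def samples_needed(prevalence, confidence=0.95):
    """Smallest n giving at least `confidence` probability of detection."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

print(f"{detection_probability(10, 0.05):.2f}")  # ~0.40: 10 samples usually miss 5% prevalence
print(samples_needed(0.05))  # 59 samples for 95% confidence at 5% prevalence
print(samples_needed(0.01))  # 299 samples at 1% prevalence
```

Uneven (clustered) distribution of the target organism, the usual situation in foods, makes detection harder than this ideal case, which is why one sample is rarely enough even when contamination is present.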
Clostridium perfringens toxins are responsible for a variety of human and veterinary diseases. C. perfringens isolates are commonly classified into one of five types, A to E, based on their ability to produce four so-called lethal toxins: alpha, beta, epsilon, and iota. In addition to the enterotoxigenic nature of certain isolates, the two defining characteristics of the role of C. perfringens in foodborne illness are its high potential growth rate and its ability to produce heat-resistant spores. Spore formation by C. perfringens is an important characteristic because (i) spores are able to survive cooking procedures that eliminate competitors, (ii) it is an essential classification tool, and (iii) enterotoxin formation is associated with sporulation. This chapter covers in situ methods for cpe-positive C. perfringens, detection of the enterotoxin, molecular methods, and the location of the enterotoxin gene in C. perfringens isolates. The application of molecular techniques has had a tremendous impact on the diagnosis and investigation of C. perfringens food poisoning and non-foodborne outbreaks. Reference laboratories are now able to determine rapidly and easily whether isolates have the potential to cause diarrheal disease and, through the use of discriminatory molecular typing methods, to provide evidence that cpe-positive strains from humans and foods have a common origin.
Hepatitis A virus (HAV) outbreaks linked to poor personal hygiene of infected food handlers have been reported in association with highly handled foods such as salads, sandwiches, and bakery products. This chapter lists some of the outbreaks associated with HAV and ready-to-eat (RTE) foods. Typically, detection of viruses in foods requires sequential steps of (i) sampling, (ii) virus concentration, (iii) detection, and (iv) confirmation. Following concentration and purification of HAV from food matrices, detection of the virus can be carried out using (i) mammalian cell culture infectivity assays, (ii) immunological methods, or (iii) nucleic acid-based molecular methods. Confirmation methods are needed during the testing of food and environmental samples in order to eliminate false positives, because nonspecific products can be amplified from the food matrix components themselves. Confirmation methods include (i) direct sequencing of the amplified PCR product, (ii) restriction digestion of the amplified product, (iii) Southern hybridization, and (iv) nested PCR. These are described only briefly, since they are now rarely used; real-time approaches are increasingly used instead.
Proper sample preparation is a precondition for advanced laboratory diagnostics. The newer methods that claim less time to reach a result have a major limitation in the sample volume they can accommodate prior to the rapid analysis for the target molecule. Among the approaches to this problem, signal amplification techniques have received attention. Another key issue is automation, where the key drivers are miniaturization and multiple testing. Quantification by real-time PCR is an established technique, and recent developments in real-time PCR have made it possible to carry out high-throughput source tracing for routine purposes in the food industry. Genomics technologies have reached a stage where they are used on a routine basis in many laboratories. These technologies are expected to move towards culture-independent detection and characterization techniques based on the purification of total DNA from diagnostic samples and subsequent metagenomic analysis. The emphasis of future testing developments should be on the use of metagenomic (gene-based) rather than the phenotypic methods used today. The food safety risk analysis framework, as first described by the Food and Agriculture Organization/World Health Organization, states that it should be the role of official bodies to use risk analysis to determine realistic and achievable risk levels for foodborne hazards. However, with the move towards online testing, it may also be appropriate to shift the risk assessment concept towards online product risk assessments based on the microbiological testing itself.
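As an illustration of the established quantification workflow mentioned above: real-time PCR quantifies targets from a standard curve relating the quantification cycle (Cq) to the log10 of template amount, with amplification efficiency derived from the slope as E = 10^(-1/slope) - 1. The Cq values below are hypothetical, chosen only to show the arithmetic:

```python
import numpy as np

# Hypothetical standard curve: Cq values measured for a 10-fold dilution series.
log10_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])
cq = np.array([15.1, 18.5, 21.9, 25.3, 28.8])

slope, intercept = np.polyfit(log10_copies, cq, 1)
efficiency = 10 ** (-1 / slope) - 1
print(f"slope = {slope:.2f}, efficiency = {efficiency:.0%}")  # ~ -3.42, ~96%

# Quantify an unknown sample from its Cq via the fitted line.
cq_unknown = 23.4
estimated_log10 = (cq_unknown - intercept) / slope
print(f"estimated ~10^{estimated_log10:.2f} target copies per reaction")
```

A slope near -3.32 corresponds to 100% efficiency (perfect doubling each cycle); routine-use assays are typically expected to fall close to that ideal.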