The hidden truth
The Real Global Multiple-Pandemics Cover-Up...
From Farms to Forks
From Latrines to Vaccines!
The COVID Deception 2020
Caused by the negligence of Government, Big Pharma, biopharmas, healthcare providers, and the Industrial and Agricultural Revolutions in creating coronaviruses
From Farms to Forks:
SARS, H1N1, MERS, MRSA, AMR, C-DIFFICILE, etc.
An added bacterial bonus: an excuse to enforce mandatory vaccinations.
Grand Challenges Canada Government funding Bill Gates
~Latrines to Vaccines~
Uncovering the truth
Health Scare by Health care...
Questions that demand answers
There are many underlying effects from the expansion of the industrial revolution since the 1990s that have created negative results within the human body, beginning mainly with pharmaceutical drugs that have trickled into our animal food-processing methods and vegetable growing.
The effects have spread worldwide, creating a costly healthcare pandemic that makes the pharmaceutical companies and private healthcare providers more profitable to the detriment of the human population.
Connecting the dots has never been more important in understanding the new "novel" pandemics claimed by our governments, used to control humanity under a global medical martial law and to forcibly install their one-world government and governance at humanity's expense.
No greater evil of such magnitude has ever been committed to this day: we now have to enter a war, created by our governments, against humanity.
More on this coming soon!
Canadian Government Hidden Agenda:
Cover-up of healthcare negligence, repurposed to create a false pandemic and a New World Order / Global Governance scheme.
- LUNG DISEASES: Pollution – Cloud seeding – Weather modification
- C-DIFFICILE: Hospital borne diseases
- CORONAVIRUSES: Antibiotic-resistant contaminated animal and human feces used in agriculture.
Questions That Demand Answers!
Canadians need to know the truth...
1) As done in the USA, how much were hospitals and doctors paid to claim COVID-19 related deaths?
The government will pay more to hospitals for COVID-19 cases in two senses: by paying an additional 20% on top of traditional Medicare rates for COVID-19 patients during the public health emergency, and by reimbursing hospitals for treating uninsured patients with the disease (at that enhanced Medicare rate).
Both of those provisions stem from the Coronavirus Aid, Relief, and Economic Security Act, or CARES Act.
The CARES Act created the 20% add-on to be paid for Medicare patients with COVID-19. The act further created a $100 billion fund that is being used to financially assist hospitals — a “portion” of which will be “used to reimburse healthcare providers, at Medicare rates, for COVID-related treatment of the uninsured,” according to the U.S. Department of Health and Human Services.
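The 20% add-on described above works as a simple multiplier on the regular Medicare payment. A minimal sketch of the arithmetic (the $10,000 base payment is a hypothetical figure for illustration, not a quoted rate):

```python
# CARES Act add-on: Medicare pays the usual rate plus 20% for COVID-19
# inpatients during the public health emergency.
base_medicare_payment = 10_000.00  # hypothetical DRG payment, illustration only
covid_addon_rate = 0.20            # the 20% add-on created by the CARES Act

total_payment = base_medicare_payment * (1 + covid_addon_rate)
print(f"${total_payment:,.2f}")  # $12,000.00
```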
2) Given antibiotic-resistant diseases, will doctors also automatically prescribe probiotics to patients, at the pharmaceutical companies' expense, as a preventive measure?
3) Given the WHO's new list of antibiotic-resistant and antimicrobial-resistant diseases, will the Canadian federal and provincial governments finally demand more accurate testing of human and animal feces in agriculture and municipal composting?
Health Care-Associated Infections
Health care-associated infections (HAIs) are infections people get while they’re receiving health care for another condition. HAIs can happen in any health care facility, including hospitals, ambulatory surgical centers, end-stage renal disease facilities, and long-term care facilities. Bacteria, fungi, viruses, or other, less common pathogens can cause HAIs.
HAIs are a significant cause of illness and death — and they can have serious emotional, financial, and medical consequences. At any given time, about 1 in 25 inpatients have an infection related to hospital care. These infections lead to tens of thousands of deaths and cost the U.S. health care system billions of dollars each year.
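The "1 in 25 inpatients" point-prevalence figure translates directly into an expected case count for a given census. A quick sketch (the 400-patient census is an assumed number, not from the source):

```python
# HHS point prevalence: about 1 in 25 inpatients has an HAI at any given time.
hai_prevalence = 1 / 25
inpatient_census = 400  # hypothetical hospital census, illustration only

expected_hai_patients = inpatient_census * hai_prevalence
print(f"{expected_hai_patients:.0f} patients")  # 16 patients
```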
The U.S. Department of Health and Human Services (HHS) has identified the reduction of HAIs as an Agency Priority Goal. HHS is committed to reducing the national rate of HAIs.
Connecting the hidden dots about what they are afraid to talk about
Global Pharmaceutical, health care and Governance incompetence...
take a closer look
Possible Relationship between COVID-19, Air Quality and Meteorological Factors?
What if everything is interrelated
to COVID-19's claimed symptoms?
If so, shouldn't the Government
investigate effects on health regularly?
NO2 is a good indicator of vehicle emissions (Ismail et al., 2019). The traffic density substantially decreased in Kuala Lumpur during the MCO, hence improving the air quality, including NO2. Earlier studies have shown that prolonged exposure to NO2 causes a reduction in lung function (van Zoest et al., 2020) and lung inflammation (Jalaludin et al., 2014; Kamaruddin et al., 2019), which ultimately lead to an activation of the immune system. Therefore, there is a possibility that those infected by COVID-19 have lower immunity to defend themselves from viral infections, especially when they encounter a chronic respiratory illness due to an initial poisoning in their bodies.
Our findings on significant positive correlations between air pollutants (PM10, PM2.5, SO2, NO2 and CO) and newly confirmed COVID-19 cases were comparable to the previous studies in China, Italy, and the USA (Pansini and Fornacca, 2020; Setti et al., 2020; Zhu et al., 2020). Zhu et al. (2020) found significant positive associations of PM10, PM2.5, NO2 and CO, with COVID-19 confirmed cases via their generalized additive model. Although we found no significant correlation between O3 and new COVID-19 cases, Zhu et al. (2020) found a positive correlation between O3 and new COVID-19 cases. As for SO2, we found its positive correlation with new COVID-19 cases, but Zhu et al. (2020) reported a contrasting finding of negative correlation with daily COVID-19 cases.
Pansini and Fornacca (2020) compiled air quality information from China, Italy, and the USA. They found significant positive correlations between COVID-19 cases and air quality variables in each country, thus providing preliminary evidence that COVID-19 cases are most often found in highly polluted areas of these countries. They reported that PM2.5, CO and NO2 were positively correlated with COVID-19 cases, which were in line with our findings. Unlike our non-significant positive correlation between O3 and COVID-19 cases, they presented a relatively strong positive correlation between O3 and COVID-19 in Italy, although China and the USA displayed a negative correlation.
Setti et al. (2020) reported that PM10 pollution could be the cause of the accelerated spread of COVID-19 infection in several regions of Northern Italy; they proposed that particulate matter could act as a carrier for the coronavirus. There was a direct relationship between the incidence of COVID-19 and the PM10 concentration, especially in the areas with high PM10 concentration in Northern Italy. Their findings agreed with our findings on the positive correlation between PM10 and COVID-19 cases. Before proceeding to meteorological factors, all these positive connections between air pollutant concentrations (PM10, PM2.5, SO2, NO2 and CO) and COVID-19 cases discussed here also support the theory that reduced intra-city and inter-city movement after the MCO in Malaysia contributed to reducing the infection rate in the country. Moreover, it is evident from these findings that exposure to air pollution over time influences the incidence of COVID-19 cases.
What follows is an account of meteorological factors. The influence of temperature and humidity on the incidence of COVID-19 may be credible in countries with a low percentage of imported cases because there were high occurrences of SARS-CoV-2 being transmitted in the community. Malaysia is such a country, with more cases from local community transmission: 7287 (93.9%) cumulative local cases compared to 475 (6.1%) imported cases as of 30 May 2020 (Ministry of Health Malaysia, 2020a). SARS-CoV-2 is predicted to survive better in drier air, meaning a lower-humidity environment. However, we observed a weak positive correlation between RH and new COVID-19 cases (r = 0.106, p = 0.001), which was in line with a review that pointed out that immune response to viruses is less effective in surroundings with lower humidity (Moriyama et al., 2020). In our study, RH ranged from 64.26% to 82.08% from 1 January–21 April 2020, which was due to the interchange between the hot and monsoon seasons in the country. Our findings demonstrated that high outdoor humidity could promote viral spread; hence, the virus can survive for longer periods in tropical countries like Malaysia, when airborne droplets that contain the virus fall on indoor surfaces. In contrast to our results, an analysis from 166 countries found that an increase of 1% RH markedly lowers the transmission of viruses, with a 0.85% reduction in daily new cases of COVID-19 (95% CI: 0.51%, 1.19%) and a 0.51% reduction in daily new deaths (95% CI: 0.34%, 0.67%) (Wu et al., 2020). Their findings would probably be applicable to countries with winter climates, such as those in Europe and the USA, but not for tropical countries like Malaysia or countries with summer climates, like Australia.
In the case of temperature, we observed a weak negative correlation between AT and daily new cases of COVID-19 (r = -0.118, p < 0.001), which was consistent with many findings that showed high temperature significantly reduced the transmission of SARS-CoV-2 (Şahin, 2020; Wang et al., 2020; Wu et al., 2020). In our study, AT ranged from 27.92°C to 30.08°C from 1 January–21 April 2020, which is typical for a tropical country. Wu et al. (2020) reported that an increase of 1°C of temperature substantially lowers the transmission of SARS-CoV-2: a 1°C increase in temperature significantly decreased daily new COVID-19 cases by 3.08% (95% CI: 1.53%, 4.63%) and daily new deaths by 1.19% (95% CI: 0.44%, 1.95%). Likewise, Wang et al. (2020) reported that the cities in northern China, where temperatures were lower than in the cities along the country's southeast coast, had higher virus transmission rates. The temperature had quite a strong effect on the R-value, with significance levels of 1% for all specifications, and an increase of 1°C in temperature lowers the R-value by 0.0225. On the other hand, researchers from Indonesia, a country with meteorological factors very similar to Malaysia's, presented a contrasting finding (Tosepu et al., 2020). They reported that average temperature had a significant positive correlation with COVID-19 cases (r = 0.392, p < 0.01). With this distinct finding from Indonesia, SARS-CoV-2 could, in fact, survive in high temperatures and continue to infect hosts.
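Wu et al.'s per-degree estimate can be applied multiplicatively to see its implied effect over several degrees of warming. A minimal sketch of that arithmetic (the baseline of 100 daily cases is hypothetical, and this naive compounding is an illustration, not the study's model):

```python
# Applying Wu et al. (2020): each +1 deg C is associated with ~3.08% fewer
# daily new COVID-19 cases. Compounding the effect over several degrees.
baseline_daily_cases = 100.0   # hypothetical baseline, illustration only
reduction_per_deg_c = 0.0308   # 3.08% reduction per +1 deg C

for delta_t in (1, 2, 3):
    expected = baseline_daily_cases * (1 - reduction_per_deg_c) ** delta_t
    print(f"+{delta_t} deg C -> {expected:.1f} expected daily cases")
# +1 deg C -> 96.9, +2 deg C -> 93.9, +3 deg C -> 91.0
```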
Apart from RH and AT, SR is also predicted to affect the rate of SARS-CoV-2 survival. Nevertheless, we found no significant correlation between SR and newly confirmed COVID-19 cases (r = –0.017, p > 0.01), which was in line with findings in China (Yao et al., 2020). They did not support the hypothesis that sunlight can decrease the transmission of COVID-19, although the ultraviolet rays from the sun were acknowledged to help destroy common flu and cold viruses. They conducted a study between early January and early March in all the endemic cities across China and revealed that SARS-CoV-2 transmission was not affected by the variations in temperature or humidity. By contrast, a study in the USA demonstrated that lower ultraviolet rays promote the growth of SARS-CoV-2 (Merow and Urban, 2020).
Another meteorological factor that is important in the transmission of SARS-CoV-2 is WS because it can cause the dispersion of suspended particles that might carry the viruses in the air. A study in Iran found that COVID-19 cases were higher in provinces with low wind speed compared to provinces with high wind speed (Ahmadi et al., 2020). However, our finding on the correlation between WS and newly confirmed COVID-19 cases was not significant (r = –0.059, p > 0.01).
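The r and p values quoted throughout this section are Pearson correlation statistics. A minimal sketch of how the coefficient itself is computed, using made-up daily figures rather than the study's data:

```python
# Pearson correlation between a daily weather variable and daily new cases,
# computed on synthetic illustrative data (not the study's dataset).
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

relative_humidity = [70.1, 72.4, 68.9, 75.3, 80.2, 77.8, 73.5, 69.4]
new_cases = [120, 135, 110, 150, 190, 170, 140, 115]

# This toy data rises together almost linearly, so r comes out near 1.
print(round(pearson_r(relative_humidity, new_cases), 3))
```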
Assessing the direct occupational and public health impacts of solar radiation management with stratospheric aerosols.
Geoengineering is the deliberate large-scale manipulation of environmental processes that affects the Earth’s climate, in an attempt to counteract the effects of climate change. Injecting sulfate aerosol precursors and specially designed nanoparticles into the stratosphere to scatter incoming sunlight (i.e., solar radiation management [SRM]) has been suggested as one approach to geoengineering. Although much is being done to unravel the scientific and technical challenges around geoengineering, there have been few efforts to characterize the potential human health impacts of geoengineering, particularly with regard to SRM approaches involving stratospheric aerosols. This paper explores this information gap. Using available evidence, we describe the potential direct occupational and public health impacts of exposures to aerosols likely to be used for SRM, including environmental sulfates, black carbon, metallic aluminum, and aluminum oxide aerosols. We speculate on possible health impacts of exposure to one promising SRM material, barium titanate, using knowledge of similar nanomaterials. We also explore current regulatory efforts to minimize exposure to these toxicants. Our analysis suggests that adverse public health impacts may reasonably be expected from SRM via deployment of stratospheric aerosols. Little is known about the toxicity of some likely candidate aerosols, and there is no consensus regarding acceptable levels for public exposure to these materials. There is also little infrastructure in place to evaluate potential public health impacts in the event that stratospheric aerosols are deployed for solar radiation management. We offer several recommendations intended to help characterize the potential occupational and public health impacts of SRM, and suggest that a comprehensive risk assessment effort is needed before this approach to geoengineering receives further consideration.
Warming of the climate system is unequivocal, and since the 1950s, human influence on the climate system has become clear [1, 2]. Because human activities have become significant geological forces, the term “anthropocene” has been applied to the current geological epoch, which began in the eighteenth century. The United Nation’s Intergovernmental Panel on Climate Change (IPCC) has forecast that if human activity and world development continue unimpeded, average surface temperatures could rise as much as 4.8 °C by 2100 [1, 2, 4]. The lack of success to date in efforts to reduce greenhouse gas emissions sufficiently has prompted attention to the possibility of counteracting the effects of emissions through the intentional manipulation of global-scale Earth system processes – a process referred to as “geoengineering”.
The concept of geoengineering is not new, and dates back to at least 1965. However, the term geoengineering as applied in its current context was introduced in 1977. Geoengineering approaches include solar radiation management, or SRM, and carbon dioxide removal (CDR). SRM techniques attempt to offset effects of increased greenhouse gas concentrations by reducing the proportion of incoming short wavelength solar radiation that is absorbed or reflected by the earth’s atmosphere (Fig. 1). Proposed SRM techniques include stratospheric aerosols, reflective satellites, whitening of the clouds, whitening of built structures and increasing plant reflectivity (Fig. 2). All SRM deployment techniques require a global approach, since localized deployment will not produce sufficient effects. Importantly, SRM approaches to managing climate change require initial and ongoing addition of aerosols to the atmosphere, with increasingly greater additions as emissions of greenhouse gases (GHGs) rise, given the risk of sudden and potentially catastrophic warming if aerosol levels are not maintained. Proposed CDR approaches include afforestation/reforestation, direct air carbon dioxide (CO2) capture/storage, manufacturing carbonate minerals using silicate rocks and CO2 from the air, accelerated weathering of rocks, ocean alkalinity addition and ocean fertilization (Fig. 2).
This paper will focus on SRM via stratospheric aerosol injection, and will describe potential direct human health impacts. We explore three knowledge gaps: 1) human exposures, 2) human health impacts, and 3) exposure limits. SRM may be expected to result in ecosystem damage and resulting human health effects through indirect mechanisms such as damage to, or contamination of, agricultural products and wildlife. While these effects are important, they are beyond the scope of our paper.
Stratospheric aerosols for use in SRM
The stratosphere is the second major layer of Earth’s atmosphere, lying immediately above the lowest layer (the troposphere) at an altitude of 10–50 km. Within the stratosphere temperatures increase with increasing elevation. The potential for SRM from stratospheric injection of aerosols has been demonstrated by global cooling following large volcanic eruptions.
A wide range of particles could be released into the stratosphere to achieve the SRM objective of scattering sunlight back to space. Sulfates and nanoparticles currently favored for SRM include sulfur dioxide, hydrogen sulfide, carbonyl sulfide, black carbon, and specially engineered discs composed of metallic aluminum, aluminum oxide and barium titanate. In particular, engineered nanoparticles are considered very promising. The particles would utilize photophoretic and electromagnetic forces to self-levitate above the stratosphere. These nanoparticles would remain suspended longer than sulfate particles, would not interfere with stratospheric chemistry, and would not produce acid rain. However, while promising, the self-levitating nanodisc has not been tested to verify efficacy, may increase ocean acidification due to atmospheric CO2 entrapment, has uncharacterized human health and environmental impacts, and may be prohibitively expensive.
Knowledge gap 1: human exposures
Human exposures to materials used for SRM could occur during the manufacture, transportation, deployment and post-deployment of these materials. In this paper, unless otherwise stated, inhalation is the primary route of exposure considered.
Airborne sulfate exposures have been shown to range up to 23 mg/m3 in sulfuric acid plants. Additionally, high exposures to sulfuric acid fumes have also been noted in the petrochemical industry, and high exposures to hydrogen sulfide and carbonyl sulfide have also been noted in natural gas extraction operations [15, 16]. Exposures to black carbon during its manufacture can be quite high. Elevated airborne exposures to aluminum and its oxide have been shown to occur during aluminum refining, smelting and at aluminum powder plants. There appears to be no available documentation of occupational exposure to barium titanate. In addition to manufacturing settings, exposures to SRM materials could occur during deployment, e.g., during cloud seeding operations, as well as from accidents during transportation [19, 20].
Occupational exposures to SRM materials are likely to occur over brief periods (e.g., days to weeks), with the potential for repeated or cyclic exposures. The health effects of such exposures will therefore likely be acute in nature, though repeated exposures create an opportunity for chronic health effects. Occupational exposures may be attenuated through the use of engineering controls such as ventilation, as well as the use of personal protective equipment (PPE) such as respirators and protective suits.
Due to atmospheric circulation and gravitational deposition, large-scale population exposures to atmospherically-injected SRM materials will almost certainly occur after their deployment. Population exposures could also occur through ingestion of food and water contaminated with deposited particles, as well as transdermally [11, 21]. Unlike occupational exposures, there has been virtually no research done to estimate ground-level personal exposures to SRM materials, though the US Environmental Protection Agency (EPA) does provide guidance on methods for evaluating environmental exposures to several possible SRM materials.
Stratospheric injection of sulfur dioxide and black carbon has already been modeled to analyze potential deposition of sulfate and soot [21, 23]. One model estimated that with 1 Tg of black carbon infused into the stratosphere annually, after ten years of geoengineering, the globally averaged mass burden would be approximately 8 × 10−6 kg m−2. The intentional addition of black carbon to the atmosphere will exacerbate adverse health effects already resulting from unintentional release at ground level. In the year 2000, the global emission of black carbon was estimated at 7.6 Tg, and the globally averaged mass burden of black carbon was roughly 1.5 × 10−5 kg m−2. No models appear to have estimated the potential global burden of environmental aluminum, alumina or barium titanate that might result from SRM.
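The "globally averaged mass burden" figures in this passage can be sanity-checked by dividing a total atmospheric mass by the Earth's surface area (about 5.1 × 10¹⁴ m²). A minimal sketch; the ~4 Tg steady-state mass used for the 1 Tg/yr scenario assumes a multi-year stratospheric residence time for illustration and is not a figure from the cited model:

```python
# Sanity-check of globally averaged mass burdens (kg per m^2 of Earth surface).
EARTH_SURFACE_M2 = 5.1e14  # total surface area of the Earth, m^2
TG_TO_KG = 1e9             # 1 teragram = 10^9 kg

def mass_burden(total_mass_tg: float) -> float:
    """Globally averaged column burden, kg/m^2, for a total mass in Tg."""
    return total_mass_tg * TG_TO_KG / EARTH_SURFACE_M2

# Numerically, 7.6 Tg spread over the Earth's surface gives ~1.5e-5 kg/m^2,
# matching the year-2000 black carbon burden quoted above.
print(f"{mass_burden(7.6):.2e} kg/m^2")

# An assumed ~4 Tg aloft (1 Tg/yr with a multi-year residence time) lands
# near the modeled 8e-6 kg/m^2 for the stratospheric-injection scenario.
print(f"{mass_burden(4.0):.2e} kg/m^2")
```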
In contrast to occupational exposures, population exposures to SRM materials will be continuous and prolonged over months to years, but will likely be orders of magnitude lower than those experienced occupationally. Thus the health effects will be primarily chronic in nature. The use of PPE to reduce personal exposures to deposited SRM materials is not feasible on a population scale.
Knowledge gap 2: potential human health impacts
Table 1 summarizes, by bodily system, the potential human health effects of the aerosols that may be used for SRM.
(Table 1 columns, by potential SRM aerosol: sulfuric acid, sulfur dioxide, hydrogen sulfide, carbonyl sulfide, black carbon, aluminum compounds, barium compounds. Legend: X = data suggest health hazard possible; – = insufficient data available.)
Inhalational studies with sulfuric acid aerosol suggest that it has a local irritant effect and no systemic effects. Squamous cell metaplasia in the laryngeal epithelium has been observed in animal studies at exposures as low as 0.3 mg/m3, with more severe metaplasia following exposures of 1.38 mg/m3. Epidemiological studies suggest a relationship between exposure to mists containing sulfuric acid and an increased incidence of laryngeal cancer, and the International Agency for Research on Cancer has concluded that “occupational exposure to strong inorganic mists containing sulfuric acid is carcinogenic for humans” [27, 28].
In humans, and in particular asthmatics, increases in specific airway resistance or decreases in forced expiratory volume or forced expiratory flow are the primary response following acute exposure to sulfur dioxide. Cough, irritation, increased salivation, and erythema of the trachea and main bronchi occurred following controlled exposures to ≤8 ppm for 20 min. Exposures to higher levels (e.g., 40 ppm) can produce a burning sensation in the nose and throat, dyspnea, and severe airway obstruction that may only partially reverse over time. Exposures to even higher levels (e.g., ≤100 ppm) can result in reactive airway dysfunction syndrome, which involves bronchial epithelial damage and increased sensitization and nonspecific hypersensitivity to other irritant stimuli [32, 33]. Deaths can occur following exposures >100 ppm.
Single exposures to hydrogen sulfide can cause health effects in many systems. Hydrogen sulfide has an odor threshold of 0.01 mg/m3, and humans become insensitive to its odor at concentrations of ≥140 mg/m3 [35, 36]. Respiratory symptoms in asthmatic individuals appear at about 2.8 mg/m3, but respiratory distress does not seem to occur below 560 mg/m3. Eye irritation can occur at 5–29 mg/m3, and metabolic abnormalities may occur at 7 mg/m3. Neurological symptoms such as fatigue, loss of appetite, headache, irritability, poor memory and dizziness may result following exposures >28 mg/m3, with death occurring at >700 mg/m3.
Limited information is available on the pharmacokinetics of carbonyl sulfide, which likely metabolizes to carbon dioxide and hydrogen sulfide. Acute exposures result in symptoms similar to those of hydrogen sulfide, but with less local irritation or olfactory warning. Sublethal exposure can result in profuse salivation, headache, vertigo, amnesia, confusion, nausea, vomiting, diarrhea, cardiac arrhythmia, weakness, muscle cramps, and unconsciousness. Concentrations >1000 ppm can cause sudden collapse, convulsions, and death from respiratory paralysis.
Respiratory effects in black carbon workers include cough, sputum production, bronchitis, pneumoconiosis, and decrements in lung function, as well as tiredness, chest pain, headache, and respiratory irritation [24, 44, 45]. Black carbon may cause discoloration of the eyelids and conjunctivae, and is possibly carcinogenic to humans (Group 2B); there is inadequate evidence of carcinogenicity in humans, but sufficient evidence in experimental animals.
Aluminum is never found free in nature, and instead forms metal compounds, complexes, or chelates, including aluminum oxide. Aluminum and aluminum oxide do not appear to differ in toxicity. Wheezing, dyspnea, and impaired lung function, as well as pulmonary fibrosis, have been noted in workers exposed to fine aluminum dust [48–50]. Dilation and hypertrophy of the right side of the heart have been seen in workers exposed to aluminum powder, as have decreased red blood cell hemoglobin and finger clubbing. Helper T-lymphocyte alveolitis and blastic transformation of peripheral blood lymphocytes in the presence of soluble aluminum compounds in vitro were found in an individual exposed to aluminum dust. There is limited evidence of carcinogenicity among workers; the few existing studies have been confounded by concurrent exposures to known carcinogens (e.g., tobacco smoke or polycyclic aromatic hydrocarbons).
Barium titanate is a complex salt containing two metals, which complicates modeling of its toxicological properties. In general, exposures to barium salts are associated with respiratory, cardiovascular, gastrointestinal, musculoskeletal, metabolic and neurologic effects. Barium salts also have a local effect on skin surfaces and would not likely be absorbed systemically to any great extent, though this might not be true of barium salt nanoparticles [53, 54]. Barium titanate could also behave like a titanium salt in interactions with the human body, in which case the resulting health effects are essentially unknown. Only two titanium-containing compounds are indexed by the U.S. Agency for Toxic Substances and Disease Registry (ATSDR) or covered by U.S. exposure limits. It is possible that barium titanate might act both as a salt of barium and titanium, or as neither; the toxicological properties of a nanoparticle are influenced by factors such as particle size, surface area, chemistry or reactivity, solubility, and shape.
Knowledge gap 3: exposure standards and guidelines
Several US agencies and organizations have established occupational exposure limits (OELs) for sulfate, carbon, and some metallic substances. While OELs almost uniformly assume an 8-h daily exposure period, organizations use different assumptions and acceptable excess risk levels when establishing limits. As a result there are a range of OELs for potential SRM materials, which complicates the establishment of “safe” global levels. Additionally, some potential SRM compounds (for example, barium titanate) are currently unregulated and/or have no recognized occupational exposure assessment procedures. All of these issues apply equally to community exposure limits.
The American Conference of Governmental Industrial Hygienists (ACGIH) Threshold Limit Values (TLVs) for the potential SRM materials shown in Table 2 are consistently lower than those required by the U.S. Occupational Safety and Health Administration (OSHA) or recommended by the U.S. National Institute for Occupational Safety and Health (NIOSH) [56, 57]. The TLVs and NIOSH Recommended Exposure Limits (RELs) are intended to protect the typical worker from any adverse health effects without consideration of economic or political feasibility, while the OSHA limits consider technical and economic feasibility and are subsequently less protective [56, 58].
|Substance||U.S. Occupational Safety and Health Administration (mg/m3)a||U.S. National Institute for Occupational Safety and Health (mg/m3)a||American Conference of Governmental Industrial Hygienists (mg/m3)a|
a. Computed from standards specified in parts per million
b. Short-term exposure limit (15 minutes)
c. Ceiling limit
d. 10-minute single period exposure limit
e. Respirable fraction
For public exposures – which would likely be widespread following SRM efforts – the EPA, European Environment Agency (EEA), and World Health Organization specify regulatory standards for ambient air quality (Table 3) [57–59]. Importantly, Table 3 shows a very small sampling of air quality standards in use around the world that relate to potential SRM materials, of which the WHO standards may be considered most generalizable globally. Exposure limits differ substantially between these agencies, but, more importantly, there are currently no limits set by any of these agencies for most of the substances that may be used for SRM [60, 61].
| Substance | U.S. Environmental Protection Agency limit (μg/m3) | Averaging period | European Environment Agency limit (μg/m3) | Averaging period | World Health Organization limit (μg/m3) | Averaging period |
|---|---|---|---|---|---|---|
| Sulfur dioxide | 196.5 | 1 h | 350 | 1 h | 20 | 24 h |
| | | | 125 | 24 h | 500 | 10 min |
| PM2.5 | 12 | 1 year | 25 | 1 year | 10 | 1 year |
| | 35 | 24 h | — | — | 25 | 24 h |
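The footnote to Table 2 and the EPA sulfur dioxide entry in Table 3 rest on the standard conversion between volumetric and gravimetric gas concentrations at 25 °C and 1 atm: μg/m3 = ppb × molecular weight / 24.45. A short sketch reproducing the 196.5 μg/m3 figure from the EPA's 75 ppb 1-hour SO2 standard:

```python
# Convert a gas concentration from ppb to ug/m^3 at 25 deg C and 1 atm.
MOLAR_VOLUME_L = 24.45  # liters per mole of an ideal gas at 25 deg C, 1 atm

def ppb_to_ug_m3(ppb: float, molecular_weight: float) -> float:
    """Gravimetric concentration (ug/m^3) for a volumetric one (ppb)."""
    return ppb * molecular_weight / MOLAR_VOLUME_L

SO2_MW = 64.07  # molecular weight of sulfur dioxide, g/mol
# The EPA 1-hour SO2 standard of 75 ppb converts to the table's 196.5 ug/m^3.
print(round(ppb_to_ug_m3(75, SO2_MW), 1))  # 196.5
```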
The inconsistencies in established exposure limits for both occupational and community settings, combined with the absence of any exposure limits for a number of potential SRM materials, highlight the issues involved in protecting workers and the public from unintended health consequences resulting from SRM deployment. Since employers have legal control over exposures to their workers, OELs can be met through implementation of engineering controls and use of PPE, whereas use of PPE is not feasible at a population level, and reductions in public exposures would have to rely on engineering controls (e.g., use of air cleaning devices) or administrative controls (e.g., behavior changes). The substantial potential exposures and subsequent health impacts associated with SRM efforts based on stratospheric aerosols must be considered further before any attempts are made at SRM.
In order to be effective, SRM efforts involving stratospheric aerosols will require a global effort. Such an action would represent the first truly global and intentionally-produced human exposures, and because the benefits and potential consequences of this action would impact the entire population of the planet to some degree, we make the following initial recommendations:
Geoengineering cost-benefit analyses should consider health impacts of SRM.
At present, most assessments of geoengineering are done within specific and well-defined frameworks of economics, risk, politics, and environmental ethics. Literature on the potential human health impacts of SRM is scant, and such impacts have not been adequately factored into previous cost-benefit analyses. We recommend that subsequent cost-benefit analyses for geoengineering explicitly consider health impacts of SRM. Assessments should further compare the expected health benefits that may result from SRM efforts to potential adverse health outcomes, including (but not limited to) those described here.
Further research is needed on methods of assessment of exposures to, and evaluation of toxicological properties of, potential SRM materials.
We have noted gaps in current scientific knowledge related to occupational and community exposures that would result from SRM, as well as to the toxicological properties of potential SRM materials. Additional laboratory- and field-based research is needed in these areas, particularly with regard to exposure characterization and the spatial and temporal movement of SRM materials from the stratosphere to ground level. While it is difficult to develop exposure and toxicological models representative of a decades- or centuries-long SRM deployment, these efforts are critical to ensure that reasonable, validated models of exposures and human health impacts are available prior to any SRM deployment.
Strict and harmonized global occupational and community exposure limits are needed for SRM materials.
Tables 2 and 3 illustrate the divergence and incompleteness of current occupational and community exposure limits regarding potential SRM materials. Since exposures will inherently be global in nature, exposure limits must be harmonized to ensure that individuals around the world are given equal protection from adverse health effects. Global harmonization of standards related to SRM represents an immense but necessary bureaucratic and scientific challenge, and an important step towards establishing a formal governance framework for geoengineering. A global discussion of standards harmonization relating to SRM may result in other tangible benefits to society, including the potential evolution of a common language and framework for risk assessment and a debate on the strengths and weaknesses of different approaches to risk management.
Reversal mechanisms should be identified prior to any SRM deployment.
In the event that substantial health impacts are noted following deployment of stratospheric aerosol approaches to SRM, mechanisms for capturing the aerosols to halt further ground-level exposures through gravitational deposition will be needed. Therefore, if stratospheric aerosols are pursued as a viable SRM strategy, such mechanisms will need to be identified and evaluated prior to large-scale deployment.
Although there is very little agreement in the scientific community on the approach to SRM-related technologies, SRM has been identified as a potentially technically feasible and possibly cost-effective method of geoengineering to reduce or reverse anthropogenically-driven climate change [1, 62]. But even as much is being done to unravel the scientific and technical challenges around geoengineering, and there is substantial evidence that a host of adverse human health effects will directly result from climate change, very little has been done to describe the potential human health impacts of this emerging disruptive technology. We have described the potential occupational and public health impacts of inadvertent exposure to potential SRM materials, and have also speculated on the possible health impacts of exposure to barium titanate using knowledge of similar nanomaterials.
Based on our analyses, we submit that the current knowledge gaps do not justify deployment of SRM in the short term. We therefore recommend further research, a more inclusive analysis of costs and benefits, as well as the globalization and harmonization of regulatory standards that will limit the negative human health impact of SRM. Only following a comprehensive risk assessment that addresses each of these issues can the potential benefits of this geoengineering approach be weighed against the potential public health burdens created by this technology.
Funding for this study was provided by the University of Michigan MCubed funding program and by the University of Michigan Risk Science Center.
ACGIH: American Conference of Governmental Industrial Hygienists
ATSDR: U.S. Agency for Toxic Substances and Disease Registry
CDR: Carbon Dioxide Removal
EPA: U.S. Environmental Protection Agency
IPCC: Intergovernmental Panel on Climate Change
NIOSH: National Institute for Occupational Safety and Health
OSHA: Occupational Safety and Health Administration
PPE: Personal Protective Equipment
REL: Recommended Exposure Limit
SRM: Solar Radiation Management
TLV: Threshold Limit Value
Possible relationship between COVID-19 multi-symptoms and antimicrobial resistance in the food chain?
Antimicrobial resistance in the food chain
How does the use of antibiotics in food-producing animals lead to antimicrobial resistance in humans?
The high volume of antibiotics used in food-producing animals contributes to the development of antimicrobial-resistant bacteria, particularly in settings of intensive animal production. In some countries, the total amount of antibiotics used in animals is 4 times larger than the amount used in humans. In many countries, much of the antibiotic use in animals is for growth promotion and prevention of disease, not to treat sick animals.
These bacteria can be transmitted from animals to humans via direct contact between animals and humans, or through the food chain and the environment. Antimicrobial-resistant infections in humans can cause longer illnesses, increased frequency of hospitalization, and treatment failures that can result in death. Some types of bacteria that cause serious infections in humans have already developed resistance to most or all of the available treatments and we are running out of treatment options for some types of infection. WHO recommends an overall reduction in use of antibiotics in food-producing animals to help preserve their effectiveness for human medicine.
Why is antimicrobial resistance important for food safety?
Many of the bacteria (such as Salmonella, Campylobacter and Escherichia coli) carried by animals can also cause disease in people. These bacteria, which are frequently antimicrobial-resistant, can contaminate our food supply from farm to fork, such as through slaughtering and processing. Fruits and vegetables may also be contaminated by such bacteria at the farm or later through cross-contamination. We know about this because we can link drug-resistant bacteria isolated from sick people to an agricultural source through DNA fingerprinting.
Over 400 000 people die each year from foodborne diseases, with over one-third of these deaths occurring in children under 5 years of age. The vast majority of foodborne illnesses are caused by microbes, including bacteria, according to WHO estimates. If these bacteria become resistant to antibiotics, it will become impossible to treat them and more people will die from foodborne diseases.
Why is WHO providing advice to the agriculture sector?
WHO is providing advice to the agriculture sector because the most effective way to prevent the transmission of antimicrobial-resistant bacteria from food-producing animals to humans is by preventing the emergence and dissemination of antimicrobial-resistant bacteria in food-producing animals. WHO’s mandate is to build a better, healthier future for people all over the world. This includes protecting people from health threats due to antimicrobial-resistant infections and unsafe food.
Antimicrobial resistance is a major threat to human health. Optimizing the use of antibiotics in both human medicine and animal husbandry will help slow down its emergence and spread. These WHO guidelines were developed to reduce this important public health threat and to preserve the effectiveness of antibiotics important for human health. The recommendations are based on WHO's list of critically important antimicrobials for human medicine, with the goal of safeguarding all antibiotics, especially those that are critically important for treating multidrug-resistant infections in humans.
Who are the important actors that can help implement the recommendations in the Guidelines?
These guidelines are relevant to every country, regardless of region, income and setting. The primary audience of these guidelines is policy makers and regulatory officials overseeing the use of antibiotics in food-producing animals. In addition, veterinarians, food animal organizations, food producers, pharmaceutical companies, animal health and public health officials, physicians and other health providers all have a role to play. Consumers also have a strong influence on the way foods are produced and are driving the market for meat produced without routine use of antibiotics in some countries. For example, Namibia has developed a strong export market for its beef since it introduced a ban on the use of antibiotics for growth promotion.
Read more: WHO Food Safety
Fate of Pirlimycin and Antibiotic‐Resistant Fecal Coliforms in Field Plots Amended with Dairy Manure or Compost during Vegetable Cultivation
First published: 01 May 2018
- PMID: 29864178
- DOI: 10.2134/jeq2017.12.0491
Identification of agricultural practices that mitigate the environmental dissemination of antibiotics is a key need in reducing the prevalence of antibiotic‐resistant bacteria of human health concern. Here, we aimed to compare the effects of crop (lettuce [Lactuca sativa L.] or radish [Raphanus sativus L.]), soil amendment type (inorganic fertilizer, raw dairy manure, composted dairy manure, or no amendment), and prior antibiotic use history (no antibiotics during previous lactation cycles vs. manure mixed from cows administered pirlimycin or cephapirin) of manure‐derived amendments on the incidence of culturable antibiotic‐resistant fecal coliforms in agricultural soils through a controlled field‐plot experiment. Antibiotic‐resistant culturable fecal coliforms were recoverable from soils across all treatments immediately after application, although persistence throughout the experiment varied by antibiotic class and time. The magnitude of observed coliform counts differed by soil amendment type. Compost‐amended soils had the highest levels of cephalosporin‐resistant fecal coliforms, regardless of whether the cows from which the manure was derived were administered antibiotics. Samples from control plots or those treated with inorganic fertilizer trended toward lower counts of resistant coliforms, although these differences were not statistically significant. No statistical differences were observed between soils that grew leafy (lettuce) versus rooted (radish) crops. Only pirlimycin was detectable past amendment application in raw manure‐amended soils, dissipating 12 to 25% by Day 28. Consequently, no quantifiable correlations between coliform count and antibiotic magnitude could be identified. This study demonstrates that antibiotic‐resistant fecal coliforms can become elevated in soils receiving manure‐derived amendments, but that a variety of factors likely contribute to their long‐term persistence under typical field conditions.
- Pirlimycin was only detected in soils amended with raw dairy manure.
- Persistence of ARBs in soil was dependent on antibiotic class and time.
- Compost‐amended soils had the highest levels of fecal coliforms resistant to cephalosporins.
- Antibiotic presence may not be the sole driver of resistance, as ARBs were found in all soils.
Gamma Irradiation Influences the Survival and Regrowth of Antibiotic-Resistant Bacteria and Antibiotic-Resistance Genes on Romaine Lettuce. Front Microbiol. 2019 Apr 9;10:710. doi: 10.3389/fmicb.2019.00710. eCollection 2019. PMID: 31024491
Effects of Dairy Manure-Based Amendments and Soil Texture on Lettuce- and Radish-Associated Microbiota and Resistomes. mSphere. 2019 May 8;4(3):e00239-19. doi: 10.1128/mSphere.00239-19. PMID: 31068435. Free PMC article.
Fecal Indicator Bacteria and Antibiotic Resistance Genes in Storm Runoff from Dairy Manure and Compost-Amended Vegetable Plots. J Environ Qual. 2019 Jul;48(4):1038-1046. doi: 10.2134/jeq2018.12.0441. PMID: 31589689
Effect of composting and soil type on dissipation of veterinary antibiotics in land-applied manures. Chemosphere. 2018 Apr;196:270-279. doi: 10.1016/j.chemosphere.2017.12.161. Epub 2017 Dec 26. PMID: 29306199
Importance of Soil Amendments: Survival of Bacterial Pathogens in Manure and Compost Used as Organic Fertilizers. Microbiol Spectr. 2016 Aug;4(4). doi: 10.1128/microbiolspec.PFS-0010-2015. PMID: 27726763. Review.
Invited review: Fate of antibiotic residues, antibiotic-resistant bacteria, and antibiotic resistance genes in US dairy manure management systems. J Dairy Sci. 2020 Feb;103(2):1051-1071. doi: 10.3168/jds.2019-16778. Epub 2019 Dec 16. PMID: 31837779. Review.
Possible relationship between COVID-19 multi-symptoms and false-positive and false-negative test results, given that the abundance and diversity of resistomes differ between healthy human oral cavities and the gut?
Abundance and diversity of resistomes differ between healthy human oral cavities and gut.
The global threat of antimicrobial resistance has driven the use of high-throughput sequencing techniques to monitor the profile of resistance genes, known as the resistome, in microbial populations. The human oral cavity contains a poorly explored reservoir of these genes. Here we analyse and compare the resistome profiles of 788 oral cavities worldwide with paired stool metagenomes. We find country and body site-specific differences in the prevalence of antimicrobial resistance genes, classes and mechanisms in oral and stool samples. Within individuals, the highest abundances of antimicrobial resistance genes are found in the oral cavity, but the oral cavity contains a lower diversity of resistance genes compared to the gut. Additionally, co-occurrence analysis shows contrasting ARG-species associations between saliva and stool samples. Maintenance and persistence of antimicrobial resistance is likely to vary across different body sites. Thus, we highlight the importance of characterising the resistome across body sites to uncover the antimicrobial resistance potential in the human body.
In recent years, antimicrobial resistance (AMR) has been highlighted as one of the biggest threats to global health, food production and economic development1. Given this rapidly developing global crisis, it is imperative that the current gaps in our understanding of the distribution, spread and associations of all AMR factors are filled. AMR is most often conferred through the expression of antimicrobial resistance genes (ARGs) that reduce a microbe’s susceptibility to the effects of an antimicrobial compound. As such, monitoring the abundance and diversity of these ARG profiles, or the resistome, has huge potential to increase our understanding of the spread and persistence of AMR within a population. High-throughput next-generation sequencing technologies are beginning to be used as tools for screening ARGs for potential surveillance of antimicrobial resistance worldwide. Shotgun metagenomic data mapped against dedicated ARG reference databases are providing a wealth of insight into the resistomes of human2,3,4,5,6,7 and animal guts8,9, as well as the wider environment10,11,12,13. However, no large studies have, to date, attempted to characterise the resistome profiles of the human oral cavity. Commensal microbes from the oral cavity harbouring ARGs have potential to lead to antimicrobial resistant infections at other body sites. For example, β-lactam, clindamycin and erythromycin resistant strains of oral streptococci have caused infections at distal body sites such as infective endocarditis14.
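The screening step described above, mapping shotgun metagenomic reads against a dedicated ARG reference database such as CARD, can be illustrated with a toy k-mer matcher. This is only a sketch of the idea: real pipelines use proper aligners or purpose-built tools, and the gene name and sequences below are invented for illustration.

```python
def kmers(seq, k):
    """All length-k substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def screen_reads(reads, arg_db, k=8, min_shared=3):
    """Report which ARG reference genes share at least `min_shared`
    k-mers with each read; reads with no hits are omitted."""
    db_index = {name: kmers(seq, k) for name, seq in arg_db.items()}
    hits = {}
    for read_id, seq in reads.items():
        read_kmers = kmers(seq, k)
        matched = [name for name, gene_kmers in db_index.items()
                   if len(read_kmers & gene_kmers) >= min_shared]
        if matched:
            hits[read_id] = matched
    return hits

# Invented reference fragment and reads (NOT real CARD entries).
arg_db = {"tetA_frag": "ATGCGGTTACGGATCCTTAGGCA"}
reads = {
    "r1": "TTACGGATCCTTAGG",   # overlaps the reference fragment
    "r2": "CCCCCCCCCCCCCCC",   # shares no k-mers with it
}
print(screen_reads(reads, arg_db))  # → {'r1': ['tetA_frag']}
```

Exact k-mer containment is far cruder than the alignment used in practice, but it captures why a read can be attributed to a resistance gene family directly from sequence data.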
Metagenomic studies of the oral cavity indicate that this site potentially contains a diverse range of ARGs, including those encoding resistance to tetracycline, amoxycillin and gentamicin in saliva and plaque samples15,16. Thus, oral ARGs appear to be natural features of the human oral cavity. The presence of an oral resistome containing aminoglycoside, β-lactam, macrolide, phenicol and tetracycline ARGs in isolated Amerindian communities and ancient humans, indicates that the presence of these genes is not dependent on antibiotic exposure and is an inherent feature of the oral microbiome17,18.
The oral microbial community faces unique ecological pressures, such as mechanical force, nutritional availability, pH levels, oxidative stress and redox potential. Despite these continually changing conditions, these communities have been shown to be relatively stable even after short-term antibiotic exposure. Horizontal gene transfer (HGT) has been documented as an important mechanism for the transfer and acquisition of ARGs within and between oral bacterial species19,20. The erythromycin resistance mefA and mefE genes have been found on the MEGA mobile genetic element associated with Tn916-like conjugative transposons (also called integrative conjugative elements ICE), and this has been implicated in conjugative transfer between viridans group streptococci (VGS) and other streptococci21. Thus, the oral microbiome contains a long-standing and mobile population of ARGs and is a significant reservoir for ARGs to be transferred to pathogenic microbes.
Here, we derive and compare the oral and the gut resistomes from 788 and 386 shotgun metagenomes, respectively, from healthy individuals from China22, Fiji23, the Philippines24, Western Europe25,26,27 and the USA28. We found country-specific differences in the proportion of saliva, dental plaque and stool samples containing ARGs, ARG classes and mechanisms. We made up to 415 comparisons of oral resistomes with paired gut resistomes derived from stool shotgun metagenomes from the same individuals, showing that the oral resistome contains the highest abundances of ARGs but a lower diversity of ARGs than the gut resistome. Overall, these results demonstrate the requirement for wider AMR surveillance studies at different body sites, including the oral cavity, to understand the composition of the resistome across different human microbial habitats.
Country and body site-specific differences in resistomes
To establish the incidence of ARGs in oral as well as stool metagenomes collected from various regions, metagenomes were mapped and quantified against the Comprehensive Antibiotic Resistance Database (CARD)29. Saliva samples were only available from China, Fiji, the Philippines and Western Europe. To account for differences in read depth across data sets, samples were subsampled to the same number of reads across cohorts for absolute ARG incidence measures, and the percentage of saliva samples containing at least one ARG for each class and mechanism was evaluated for each cohort. Saliva samples from China, Fiji, the Philippines and Western Europe contain 20, 14, 23 and 17 ARG classes, respectively (Fig. 1a). The number of ARG classes is most variable in saliva samples from the Philippines, but most of this variability originates from one individual: a farmer from Zambal who has carbapenem, fosfomycin, rifamycin and triclosan ARG classes24. All or almost all saliva samples from every cohort contain cephamycin, fluoroquinolone, lincosamide, macrolide, streptogramin and tetracycline ARGs, and a high percentage (above 50%) of saliva samples from all cohorts contain pleuromutilin ARGs. Unlike most cohorts, all saliva samples from China contain aminoglycoside ARGs, represented by one ARG, APH(3')-Ia, and a high proportion of these samples also contain glycylcycline ARGs, represented by one ARG, tet(A) (Supplementary Fig. 1a). The peptide ARG class is only found in saliva from Chinese and Philippine individuals. Mechanisms of antimicrobial resistance including antibiotic efflux, inactivation, target alteration and target protection are present in all saliva samples across all cohorts (Fig. 1b), whilst the antibiotic target replacement mechanism is found in China, the Philippines and Western Europe, but not in Fiji.
Reduced permeability to antibiotics is only found in saliva from the same farmer in Zambal.
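The subsampling step mentioned above (rarefying every sample to the same read depth before comparing ARG incidence across cohorts) can be sketched in a few lines. This is a hedged illustration: read identifiers and the `ARG` prefix convention are invented, and real pipelines subsample FASTQ files rather than Python lists.

```python
import random

def subsample_reads(reads, depth, seed=0):
    """Randomly subsample one sample's reads to a fixed depth (rarefaction)."""
    if len(reads) < depth:
        raise ValueError("sample is shallower than the target depth")
    return random.Random(seed).sample(reads, depth)

def arg_incidence(samples, depth):
    """Fraction of samples still carrying at least one ARG-mapped read
    after every sample is subsampled to the same depth."""
    positive = sum(
        1 for reads in samples
        if any(r.startswith("ARG") for r in subsample_reads(reads, depth))
    )
    return positive / len(samples)

# Toy cohort: one sample where every read maps to an ARG, one where none do.
samples = [["ARG_tetA"] * 100, ["other"] * 100]
print(arg_incidence(samples, depth=50))  # → 0.5
```

Equalizing depth first matters because a deeply sequenced sample has more chances to detect a rare ARG, which would otherwise inflate its apparent incidence relative to shallower cohorts.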
Possible relationship between COVID-19 multi-symptoms, antimicrobial infections and hospital-borne diseases such as the ongoing 2019 C-DIFFICILE epidemic?
Biggest Threats and Data
2019 AR Threats Report
CDC’s Antibiotic Resistance Threats in the United States, 2019 (2019 AR Threats Report) includes the latest national death and infection estimates that underscore the continued threat of antibiotic resistance in the U.S.
According to the report, more than 2.8 million antibiotic-resistant infections occur in the U.S. each year, and more than 35,000 people die as a result. In addition, 223,900 cases of Clostridioides difficile occurred in 2017 and at least 12,800 people died.
Dedicated prevention and infection control efforts in the U.S. are working to reduce the number of infections and deaths caused by antibiotic-resistant germs, but the number of people facing antibiotic resistance is still too high. More action is needed to fully protect people.
CDC is concerned about rising resistant infections in the community, which can put more people at risk, make spread more difficult to identify and contain, and threaten the progress made to protect patients in healthcare. The emergence and spread of new forms of resistance remains a concern.
The report groups 18 antibiotic-resistant bacteria and fungi into three categories based on level of concern to human health—urgent, serious, and concerning—and highlights:
- Estimated infections and deaths since the 2013 report
- Aggressive actions taken
- Gaps slowing progress
The report also includes a Watch List with three threats that have not spread resistance widely in the U.S. but could become common without a continued aggressive approach.
Global burden of Clostridium difficile infections: a systematic review and meta-analysis
Clostridium difficile is a leading cause of morbidity and mortality in several countries. However, there is limited evidence characterizing its role as a global public health problem. We conducted a systematic review to provide a comprehensive overview of C. difficile infection (CDI) rates.
Seven databases were searched (January 2016) to identify studies and surveillance reports published between 2005 and 2015 reporting CDI incidence rates. CDI incidence rates for health care facility-associated (HCF), hospital onset-health care facility-associated, medical or general intensive care unit (ICU), internal medicine (IM), long-term care facility (LTCF), and community-associated (CA) were extracted and standardized. Meta-analysis was conducted using a random effects model.
229 publications, with data from 41 countries, were included. The overall rate of HCF-CDI was 2.24 (95% confidence interval [CI] = 1.66-3.03) per 1000 admissions/y and 3.54 (95% CI = 3.19-3.92) per 10 000 patient-days/y. Estimated rates for CDI with onset in ICU or IM wards were 11.08 (95% CI = 7.19-17.08) and 10.80 (95% CI = 3.15-37.06) per 1000 admissions/y, respectively. Rates for CA-CDI were lower: 0.55 (95% CI = 0.13-2.37) per 1000 admissions/y. CDI rates were generally higher in North America and among the elderly, but similar rates were identified in other regions and age groups.
Our review highlights the widespread burden of disease of C. difficile, evidence gaps, and the need for sustainable surveillance of CDI in the health care setting and the community.
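The pooled incidence rates above come from a random-effects meta-analysis. A minimal sketch of the standard DerSimonian-Laird approach, pooling log incidence rates across studies, is shown below. The study counts are hypothetical stand-ins, not the data from the review.

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling of study-level effects
    (here, log incidence rates). Returns (pooled, lower95, upper95, tau2)."""
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, tau2

# Hypothetical studies: (CDI cases, admissions). Log rate per 1000 admissions;
# the Poisson variance of a log rate is approximately 1/cases.
studies = [(45, 18000), (120, 61000), (30, 9000)]
effects = [math.log(1000.0 * cases / n) for cases, n in studies]
variances = [1.0 / cases for cases, _ in studies]
pooled, lo, hi, tau2 = dersimonian_laird(effects, variances)
print(f"pooled: {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f}) per 1000 admissions")
```

Working on the log scale keeps the confidence interval positive after back-transformation, which is why published rate CIs such as 2.24 (1.66-3.03) are asymmetric around the point estimate.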
Clostridium difficile is a leading cause of health care-associated infections (HAIs) and an important public health threat. C. difficile has been associated with substantial morbidity and mortality worldwide and among individuals of all ages beyond the traditionally recognized at-risk groups (eg, elderly, hospitalized patients, or those under antimicrobial therapy). In the United States, C. difficile caused an estimated half a million infections and 29 000 deaths in 2012. Almost two-thirds of these were associated with inpatient care, and more than 80% of these deaths occurred in those 65 years and over. Challenges remain in controlling C. difficile infection (CDI) rates in other world regions such as Europe [3–5] and Asia. It is estimated that approximately 40 000 cases among inpatients are potentially underdiagnosed each year in Europe. Furthermore, recurrence of CDI is estimated to occur in a considerable percentage of cases (approximately 20%-30%). Though the global health care costs associated with CDI are not known, these are likely to be substantial, with estimates suggesting attributable costs of US$ 5.4-6.3 billion per year in the United States.
Considering the evolving epidemiology of C. difficile morbidity and mortality, global challenges regarding antibiotic stewardship, and limited alternative preventative options for CDI, it is important to assess its burden to inform public health action. The emergence of the hypervirulent strain PCR ribotype 027/NAP1, significant increases in incidence of hospitalizations associated with C. difficile by the mid-2000s, and outbreaks of CDI in hospitals globally are examples of the major impact C. difficile can have on health care systems [1,9–12]. Through the implementation of standardized surveillance case definitions, a considerable proportion of cases of CDI occurring outside hospitals has also been identified [1,13–15]. Increased awareness of the role of C. difficile as a global health problem is required to reduce morbidity and control rates of CDI [16,17]. However, there is limited evidence characterizing its role as a global public health problem. We aimed to conduct a comprehensive examination of globally reported rates of CDI incidence, in order to develop baseline epidemiological estimates of incidence and identify characteristics and gaps in the available evidence base.
The evolving epidemiology of Clostridium difficile infection in Canadian hospitals during a postepidemic period (2009–2015)
Background: The clinical and molecular epidemiology of health care–associated Clostridium difficile infection in nonepidemic settings across Canada has evolved since the first report of the virulent North American pulsed-field gel electrophoresis type 1 (NAP1) strain more than 15 years ago. The objective of this national, multicentre study was to describe the evolving epidemiology and molecular characteristics of health care–associated C. difficile infection in Canada during a post-NAP1-epidemic period, particularly patient outcomes associated with the NAP1 strain.
Methods: Adult inpatients with C. difficile infection were prospectively identified, using a standard definition, between 2009 and 2015 through the Canadian Nosocomial Infection Surveillance Program (CNISP), a network of 64 acute care hospitals. Patient demographic characteristics, severity of infection and outcomes were reviewed. Molecular testing was performed on isolates, and strain types were analyzed against outcomes and epidemiologic trends.
Results: Over a 7-year period, 20 623 adult patients admitted to hospital with health care–associated C. difficile infection were reported to CNISP, and microbiological data were available for 2690 patients. From 2009 to 2015, the national rate of health care–associated C. difficile infection decreased from 5.9 to 4.3 per 10 000 patient-days. NAP1 remained the dominant strain type, but infection with this strain has significantly decreased over time, followed by an increasing trend of infection with NAP4 and NAP11 strains. The NAP1 strain was significantly associated with a higher rate of death attributable to C. difficile infection compared with non-NAP1 strains (odds ratio 1.91, 95% confidence interval [CI] 1.29–2.82). Isolates were universally susceptible to metronidazole; one was nonsusceptible to vancomycin. The proportion of NAP1 strains within individual centres predicted their rates of health care–associated C. difficile infection; for every 10% increase in the proportion of NAP1 strains, the rate of health care–associated C. difficile infection increased by 3.3% (95% CI 1.7%–4.9%).
Interpretation: Rates of health care–associated C. difficile infection have decreased across Canada. In nonepidemic settings, NAP4 has emerged as a common strain type, but NAP1, although decreasing, continues to be the predominant circulating strain and remains significantly associated with higher attributable mortality.
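Effect estimates such as the odds ratio of 1.91 (95% CI 1.29-2.82) for death attributable to NAP1 infection are derived from a 2x2 table of outcomes by strain type. The sketch below shows the standard calculation with a Woolf-type confidence interval; the counts are hypothetical, not the CNISP data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: deaths vs survivors among NAP1 and non-NAP1 infections.
or_, lo, hi = odds_ratio_ci(a=60, b=940, c=35, d=1045)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Because the standard error is computed on the log scale, the interval is asymmetric around the odds ratio, matching the shape of the intervals reported in the study.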
Clostridium difficile infection is the most common infectious cause of health care–associated diarrhea among hospital-admitted patients in developed countries and can lead to substantial morbidity and mortality.1,2 In 2002, an outbreak in Quebec, Canada, demonstrated the emergence of a virulent strain type known as the North American pulsed-field gel electrophoresis type 1 (NAP1; associated with ribotype 027).3 This strain type was associated with increases in the number of C. difficile outbreaks with higher rates of death and recurrence.4–8 However, the data suggesting that the NAP1 strain is associated with more complicated disease were primarily based on studies related to larger institutional and regional outbreaks.7,9–11 Since other strains have also been found to be associated with complicated outcomes,12,13 the association of NAP1 with severe disease or higher rates of health care–associated C. difficile infection in nonepidemic settings is less clear.
In Canada, national-level data on health care–associated C. difficile infection are collected through the Canadian Nosocomial Infection Surveillance Program (CNISP), a collaborative effort of the Public Health Agency of Canada and sentinel hospitals across the country that participate as members of the Canadian Hospital Epidemiology Committee, a subcommittee of the Association of Medical Microbiology and Infectious Disease Canada. The data collected provide a measure of the burden of illness, establish benchmarks for comparison and identify potential trends of disease. The objectives of this national, multicentre study were to describe the evolving epidemiology and molecular characteristics of health care–associated C. difficile infection in Canada during a postepidemic period, and to examine the effect of NAP1 strain type on patient outcomes and institutional rates of health care–associated C. difficile infection over time.
Read more: https://www.cmaj.ca/content/190/25/E758
Connecting the dots