Epidemiological studies

  • Article type: Journal Article
    This study explores the dynamic transmission of infectious particles due to COVID-19 in the environment using a spatiotemporal epidemiological approach. We propose a novel multi-agent model to simulate the spread of COVID-19 by considering several influencing factors. The model divides the population into susceptible and infected groups and analyzes the impact of different prevention and control measures, such as limiting the number of people and wearing masks, on the spread of COVID-19. The findings suggest that reducing population density and wearing masks can significantly reduce the likelihood of virus transmission. Specifically, the research shows that if the population moves within a fixed range, almost everyone will eventually be infected within 1 h. When the population density is 50%, the infection rate is as high as 96%. If no one wears a mask, nearly 72.33% of people will be infected after 1 h. However, when people wear masks, the infection rate is consistently lower than when they do not. Even if only 25% of people wear masks, the infection rate with masks is 27.67% lower than without masks, which is strong evidence of the importance of wearing a mask. As people's daily activities are mostly carried out indoors, and many COVID-19 super-spreading events also originated from indoor gatherings, research on indoor epidemic prevention and control is essential. This study provides decision-making support for epidemic prevention and control, and the proposed methodology can be used in other regions and for future epidemics.
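    A minimal sketch of the kind of agent-based simulation the abstract describes, assuming a single room, random movement, a fixed infection radius, and a per-contact transmission probability that masks reduce; all parameter values (room size, radius, probabilities, mask effect) are illustrative assumptions, not the study's actual model.

```python
# Illustrative agent-based sketch (not the authors' code): susceptible agents
# move randomly in a fixed room and may be infected by nearby infected agents;
# masks scale down the per-contact transmission probability.
import math
import random

def simulate(n_agents=100, mask_fraction=0.25, steps=60, room=20.0,
             infect_radius=1.5, p_transmit=0.8, mask_factor=0.3, seed=1):
    """Return the fraction of agents infected after `steps` time steps."""
    rng = random.Random(seed)
    x = [rng.uniform(0.0, room) for _ in range(n_agents)]
    y = [rng.uniform(0.0, room) for _ in range(n_agents)]
    masked = [rng.random() < mask_fraction for _ in range(n_agents)]
    infected = [i == 0 for i in range(n_agents)]         # one index case

    for _ in range(steps):
        # each agent takes a small random step, staying inside the room
        for i in range(n_agents):
            x[i] = min(room, max(0.0, x[i] + rng.uniform(-1.0, 1.0)))
            y[i] = min(room, max(0.0, y[i] + rng.uniform(-1.0, 1.0)))
        # susceptible agents close to an infected agent may become infected
        newly = infected[:]
        for i in range(n_agents):
            if infected[i]:
                continue
            for j in range(n_agents):
                if infected[j] and math.hypot(x[i] - x[j], y[i] - y[j]) <= infect_radius:
                    p = p_transmit
                    if masked[i]:
                        p *= mask_factor                  # mask on the susceptible agent
                    if masked[j]:
                        p *= mask_factor                  # mask on the infected agent
                    if rng.random() < p:
                        newly[i] = True
                        break
        infected = newly
    return sum(infected) / n_agents

# Compare infection rates without masks and with 25% mask uptake
print(simulate(mask_fraction=0.0), simulate(mask_fraction=0.25))
```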

  • Article type: Journal Article
    Application of remote sensing-based metrics of exposure to vegetation in epidemiological studies of residential greenness is typically limited to several standard products. The Normalized Difference Vegetation Index (NDVI) is the most widely used, but its precision varies with vegetation density and soil color/moisture. In areas with heterogeneous vegetation cover, the Soil-adjusted Vegetation Index (SAVI) corrects for soil brightness. Linear Spectral Unmixing (LSU) measures the relative contribution of different land covers and estimates the percentage of each over a unit area. We compared the precision of NDVI, SAVI, and LSU for quantifying residential greenness in areas with high spatial heterogeneity in vegetation cover.
    NDVI, SAVI, and LSU in a 300 m radius surrounding homes of 3,188 cardiac patients living in Israel (Eastern Mediterranean) were derived from Landsat 30 m spatial resolution imagery. Metrics were compared to assess shifts in exposure quartiles and differences in vegetation detection as a function of overall greenness, climatic zones, and population density, using NDVI as the reference method.
    For the entire population, the dispersion (SD) of the vegetation values detected was 60% higher when greenness was measured using LSU compared to NDVI: mean (SD) NDVI: 0.17 (0.05), LSU (%): 0.23 (0.08), SAVI: 0.12 (0.03). Importantly, with an increase in population density, the sensitivity of LSU compared to NDVI doubled: there was a 95% difference between the LSU and NDVI interquartile ranges in the highest population density quartile vs. 47% in the lowest quartile. Compared to NDVI, exposures estimated by LSU resulted in 21% of patients changing exposure quartiles. In urban areas, the shift in exposure quartile depended on land cover characteristics: an upward shift occurred in dense urban areas, while no shift occurred in urban areas with either high or low vegetation cover.
    LSU was shown to outperform the commonly used NDVI in terms of accuracy and variability, especially in dense urban areas. Therefore, LSU potentially improves exposure assessment precision, implying reduced exposure misclassification.
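    For reference, the three metrics compared above can be computed from red and near-infrared reflectance as in the sketch below, using their standard textbook formulas; the reflectance values and the soil/vegetation endmember spectra are made up for illustration and are not taken from the study.

```python
# Illustrative computation of NDVI, SAVI, and a simple two-endmember LSU on
# made-up Landsat surface-reflectance values (assumed, not the study's data).
import numpy as np

red = np.array([0.10, 0.08, 0.05])      # hypothetical red reflectance per pixel
nir = np.array([0.25, 0.30, 0.40])      # hypothetical near-infrared reflectance

# NDVI: normalized difference of NIR and red
ndvi = (nir - red) / (nir + red)

# SAVI: soil-adjusted variant with canopy background adjustment factor L
L = 0.5
savi = (1 + L) * (nir - red) / (nir + red + L)

# LSU: model each pixel spectrum as a mixture of endmembers and solve for the
# per-pixel fractions by least squares (here: soil and vegetation only).
endmembers = np.array([[0.20, 0.25],    # red reflectance of [soil, vegetation]
                       [0.22, 0.45]])   # NIR reflectance of [soil, vegetation]
pixels = np.vstack([red, nir])          # 2 bands x 3 pixels
fractions, *_ = np.linalg.lstsq(endmembers, pixels, rcond=None)
vegetation_fraction = fractions[1]      # second endmember = vegetation fraction

print(ndvi, savi, vegetation_fraction)
```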

  • Article type: Journal Article
    Administrative data are increasingly used in healthcare research. However, in order to avoid biases, their use requires careful study planning. This paper describes the methodological principles and criteria used in a study on the epidemiology, outcomes, and process of care of patients hospitalized for heart failure (HF) in the largest Italian Region, from 2000 to 2012.
    Data were extracted from the administrative data warehouse of the healthcare system of Lombardy, Italy. Hospital discharge forms with HF-related diagnosis codes were the basis for identifying HF hospitalizations as clinical events, or episodes. In patients experiencing at least one HF event, hospitalizations for any cause, outpatient services utilization, and drug prescriptions were also analyzed.
    A total of 701,701 heart failure events involving 371,766 patients were recorded from 2000 to 2012. Once all the healthcare services provided to these patients after the first HF event had been joined together, the study database totalled about 91 million records. The principles, criteria, and tips used to minimize errors and characterize some relevant subgroups are described.
    The methodology of this study could represent the basis for future research and could be applied in similar studies concerning epidemiology, trend analysis, and healthcare resources utilization.
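    A minimal sketch of the linkage step described above, under the assumption of hypothetical column names and ICD-9-CM 428.x codes for HF: find each patient's first HF discharge (the index event) and keep only the healthcare contacts recorded on or after that date.

```python
# Toy example of the record-linkage step (hypothetical column names and codes,
# not the Lombardy data warehouse schema).
import pandas as pd

discharges = pd.DataFrame({
    "patient_id": [1, 1, 2],
    "admission_date": pd.to_datetime(["2001-03-02", "2004-07-19", "2003-05-11"]),
    "diagnosis_code": ["428.0", "428.9", "428.1"],       # assumed HF-related codes
})
services = pd.DataFrame({
    "patient_id": [1, 1, 2, 2],
    "service_date": pd.to_datetime(["2000-12-01", "2002-01-15", "2002-01-01", "2003-06-01"]),
    "service_type": ["outpatient", "drug", "outpatient", "drug"],
})

# Index event: the first HF hospitalization per patient
hf = discharges[discharges["diagnosis_code"].str.startswith("428")]
index_event = (hf.groupby("patient_id", as_index=False)["admission_date"]
                 .min()
                 .rename(columns={"admission_date": "index_date"}))

# Keep only services provided on or after the index event
linked = services.merge(index_event, on="patient_id")
follow_up = linked[linked["service_date"] >= linked["index_date"]]
print(follow_up)
```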

  • Article type: Journal Article
    BACKGROUND: We addressed whether immaturity relative to peers, reflected in birth month, increases the likelihood of ADHD diagnosis and treatment.
    METHODS: We linked the nationwide Patient and Prescribed Drug Registers and used prospective cohort and nested case-control designs to study individuals aged 6-69 years in Sweden from July 2005 to December 2009 (Cohort 1). Cohort 1 included 56,263 individuals diagnosed with ADHD or who had ever used prescribed ADHD-specific medication. Complementary population-representative cohorts provided DSM-IV ADHD symptom ratings: parent-reported for 10,760 9-year-old twins born 1995-2000 from the CATSS study (Cohort 2) and self-reported for 6,970 adult twins aged 20-47 years born 1959-1970 from the STAGE study (Cohort 3). We calculated odds ratios (ORs) for ADHD across age for individuals born in November/December compared to January/February (Cohort 1). ADHD symptoms in Cohorts 2 and 3 were studied as a function of calendar birth month.
    RESULTS: ADHD diagnoses and medication treatment were both significantly more common in individuals born in November/December versus January/February, peaking at ages 6 (OR: 1.8; 95% CI: 1.5-2.2) and 7 years (OR: 1.6; 95% CI: 1.3-1.8) in the Patient and Prescribed Drug Registers, respectively. We found no corresponding differences in parent- or self-reported ADHD symptoms by calendar birth month.
    CONCLUSIONS: Relative immaturity compared to classmates might contribute to ADHD diagnosis and pharmacotherapy, despite the absence of parallel findings in reported ADHD symptom loads by relative immaturity. Increased clinical awareness of this phenomenon may be warranted to decrease the risk of imprecise diagnostics and treatment. We speculate that flexibility regarding age at school start according to individual maturity could reduce developmentally inappropriate demands on children and improve the precision of ADHD diagnostic practice and pharmacological treatment.
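    For readers unfamiliar with the measure, the odds ratio reported above compares the odds of ADHD diagnosis between the November/December and January/February birth groups; the sketch below uses made-up counts, not the Swedish register data.

```python
# Worked odds-ratio example with a Wald 95% confidence interval.
# Exposure = born in November/December, outcome = ADHD diagnosis.
# All counts are hypothetical and for illustration only.
import math

a, b = 180, 9820     # Nov/Dec births: ADHD cases, non-cases (hypothetical)
c, d = 100, 9900     # Jan/Feb births: ADHD cases, non-cases (hypothetical)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```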

  • Article type: Journal Article
    There is major concern about coumarins interacting with various drug classes and increasing the risk of overanticoagulation. The aim of the study was to assess bleeding risk in patients with concurrent use of antibiotics and phenprocoumon, the most widely prescribed coumarin in many European countries. We conducted a nested case-control study within a cohort of 513,338 incident and continuous phenprocoumon users ≥ 18 years of age using claims data of the statutory health insurance company AOK, covering 30% of the German population. Bleeding risk associated with current use of antibiotics for systemic use (antibacterials/antimycotics) was calculated using conditional logistic regression in 13,785 cases with a bleeding event and 55,140 risk-set sampling-matched controls. Bleeding risk associated with any antibacterial use in phenprocoumon users was significantly increased [odds ratio (OR) 2.37, 95% confidence interval (CI) 2.20-2.56]. The association was stronger for gastrointestinal than for cerebral bleeding (OR 2.09, 95% CI 1.84-2.38 and OR 1.34, 95% CI 1.03-1.74, respectively) and highest for other/unspecified bleeding (OR 2.92, 95% CI 2.62-3.26). Specific antibiotic classes were strongly associated with bleeding risk, e.g. cotrimoxazole (OR 3.86, 95% CI 3.08-4.84) and fluoroquinolones (OR 3.13, 95% CI 2.74-3.59), with the highest risk among these for ofloxacin (OR 5.00, 95% CI 3.01-8.32). Combined use of phenprocoumon and antimycotics was not significantly associated with bleeding risk. Risk was not significantly modified by age (p_interaction = 0.25) or sex (p_interaction = 0.96). The association was stronger the closer the antibiotic exposure was to the bleeding event. Among continuous phenprocoumon users, antibiotics, particularly quinolones and cotrimoxazole, should be prescribed only after careful consideration due to an increased bleeding risk. Close monitoring of international normalised ratio levels after prescription is recommended.
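    A hedged sketch of the analytic design described above: conditional logistic regression of bleeding status on current antibiotic exposure within matched risk sets. The tiny dataset and variable names are assumptions for illustration; the real analysis used the AOK claims data and additional covariates.

```python
# Toy conditional logistic regression for a matched nested case-control design.
# Data, group sizes, and the single exposure variable are illustrative only.
import numpy as np
from statsmodels.discrete.conditional_models import ConditionalLogit

# Each matched set (1 case + 2 controls) shares a group id.
group      = np.repeat([1, 2, 3, 4, 5, 6], 3)
bleeding   = np.tile([1, 0, 0], 6)                      # 1 = case, 0 = matched control
antibiotic = np.array([1, 0, 0,    # set 1
                       1, 1, 0,    # set 2
                       0, 1, 0,    # set 3
                       1, 0, 0,    # set 4
                       0, 0, 0,    # set 5 (exposure-concordant, contributes nothing)
                       1, 1, 1])   # set 6 (exposure-concordant, contributes nothing)

model = ConditionalLogit(bleeding, antibiotic[:, None], groups=group)
result = model.fit()
print("OR for current antibiotic use:", float(np.exp(result.params[0])))
```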
