Computing Methodologies

  • Article type: Journal Article
    The twin pandemics of COVID-19 and structural racism brought into focus health disparities and disproportionate impacts of disease on communities of color. Health equity has subsequently emerged as a priority. Recognizing that the future of health care will be informed by advanced information technologies including artificial intelligence (AI), machine learning, and algorithmic applications, the authors argue that to advance towards states of improved health equity, health information professionals need to engage in and encourage the conduct of research at the intersections of health equity, health disparities, and computational biomedical knowledge (CBK) applications. Recommendations are provided as a means of engaging in this mobilization effort.

  • Article type: Journal Article
    OBJECTIVE: To introduce quantum computing technologies as a tool for biomedical research and highlight future applications within healthcare, focusing on its capabilities, benefits, and limitations.
    BACKGROUND: Investigators seeking to explore quantum computing and create quantum-based applications for healthcare and biomedical research.
    METHODS: Quantum computing requires specialized hardware, known as quantum processing units, that use quantum bits (qubits) instead of classical bits to perform computations. This article will cover (1) proposed applications where quantum computing offers advantages to classical computing in biomedicine; (2) an introduction to how quantum computers operate, tailored for biomedical researchers; (3) recent progress that has expanded access to quantum computing; and (4) challenges, opportunities, and proposed solutions to integrate quantum computing in biomedical applications.
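    For readers new to the notation, the bit-versus-qubit distinction described above can be illustrated with a small state-vector simulation. This is plain linear algebra on a classical machine, not quantum hardware, and is only a sketch of the concept:

```python
import numpy as np

# A classical bit is either 0 or 1; a qubit is a unit vector a|0> + b|1>.
ket0 = np.array([1.0, 0.0])

# The Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

state = H @ ket0

# Measurement probabilities are the squared amplitudes (Born rule).
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5]
```

Unlike a classical bit, the qubit's state before measurement carries both amplitudes at once, which is what quantum algorithms exploit.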

  • Article type: Journal Article
    Chicken behavior recognition is crucial for a number of reasons, including promoting animal welfare, ensuring the early detection of health issues, optimizing farm management practices, and contributing to more sustainable and ethical poultry farming. In this paper, we introduce a technique for recognizing chicken behavior on edge computing devices based on video sensing mosaicing. Our method combines video sensing mosaicing with deep learning to accurately identify specific chicken behaviors from videos. It attains remarkable accuracy, achieving 79.61% with MobileNetV2 for chickens demonstrating three types of behavior. These findings underscore the efficacy and promise of our approach in chicken behavior recognition on edge computing devices, making it adaptable for diverse applications. The ongoing exploration and identification of various behavioral patterns will contribute to a more comprehensive understanding of chicken behavior, enhancing the scope and accuracy of behavior analysis within diverse contexts.
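    The abstract does not detail the mosaicing procedure itself. As a hypothetical sketch of the general idea, tiling consecutive video frames into one composite image lets a still-image classifier such as MobileNetV2 see short-range temporal context; the grid layout and frame shapes below are illustrative assumptions:

```python
import numpy as np

def mosaic_frames(frames, grid=(2, 2)):
    """Tile video frames into a single mosaic image.

    frames: list of H x W x C arrays, all the same shape.
    grid:   (rows, cols) layout of the mosaic.
    """
    rows, cols = grid
    assert len(frames) == rows * cols
    # Concatenate each row of frames horizontally, then stack rows vertically.
    row_strips = [np.concatenate(frames[r * cols:(r + 1) * cols], axis=1)
                  for r in range(rows)]
    return np.concatenate(row_strips, axis=0)

# Four toy 4x4 RGB "frames", each filled with its index for visibility.
frames = [np.full((4, 4, 3), i, dtype=np.uint8) for i in range(4)]
m = mosaic_frames(frames)
print(m.shape)  # (8, 8, 3)
```

The resulting mosaic can then be resized and fed to an image classifier exactly like a single photograph.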

  • Article type: Journal Article
    BACKGROUND: The coronavirus disease 2019 (COVID-19) pandemic continues to pose significant challenges to the public health sector, including that of the United Arab Emirates (UAE). The objective of this study was to assess the efficiency and accuracy of various deep-learning models in forecasting COVID-19 cases within the UAE, thereby aiding the nation's public health authorities in informed decision-making.
    METHODS: This study utilized a comprehensive dataset encompassing confirmed COVID-19 cases, demographic statistics, and socioeconomic indicators. Several advanced deep learning models, including long short-term memory (LSTM), bidirectional LSTM, convolutional neural network (CNN), CNN-LSTM, multilayer perceptron, and recurrent neural network (RNN) models, were trained and evaluated. Bayesian optimization was also implemented to fine-tune these models.
    RESULTS: The evaluation framework revealed that each model exhibited different levels of predictive accuracy and precision. Specifically, the RNN model outperformed the other architectures even without optimization. Comprehensive predictive and perspective analytics were conducted to scrutinize the COVID-19 dataset.
    CONCLUSIONS: This study transcends academic boundaries by offering critical insights that enable public health authorities in the UAE to deploy targeted data-driven interventions. The RNN model, which was identified as the most reliable and accurate for this specific context, can significantly influence public health decisions. Moreover, the broader implications of this research validate the capability of deep learning techniques in handling complex datasets, thus offering the transformative potential for predictive accuracy in the public health and healthcare sectors.
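    A data-preparation step common to all of the sequence models listed above (LSTM, bidirectional LSTM, CNN-LSTM, RNN) is framing a daily case-count series as supervised lookback windows. A minimal sketch with toy numbers, not the study's data:

```python
import numpy as np

def make_windows(series, lookback):
    """Turn a 1-D case-count series into supervised (X, y) pairs:
    each sample is `lookback` past values; the target is the next value."""
    X = np.array([series[i:i + lookback]
                  for i in range(len(series) - lookback)])
    y = np.array(series[lookback:])
    return X, y

daily_cases = [10, 12, 15, 14, 18, 21, 25]  # illustrative counts only
X, y = make_windows(daily_cases, lookback=3)
print(X.shape, y.shape)  # (4, 3) (4,)
```

Each row of `X` is then fed to the recurrent model, which learns to predict the corresponding entry of `y`.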

  • Article type: Journal Article
    BACKGROUND: Biomarker discovery is a challenging task due to the massive search space. Quantum computing and quantum Artificial Intelligence (quantum AI) can be used to address the computational problem of biomarker discovery from genetic data.
    METHODS: We propose a Quantum Neural Networks architecture to discover genetic biomarkers for input activation pathways. The Maximum Relevance-Minimum Redundancy criteria score biomarker candidate sets. Our proposed model is economical since the neural solution can be delivered on constrained hardware.
    RESULTS: We demonstrate the proof of concept on four activation pathways associated with CTLA4, including (1) CTLA4-activation stand-alone, (2) CTLA4-CD8A-CD8B co-activation, (3) CTLA4-CD2 co-activation, and (4) CTLA4-CD2-CD48-CD53-CD58-CD84 co-activation.
    CONCLUSIONS: The model indicates new genetic biomarkers associated with the mutational activation of CTLA4-associated pathways, including 20 genes: CLIC4, CPE, ETS2, FAM107A, GPR116, HYOU1, LCN2, MACF1, MT1G, NAPA, NDUFS5, PAK1, PFN1, PGAP3, PPM1G, PSMD8, RNF213, SLC25A3, UBA1, and WLS. We open source the implementation at: https://github.com/namnguyen0510/Biomarker-Discovery-with-Quantum-Neural-Networks.
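    The Maximum-Relevance Minimum-Redundancy scoring mentioned in the methods has a simple classical skeleton. The sketch below is a hypothetical greedy mRMR variant that substitutes absolute Pearson correlation for mutual information (a common simplification); it does not reproduce the paper's quantum formulation, and the toy data are illustrative:

```python
import numpy as np

def mrmr_select(X, y, k):
    """Greedy max-relevance min-redundancy feature selection, using
    |Pearson r| in place of mutual information (a simplification)."""
    n_features = X.shape[1]
    # Relevance: correlation of each feature with the target.
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                          for j in range(n_features)])
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        scores = np.full(n_features, -np.inf)
        for j in range(n_features):
            if j in selected:
                continue
            # Redundancy: mean correlation with already-selected features.
            redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                  for s in selected])
            scores[j] = relevance[j] - redundancy
        selected.append(int(np.argmax(scores)))
    return selected

# Toy data: feature 1 duplicates feature 0; feature 2 carries new signal.
rng = np.random.default_rng(0)
s1, s2 = rng.normal(size=200), rng.normal(size=200)
y = s1 + s2
X = np.column_stack([s1, s1, s2])
picked = mrmr_select(X, y, k=2)
print(sorted(picked))  # [0, 2]
```

The redundant duplicate (feature 1) is skipped even though it is just as relevant as feature 0, which is the behavior mRMR is designed to produce.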

  • Article type: Journal Article
    Pauli channels are fundamental in the context of quantum computing as they model the simplest kind of noise in quantum devices. We propose a quantum algorithm for simulating Pauli channels and extend it to encompass Pauli dynamical maps (parametrized Pauli channels). A parametrized quantum circuit is employed to accommodate dynamical maps. We also establish the mathematical conditions for an N-qubit transformation to be achievable using a parametrized circuit where only one single-qubit operation depends on the parameter. The implementation of the proposed circuit is demonstrated using IBM's quantum computers for the case of one qubit, and the fidelity of this implementation is reported.
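    The channel being simulated has a compact classical description. As a sketch (ordinary density-matrix arithmetic, not the paper's quantum circuit), a single-qubit Pauli channel maps rho to (1-px-py-pz) rho + px X rho X + py Y rho Y + pz Z rho Z:

```python
import numpy as np

# Single-qubit Pauli operators.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def pauli_channel(rho, p_x, p_y, p_z):
    """Apply a single-qubit Pauli channel to density matrix rho."""
    p_i = 1.0 - p_x - p_y - p_z
    return (p_i * rho
            + p_x * X @ rho @ X
            + p_y * Y @ rho @ Y
            + p_z * Z @ rho @ Z)

# Start in |+><+| and apply pure dephasing (Z errors only):
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())
out = pauli_channel(rho, 0.0, 0.0, 0.25)
print(out.real)  # off-diagonal coherences shrink from 0.5 to 0.25
```

The populations (diagonal) are untouched while the coherences decay, which is the signature of Pauli-Z noise.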

  • Article type: Review
    Bioinformatics has revolutionized biology and medicine by using computational methods to analyze and interpret biological data. Quantum mechanics has recently emerged as a promising tool for the analysis of biological systems, leading to the development of quantum bioinformatics. This new field employs the principles of quantum mechanics, quantum algorithms, and quantum computing to solve complex problems in molecular biology, drug design, and protein folding. However, the intersection of bioinformatics, biology, and quantum mechanics presents unique challenges. One significant challenge is the possibility of confusion among scientists between quantum bioinformatics and quantum biology, which have similar goals and concepts. Additionally, the diverse calculations in each field make it difficult to establish boundaries and to distinguish purely quantum effects from other factors that may affect biological processes. This review provides an overview of the concepts of quantum biology and quantum mechanics and their intersection in quantum bioinformatics. We examine the challenges and unique features of this field and propose a classification of quantum bioinformatics to promote interdisciplinary collaboration and accelerate progress. By unlocking the full potential of quantum bioinformatics, this review aims to contribute to our understanding of quantum mechanics in biological systems.

  • Article type: Journal Article
    Over the last ten years, the discovery of topological materials has opened up new areas in condensed matter physics. These materials are noted for their distinctive electronic properties, unlike conventional insulators and metals. This discovery has not only spurred new research areas but also offered innovative approaches to electronic device design. A key aspect of these materials is that transforming them into nanostructures enhances the presence of surface or edge states, the key components of their unique electronic properties. In this review, we focus on recent synthesis methods, including vapor-liquid-solid (VLS) growth, chemical vapor deposition (CVD), and chemical conversion techniques. Moreover, the scaling down of topological nanomaterials has revealed new electronic and magnetic properties due to quantum confinement. This review covers their synthesis methods and the outcomes of topological nanomaterials and applications, including quantum computing, spintronics, and interconnects. Finally, we address the materials and synthesis challenges that need to be resolved prior to the practical application of topological nanomaterials in advanced electronic devices.

  • Article type: Journal Article
    To simulate whole brain dynamics with only a few equations, biophysical, mesoscopic models of local neuron populations can be connected using empirical tractography data. The development of mesoscopic mean-field models of neural populations, in particular, the Adaptive Exponential (AdEx) mean-field model, has successfully summarized neuron-scale phenomena leading to the emergence of global brain dynamics associated with conscious (asynchronous and rapid dynamics) and unconscious (synchronized slow-waves, with Up-and-Down state dynamics) brain states, based on biophysical mechanisms operating at cellular scales (e.g., neuromodulatory regulation of spike-frequency adaptation during sleep-wake cycles or anesthesia). Using the Virtual Brain (TVB) environment to connect mean-field AdEx models, we have previously simulated the general properties of brain states by varying spike-frequency adaptation, but have not yet performed detailed analyses of other parameters that may also regulate transitions in brain-scale dynamics between different brain states. We performed a dense grid parameter exploration of the TVB-AdEx model, making use of High Performance Computing. We report a remarkable robustness of the effect of adaptation to induce synchronized slow-wave activity. Moreover, the occurrence of slow waves is often paralleled with a closer relation between functional and structural connectivity. We find that hyperpolarization can also generate unconscious-like synchronized Up and Down states, which may be a mechanism underlying the action of anesthetics. We conclude that the TVB-AdEx model reveals large-scale properties identified experimentally in sleep and anesthesia.
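    Structurally, a dense grid parameter exploration is a Cartesian product of parameter values, with one simulation per grid point. The skeleton below is a hypothetical sketch: the parameter names (`b` for adaptation strength, `E_L` for resting potential) are assumptions for illustration, and `run_simulation` is a placeholder where a real TVB-AdEx run (typically dispatched to an HPC job array) would go:

```python
import itertools

def run_simulation(b, E_L):
    """Placeholder for one TVB-AdEx simulation. Returns a toy score so
    the sweep structure is runnable; a real run would launch the
    simulator and measure, e.g., slow-wave synchrony."""
    return b * 0.1 - E_L * 0.01  # illustrative only

# Hypothetical dense grid over adaptation (b) and resting potential (E_L).
b_values = [0, 20, 40, 60]
E_L_values = [-80, -70, -60]

results = {(b, E_L): run_simulation(b, E_L)
           for b, E_L in itertools.product(b_values, E_L_values)}
print(len(results))  # 12
```

On a cluster, each `(b, E_L)` pair would typically map to one independent job, since grid points share no state.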

  • Article type: Journal Article
    Advances in genomics and sequencing technologies demand faster and more scalable analysis methods that can process longer sequences with higher accuracy. However, classical pairwise alignment methods, based on dynamic programming (DP), impose impractical computational requirements to align long and noisy sequences like those produced by PacBio and Nanopore technologies. The recently proposed wavefront alignment (WFA) algorithm paves the way for more efficient alignment tools, improving time and memory complexity over previous methods. However, high-performance computing (HPC) platforms require efficient parallel algorithms and tools to exploit the computing resources available on modern accelerator-based architectures.
    This paper presents WFA-GPU, a GPU (graphics processing unit)-accelerated tool to compute exact gap-affine alignments based on the WFA algorithm. We present the algorithmic adaptations and performance optimizations that allow exploiting the massively parallel capabilities of modern GPU devices to accelerate the alignment computations. In particular, we propose a CPU-GPU co-design capable of performing inter-sequence and intra-sequence parallel sequence alignment, combining a succinct WFA-data representation with an efficient GPU implementation. As a result, we demonstrate that our implementation outperforms the original multi-threaded WFA implementation by up to 4.3× and up to 18.2× when using heuristic methods on long and noisy sequences. Compared to other state-of-the-art tools and libraries, WFA-GPU is up to 29× faster than other GPU implementations and up to four orders of magnitude faster than other CPU implementations. Furthermore, WFA-GPU is the only GPU solution capable of correctly aligning long reads using a commodity GPU.
    WFA-GPU code and documentation are publicly available at https://github.com/quim0/WFA-GPU.
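    For contrast, the classical DP baseline that WFA improves on can be sketched in a few lines. This is textbook unit-cost edit distance rather than the gap-affine penalties WFA-GPU computes, but it shows the O(n·m) matrix fill whose cost becomes impractical for long, noisy reads:

```python
def edit_distance(a, b):
    """Classical O(len(a) * len(b)) dynamic-programming alignment cost.
    WFA reaches the same optimal score in time proportional to the
    score itself, which is why it excels on long, similar sequences."""
    n, m = len(a), len(b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        dp[i][0] = i  # delete all of a[:i]
    for j in range(m + 1):
        dp[0][j] = j  # insert all of b[:j]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # match/mismatch
    return dp[n][m]

print(edit_distance("kitten", "sitting"))  # 3
```

Every cell of the matrix is touched regardless of how similar the sequences are; WFA's wavefronts instead expand only as far as the accumulated alignment penalty requires.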
