Difference between revisions of "Template:Article of the week"

<div style="float: left; margin: 0.5em 0.9em 0.4em 0em;">[[File:Fig1 Bispo-Silva Geosciences23 13-11.png|240px]]</div>
'''"[[Journal:Explainability for artificial intelligence in healthcare: A multidisciplinary perspective|Explainability for artificial intelligence in healthcare: A multidisciplinary perspective]]"'''
'''"[[Journal:Geochemical biodegraded oil classification using a machine learning approach|Geochemical biodegraded oil classification using a machine learning approach]]"'''


[[Chromatography|Chromatographic]] oil analysis is an important step for the identification of biodegraded petroleum via peak visualization and interpretation of phenomena that explain the oil geochemistry. However, analyses of chromatogram components by geochemists are comparative, visual, and consequently slow. This article aims to improve the chromatogram analysis process performed during geochemical interpretation by proposing the use of [[convolutional neural network]]s (CNN), which are deep learning techniques widely used by big tech companies. Two hundred and twenty-one (221) chromatographic oil images from different worldwide basins (Brazil, USA, Portugal, Angola, and Venezuela) were used. The [[open-source software]] Orange Data Mining was used to process images by CNN. The CNN algorithm extracts, pixel by pixel, recurring features from the images through convolutional operations ... ('''[[Journal:Geochemical biodegraded oil classification using a machine learning approach|Full article...]]''')<br />
<br />
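The abstract above describes convolutional feature extraction in general terms. The following is a minimal, illustrative Python sketch of how a single convolutional kernel slides over an image and responds to recurring local patterns such as chromatogram peaks; it is not the authors' Orange Data Mining workflow, and the image, kernel, and sizes are hypothetical.
<syntaxhighlight lang="python">
# Minimal sketch of convolutional feature extraction over a chromatogram image.
# Illustrative only: this is not the pipeline used in the article, and the
# example image and kernel values are made up for demonstration.
import numpy as np

def convolve2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Slide `kernel` over `image` and return the resulting feature map."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output value is a weighted sum of a local pixel neighborhood,
            # so recurring local patterns produce strong responses wherever
            # they appear in the image.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Hypothetical grayscale chromatogram image with values in [0, 1].
chromatogram = np.random.rand(240, 240)

# A simple vertical-edge kernel; a trained CNN learns many such kernels.
kernel = np.array([[1.0, 0.0, -1.0],
                   [2.0, 0.0, -2.0],
                   [1.0, 0.0, -1.0]])

feature_map = convolve2d(chromatogram, kernel)
print(feature_map.shape)  # (238, 238)
</syntaxhighlight>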
''Recently featured'':
{{flowlist |
* [[Journal:Knowledge of internal quality control for laboratory tests among laboratory personnel working in a biochemistry department of a tertiary care center: A descriptive cross-sectional study|Knowledge of internal quality control for laboratory tests among laboratory personnel working in a biochemistry department of a tertiary care center: A descriptive cross-sectional study]]
* [[Journal:Sigma metrics as a valuable tool for effective analytical performance and quality control planning in the clinical laboratory: A retrospective study|Sigma metrics as a valuable tool for effective analytical performance and quality control planning in the clinical laboratory: A retrospective study]]
* [[Journal:Why do we need food systems informatics? Introduction to this special collection on smart and connected regional food systems|Why do we need food systems informatics? Introduction to this special collection on smart and connected regional food systems]]
}}
