Featured article of the week: April 10–16:
"Elegancy: Digitizing the wisdom from laboratories to the cloud with free no-code platform"
One of the top priorities in any laboratory is archiving experimental data in the most secure, efficient, and error-free way possible. This is especially important in chemical and biological research, where experiment records are more easily damaged. In addition, transcribing experiment results from paper to electronic devices is time-consuming and redundant. Therefore, we introduce Elegancy, an open-source, no-code electronic laboratory notebook (ELN) provided as a cloud-based/standalone web service distributed as a Docker image. Elegancy fits all laboratories but is specially equipped with several features benefitting biochemical laboratories. It can be accessed via various web browsers, allowing researchers to upload photos or audio recordings directly from their mobile devices. Elegancy also contains a meeting arrangement module, audit/revision control, and a laboratory supply management system. We believe Elegancy could help the scientific research community gather evidence, share information, reorganize knowledge, and digitize laboratory work with greater ease and security ... (Full article...)
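Because Elegancy is distributed as a Docker image, a lab could stand up its own instance with a few lines of scripting. The sketch below uses the Docker SDK for Python; the image name, published port, and data volume path are placeholders for illustration, not values documented by the Elegancy project.

```python
# pip install docker  -- Docker SDK for Python; a running Docker daemon is required.
import docker

client = docker.from_env()

# Image name, host port, and data path are illustrative placeholders,
# not the Elegancy project's published values.
container = client.containers.run(
    "example/elegancy:latest",
    name="elegancy-eln",
    detach=True,
    ports={"80/tcp": 8080},                                      # expose the web UI on localhost:8080
    volumes={"/srv/eln-data": {"bind": "/data", "mode": "rw"}},   # persist notebook data outside the container
)
print(container.name, container.status)
```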
Featured article of the week: April 03–09:
"Implementing an institution-wide electronic laboratory notebook initiative"
To strengthen institutional research data management practices, the Indiana University School of Medicine (IUSM) licensed an electronic laboratory notebook (ELN) to improve the organization, security, and shareability of information and data generated by the school’s researchers. The Ruth Lilly Medical Library led implementation on behalf of the IUSM’s Office of Research Affairs. This article describes the pilot and full-scale implementation of an ELN at IUSM. The initial pilot of the ELN in late 2018 involved 15 research labs, with access expanded in 2019 to all academic medical school constituents ... (Full article...)
Featured article of the week: March 27–April 02:
"Quality and environmental management systems as business tools to enhance ESG performance: A cross-regional empirical study"
The growing societal and political focus on sustainability at the global level is pressuring companies to enhance their environmental, social, and governance (ESG) performance to satisfy respective stakeholder needs and ensure sustained business success. With a data sample of 4,292 companies from Europe, East Asia, and North America, this work aims to prove through a cross-regional empirical study that quality management systems (QMSs) and environmental management systems (EMSs) represent powerful business tools to achieve this enhanced ESG performance. Descriptive and cluster analyses reveal that firms with QMSs and/or EMSs achieve statistically significantly higher ESG scores than companies without such management systems. Furthermore, the results indicate that operating both types of management systems simultaneously increases performance in the environmental and social pillars even further, while the governance dimension appears to be affected mainly by the adoption of EMSs alone ... (Full article...)
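The study's comparisons rest on descriptive and cluster analyses of firm-level ESG scores. As a rough illustration of that kind of analysis (not the authors' code or data), the sketch below clusters a handful of hypothetical firms by their environmental, social, and governance pillar scores with scikit-learn and then compares QMS/EMS adoption across the clusters.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical firm-level data: ESG pillar scores (0-100) plus indicator
# columns for certified management systems -- not the study's dataset.
firms = pd.DataFrame({
    "env_score": [72, 35, 81, 40, 65, 28],
    "soc_score": [68, 42, 77, 38, 60, 31],
    "gov_score": [70, 55, 74, 50, 66, 48],
    "has_qms":   [1, 0, 1, 0, 1, 0],
    "has_ems":   [1, 0, 1, 0, 0, 0],
})

# Standardize the three pillar scores, then group firms by ESG profile.
scaled = StandardScaler().fit_transform(firms[["env_score", "soc_score", "gov_score"]])
firms["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)

# Compare management-system adoption and mean scores across the clusters.
print(firms.groupby("cluster")[["has_qms", "has_ems", "env_score", "soc_score", "gov_score"]].mean())
```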
Featured article of the week: March 20–26:
"PIKAChU: A Python-based informatics kit for analyzing chemical units"
As efforts to computationally describe and simulate the biochemical world become more commonplace, computer programs that are capable of in silico chemistry play an increasingly important role in biochemical research. While such programs exist, they are often dependency-heavy, difficult to navigate, or not written in Python, the programming language of choice for bioinformaticians. Here, we introduce PIKAChU (Python-based Informatics Kit for Analysing CHemical Units), a cheminformatics toolbox with few dependencies implemented in Python. PIKAChU builds comprehensive molecular graphs from simplified molecular-input line-entry system (SMILES) strings, which allow for easy downstream analysis and visualization of molecules. ... (Full article...)
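A minimal usage sketch follows, based on the toolkit's published examples; the package name (pikachu-chem) and helper functions (read_smiles, draw_smiles) are cited from memory and should be checked against the project's documentation.

```python
# pip install pikachu-chem   (PyPI package name; verify against the project docs)
from pikachu.general import read_smiles, draw_smiles

# Parse aspirin from its SMILES string into a PIKAChU molecular graph object
# that downstream analyses and visualizations can work with.
aspirin = read_smiles("CC(=O)OC1=CC=CC=C1C(=O)O")

# Render a 2D depiction directly from the SMILES string.
draw_smiles("CC(=O)OC1=CC=CC=C1C(=O)O")
```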
Featured article of the week: March 13–19:
"Development of Biosearch System for biobank management and storage of disease-associated genetic information"
Databases and software are important to manage modern high-throughput laboratories and store clinical and genomic information for quality assurance. Commercial software is expensive, with proprietary code issues, while academic versions have adaptation issues. Our aim was to develop an adaptable in-house software system that can store specimen- and disease-associated genetic information in biobanks to facilitate translational research. A prototype was designed per the research requirements, and computational tools were used to develop the software under three tiers, using Visual Basic and ASP.net for the presentation tier, SQL Server for the data tier, and Ajax and JavaScript for the business tier. We retrieved specimens from the biobank using this software and performed microarray-based transcriptomic analysis to detect differentially expressed genes (DEGs) ... (Full article...)
Featured article of the week: March 06–12:
"Establishing a common nutritional vocabulary: From food production to diet"
Informed policy and decision-making for food systems, nutritional security, and global health would benefit from standardization and comparison of food composition data, spanning production to consumption. To address this challenge, we present a formal controlled vocabulary of terms, definitions, and relationships within the Compositional Dietary Nutrition Ontology (CDNO) that enables description of nutritional attributes for material entities contributing to the human diet. We demonstrate how ongoing community development of CDNO classes can harmonize trans-disciplinary approaches for describing nutritional components from food production to diet ... (Full article...)
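As an illustration of how such a controlled vocabulary can be consumed programmatically, the sketch below loads CDNO with the pronto library. It assumes the ontology is published to the OBO Library as cdno.obo; otherwise, a local copy of the release file can be passed to pronto.Ontology() instead.

```python
# pip install pronto
import pronto

# Load the CDNO release; "cdno.obo" assumes the ontology is available from the
# OBO Library -- otherwise point pronto.Ontology() at a local copy of the file.
cdno = pronto.Ontology.from_obo_library("cdno.obo")

# List a few nutritional-component terms and their labels.
for term in list(cdno.terms())[:10]:
    print(term.id, term.name)
```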
Featured article of the week: February 27–March 05:
"Designing a knowledge management system for naval materials failures"
Materials in service fail from time to time, requiring failure analysis. This type of scientific analysis expands into forensic engineering, for it aims not only to identify individual and symptomatic reasons for failure but also to assess and understand repetitive failure patterns, which could be related to underlying material faults, design mistakes, or maintenance omissions. Significant information can be gained by carefully documenting and managing the data that comes from failure analysis of materials, including in the naval industry. The NAVMAT research project, presented herein, attempts an interdisciplinary approach to materials informatics by integrating materials engineering and informatics under a platform of knowledge management. Our approach utilizes a focused, common-cause failure analysis methodology for the naval and marine environment. The platform's design is dedicated to the effective recording, efficient indexing, and easy and accurate retrieval of relevant information, including the associated history of maintenance and secure operation concerning failure incidents of marine materials, components, and systems in an organizational fleet. ... (Full article...)
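To make the idea of structured recording and retrieval concrete, here is a purely hypothetical sketch of a failure-incident record and a lookup helper; the field names are illustrative and do not reflect NAVMAT's actual schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class FailureIncident:
    """Illustrative record for a naval materials failure (not NAVMAT's schema)."""
    incident_id: str
    vessel: str
    component: str                       # e.g., propeller shaft, seawater piping
    material: str                        # e.g., duplex stainless steel
    failure_mode: str                    # e.g., pitting corrosion, fatigue cracking
    detected_on: date
    maintenance_history: list[str] = field(default_factory=list)
    root_causes: list[str] = field(default_factory=list)

def find_repetitive_failures(incidents: list[FailureIncident], mode: str) -> list[FailureIncident]:
    """Retrieve incidents sharing a failure mode to expose repetitive patterns."""
    return [i for i in incidents if i.failure_mode == mode]
```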
Featured article of the week: February 20–26:
"Laboratory information management system for COVID-19 non-clinical efficacy trial data"
As the number of large-scale research studies involving multiple organizations producing data has steadily increased, an integrated system for a common interoperable data format is needed. For example, in response to the coronavirus disease 2019 (COVID-19) pandemic, a number of global efforts are underway to develop vaccines and therapeutics. We are therefore observing an explosion in the proliferation of COVID-19 data, and interoperability is in high demand among the multiple institutions simultaneously participating in COVID-19 pandemic research. In this study, a laboratory information management system (LIMS) has been adopted to systematically manage, via web interface, various COVID-19 non-clinical trial data—including mortality, clinical signs, body weight, body temperature, organ weights, viral titer (viral replication and viral RNA), and multi-organ histopathology—from multiple institutions ... (Full article...)
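As a rough sketch of what a common, interoperable record for such data might look like (field names taken from the endpoints listed above, not from the LIMS's actual schema), consider the following. Serializing such records to JSON gives one simple exchange format between institutions.

```python
import dataclasses
import json
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class NonClinicalObservation:
    """One subject-level observation shared between institutions; fields are
    illustrative, drawn from the endpoints named in the abstract."""
    institution: str
    study_id: str
    subject_id: str
    observation_date: date
    body_weight_g: Optional[float] = None
    body_temperature_c: Optional[float] = None
    clinical_signs: Optional[str] = None
    viral_titer_log10: Optional[float] = None   # viral replication / viral RNA
    mortality: bool = False

# Export a record in a plain, interoperable form.
record = NonClinicalObservation("Institute A", "COV-001", "M-12", date(2021, 3, 4), body_weight_g=21.5)
print(json.dumps(dataclasses.asdict(record), default=str))
```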
Featured article of the week: February 13–19:
"Improving data quality in clinical research informatics tools"
Maintaining data quality is a fundamental requirement for any successful and long-term data management project. Providing high-quality, reliable, and statistically sound data is a primary goal for clinical research informatics. In addition, effective data governance and management are essential to ensuring accurate data counts, reports, and validation. As a crucial step of the clinical research process, it is important to establish and maintain organization-wide standards for data quality management to ensure consistency across all systems designed primarily for cohort identification ... (Full article...)
Featured article of the week: February 6–12:
"Electronic tools in clinical laboratory diagnostics: Key examples, limitations, and value in laboratory medicine"
Electronic tools in clinical laboratory diagnostics can assist laboratory professionals, clinicians, and patients in medical diagnostic management and laboratory test interpretation. With increasing implementation of electronic health records (EHRs) and laboratory information systems (LIS) worldwide, there is increasing demand for well-designed and evidence-based electronic resources. Both complex data-driven and simple interpretative electronic healthcare tools are currently available to improve the integration of clinical and laboratory information towards a more patient-centered approach to medicine. Several studies have reported positive clinical impact of electronic healthcare tool implementation in clinical laboratory diagnostics, including in the management of neonatal bilirubinemia, cardiac disease, and nutritional status ... (Full article...)
Featured article of the week: January 30–February 5:
"Anatomic pathology quality assurance: Developing an LIS-based tracking and documentation module for intradepartmental consultations"
An electronic intradepartmental consultation system for anatomic pathology (AP) was conceived and developed in the laboratory information system (LIS) of University of Iowa Hospitals and Clinics in 2019. Previously, all surgical pathology intradepartmental consultative activities were initiated and documented with paper forms, which were circulated with the pertinent microscopic slides and were eventually filed. In this study, we discuss the implementation and utilization of an electronic intradepartmental AP consultation system. Workflows and procedures were developed to organize intradepartmental surgical pathology consultations from the beginning to the end point of the consultative activities entirely using a paperless system that resided in the LIS ... (Full article...)
Featured article of the week: January 23–29:
"Using knowledge graph structures for semantic interoperability in electronic health records data exchanges"
Information sharing across medical institutions is restricted to information exchange between specific partners. The lifelong electronic health record (EHR) structure and content require standardization efforts. Existing standards such as openEHR, Health Level 7 (HL7), and ISO/EN 13606 aim to achieve data independence along with semantic interoperability. This study aims to discover knowledge representation to achieve semantic health data exchange. openEHR and ISO/EN 13606 use archetype-based technology for semantic interoperability. The HL7 Clinical Document Architecture is on its way to adopting this through HL7 templates. Archetypes are the basis for knowledge-based systems, as these are means to define clinical knowledge. The paper examines a set of formalisms for the suitability of describing, representing, and reasoning about archetypes ... (Full article...)
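To illustrate the kind of knowledge graph structure the paper discusses, the sketch below encodes a single clinical observation as RDF triples with rdflib; the namespace and property names are invented for the example and are not the openEHR, HL7, or ISO/EN 13606 models.

```python
# pip install rdflib
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

# Hypothetical namespace standing in for an archetype-based EHR model.
EHR = Namespace("http://example.org/ehr#")

g = Graph()
g.bind("ehr", EHR)

obs = URIRef("http://example.org/ehr/observation/42")
g.add((obs, RDF.type, EHR.BloodPressureObservation))
g.add((obs, EHR.systolicMmHg, Literal(128, datatype=XSD.integer)))
g.add((obs, EHR.diastolicMmHg, Literal(82, datatype=XSD.integer)))
g.add((obs, EHR.recordedFor, URIRef("http://example.org/ehr/patient/7")))

# Serialize to Turtle -- a graph structure both exchange partners can parse and reason over.
print(g.serialize(format="turtle"))
```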
Featured article of the week: January 16–22:
"CustodyBlock: A distributed chain of custody evidence framework"
With the increasing number of cybercrimes, the digital forensics team has no choice but to implement more robust and resilient evidence-handling mechanisms. The capturing of digital evidence, which is a tangible and probative piece of information that can be presented in court and used in trial, is challenging due to its volatility and the possible effects of improper handling procedures. When computer systems get compromised, digital forensics comes into play to analyze, discover, extract, and preserve all relevant evidence. Therefore, it is imperative to maintain efficient evidence management to guarantee the credibility and admissibility of digital evidence in a court of law. A critical component of this process is to utilize an adequate chain of custody (CoC) approach to preserve the evidence in its original state from compromise and/or contamination ... (Full article...)
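The core idea of a tamper-evident chain of custody can be illustrated with a simple hash chain, in which each custody event stores the hash of its predecessor. The sketch below is a generic Python illustration using hashlib, not CustodyBlock's distributed implementation.

```python
import hashlib
import json
from datetime import datetime, timezone

def add_custody_event(chain: list[dict], evidence_id: str, handler: str, action: str) -> dict:
    """Append a custody event, linking it to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    event = {
        "evidence_id": evidence_id,
        "handler": handler,
        "action": action,                               # e.g., "collected", "transferred", "analyzed"
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prev_hash": prev_hash,
    }
    event["hash"] = hashlib.sha256(json.dumps(event, sort_keys=True).encode()).hexdigest()
    chain.append(event)
    return event

def verify_chain(chain: list[dict]) -> bool:
    """Recompute every link; any altered or reordered record breaks the chain."""
    prev_hash = "0" * 64
    for event in chain:
        body = {k: v for k, v in event.items() if k != "hash"}
        if event["prev_hash"] != prev_hash:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != event["hash"]:
            return False
        prev_hash = event["hash"]
    return True

chain: list[dict] = []
add_custody_event(chain, "EVD-2021-001", "Investigator A", "collected")
add_custody_event(chain, "EVD-2021-001", "Lab Analyst B", "analyzed")
print(verify_chain(chain))   # True unless a record is modified after the fact
```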
Featured article of the week: January 09–15:
"Development and governance of FAIR thresholds for a data federation"
The FAIR (findable, accessible, interoperable, and re-usable) principles and practice recommendations provide high-level guidance and recommendations that are not research-domain specific in nature. There remains a gap in practice at the data provider and domain scientist level, demonstrating how the FAIR principles can be applied beyond a set of generalist guidelines to meet the needs of a specific domain community. We present our insights developing FAIR thresholds in a domain-specific context for self-governance by a community (in this case, agricultural research). "Minimum thresholds" for FAIR data are required to align expectations for data delivered from providers’ distributed data stores through a community-governed federation (the Agricultural Research Federation, AgReFed) ... (Full article...)
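To make the notion of a "minimum threshold" concrete, the sketch below checks a dataset's metadata against a hypothetical community-agreed list of required fields per FAIR facet; the field names and threshold are illustrative only, not AgReFed's actual criteria.

```python
# Hypothetical community-agreed minimum threshold for FAIR metadata
# (field names are illustrative, not AgReFed's criteria).
MINIMUM_THRESHOLD = {
    "findable":      ["identifier", "title", "keywords"],
    "accessible":    ["access_url", "license"],
    "interoperable": ["format", "vocabulary"],
    "reusable":      ["provenance", "contact"],
}

def missing_fields(metadata: dict) -> dict:
    """Report, per FAIR facet, which required metadata fields are absent or empty."""
    return {
        facet: [f for f in fields if not metadata.get(f)]
        for facet, fields in MINIMUM_THRESHOLD.items()
    }

dataset = {
    "identifier": "doi:10.xxxx/example",
    "title": "Soil moisture observations",
    "access_url": "https://data.example.org/soil-moisture",
    "license": "CC-BY-4.0",
    "format": "CSV",
}
print(missing_fields(dataset))   # lists the gaps the provider must fill to meet the threshold
```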
Featured article of the week: January 02–08:
"Design of a data management reference architecture for sustainable agriculture"
Effective and efficient data management is crucial for smart farming and precision agriculture. To realize operational efficiency, full automation, and high productivity in agricultural systems, different kinds of data are collected from operational systems using different sensors, stored in different systems, and processed using advanced techniques, such as machine learning and deep learning. Due to the complexity of data management operations, a data management reference architecture is required. While there are different initiatives to design data management reference architectures, a data management reference architecture for sustainable agriculture is missing. In this study, we follow domain scoping, domain modeling, and reference architecture design stages to design the reference architecture for sustainable agriculture. Four case studies were performed to demonstrate the applicability of the reference architecture. This study shows that the proposed data management reference architecture is practical and effective for sustainable agriculture ... (Full article...)