Journal:Wrangling environmental exposure data: Guidance for getting the best information from your laboratory measurements

From LIMSWiki
Revision as of 20:20, 13 April 2020 by Shawndouglas
Full article title Wrangling environmental exposure data: Guidance for getting the best information from your laboratory measurements
Journal Environmental Health
Author(s) Udesky, Julia O.; Dodson, Robin E.; Perovich, Laura J.; Rudel, Ruthann A.
Author affiliation(s) Silent Spring Institute
Primary contact Email: Use journal website to contact
Year published 2019
Volume and issue 18
Article # 99
DOI 10.1186/s12940-019-0537-8
ISSN 1476-069X
Distribution license Creative Commons Attribution 4.0 International
Website https://ehjournal.biomedcentral.com/articles/10.1186/s12940-019-0537-8
Download https://ehjournal.biomedcentral.com/track/pdf/10.1186/s12940-019-0537-8 (PDF)

Abstract

Background: Environmental health and exposure researchers can improve the quality and interpretation of their chemical measurement data, avoid spurious results, and improve analytical protocols for new chemicals by closely examining lab and field quality control (QC) data. Reporting QC data along with chemical measurements in biological and environmental samples allows readers to evaluate data quality and appropriate uses of the data (e.g., for comparison to other exposure studies, association with health outcomes, use in regulatory decision-making). However, many studies do not adequately describe or interpret QC assessments in publications, leaving readers uncertain about the level of confidence in the reported data. One potential barrier to both QC implementation and reporting is that guidance on how to integrate and interpret QC assessments is often fragmented and difficult to find, with no centralized repository or summary. In addition, existing documents are typically written for regulatory scientists rather than environmental health researchers, who may have little or no experience in analytical chemistry.

Objectives: We discuss approaches for implementing quality assurance/quality control (QA/QC) in environmental exposure measurement projects and describe our process for interpreting QC results and drawing conclusions about data validity.

Discussion: Our methods build upon existing guidance and years of practical experience collecting exposure data and analyzing it in collaboration with contract and university laboratories, as well as the Centers for Disease Control and Prevention. With real examples from our data, we demonstrate problems that would not have come to light had we not engaged with our QC data and incorporated field QC samples in our study design. Our approach focuses on descriptive analyses and data visualizations that have been compatible with diverse exposure studies, with sample sizes ranging from tens to hundreds of samples. Future work could incorporate additional statistically grounded methods for larger datasets with more QC samples.

Conclusions: This guidance, along with example table shells, graphics, and some sample R code, provides a useful set of tools for getting the best information from valuable environmental exposure datasets and enabling valid comparison and synthesis of exposure data across studies.

Keywords: exposure science, environmental epidemiology, environmental chemicals, environmental monitoring, quality assurance/quality control (QA/QC), data validation, exposure measurement, measurement error

Background

Chemical measurements play a critical role in the study of links between the environment and health, yet many researchers in this field receive little if any training in analytical chemistry. The growing interest in measuring and evaluating health effects of co-exposure to a multitude of chemicals[1][2] makes this gap in training increasingly problematic, as the task at hand becomes ever more complicated (i.e., analyzing for more and for new chemicals of concern). If steps are not taken throughout sample collection and analysis to minimize and characterize likely sources of measurement error, the impact on the interpretation of these valuable measurements can vary along the spectrum from false negative to false positive, as we will illustrate with real examples from our own data.

Some important considerations when measuring and interpreting environmental chemical exposures have been discussed in other peer-reviewed articles or official guidance documents. For example, a recent document from the Environmental Protection Agency (EPA) provides citizen scientists with guidance on how to develop a field measurement program, including planning for the collection of quality control (QC) samples.[3] The Centers for Disease Control and Prevention (CDC) also gives guidance related to collection, storage, and shipment of biological samples for analysis of environmental chemicals or nutritional factors.[4] To assess the quality of already-collected data, LaKind et al. (2014) developed a tool to evaluate epidemiologic studies that use biomonitoring data on short-lived chemicals, with a focus on critical elements of study design such as choice of analytical and sampling methods.[5] The tool was recently incorporated into “ExpoQual,” a framework for assessing suitability of both measured and modeled exposure data for a given use (“fit-for-purpose”).[6] Other useful guidance has been published, for example on automated quality assurance/quality control (QA/QC) processes for sensors collecting continuous streams of environmental data[7] and for establishing an overall data management plan, including documentation of metadata and strategies for data storage.[8]

Despite these helpful documents, there is still a lack of readily accessible, practical guidance on how to interpret and use the results of both field and laboratory QC checks to qualify exposure datasets (i.e., flag results for certain compounds or certain samples that are imprecise, estimated, or potentially over- or under-reported), and this gap is reflected in the environmental health literature. While the vast majority of environmental health studies report robust findings based on high-quality measurements, questions about measurement validity have led to confusion and lack of confidence in some topic areas. For example, a number of studies have measured rapidly metabolized chemicals such as phthalates and bisphenol A (BPA) in blood or other non-urine matrices, despite the fact that urine is the preferred matrix for these chemicals. Phthalates and BPA are present at higher levels in urine and, when the proper metabolites are measured, there is less concern about contamination from external sources, including contamination from plastics during specimen collection.[9]
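To make the idea of "qualifying" a dataset concrete: one common laboratory and field QC check compares duplicate measurements of the same sample and flags compounds whose relative percent difference (RPD) exceeds an acceptance threshold. The sketch below is purely illustrative and is not the authors' code (the article itself supplies R code); the 30% RPD threshold, the compound names, and the duplicate values are hypothetical assumptions chosen for the example.

```python
# Illustrative sketch (not from the article): qualifying field-duplicate
# measurements by relative percent difference (RPD). The 30% threshold
# and the data below are hypothetical.

def rpd(a, b):
    """Relative percent difference between a pair of duplicate measurements."""
    return abs(a - b) / ((a + b) / 2) * 100

def qualify_duplicates(duplicates, max_rpd=30.0):
    """Flag compounds whose duplicate RPD exceeds max_rpd as 'imprecise'."""
    flags = {}
    for compound, (a, b) in duplicates.items():
        value = rpd(a, b)
        flags[compound] = {"rpd": round(value, 1),
                           "flag": "imprecise" if value > max_rpd else "ok"}
    return flags

# Hypothetical duplicate pairs (e.g., ng/g) for two compounds
dups = {"BPA": (2.1, 2.3), "DEHP": (15.0, 28.0)}
print(qualify_duplicates(dups))
```

A flag of "imprecise" would not necessarily mean discarding the result; as the article argues, the point is to carry such qualifiers into the analysis and discussion so readers can judge how measurement error might affect findings.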

More commonly, however, exposure studies simply do not adequately report on QA/QC or describe how QC results informed reporting and interpretation of the data. In the context of systematic review and weight-of-evidence approaches, not reporting on QA/QC may result in a study being given less weight. For example, the risk of bias tool employed in case studies of the Navigation Guide method for systematic review includes reporting of certain QA/QC results in its criteria for a "low risk of bias" rating (e.g., reference[10]). When we applied the Navigation Guide's QA/QC criterion to 30 studies of biological or environmental measurements that we included in a recent review of environmental exposures and breast cancer[11], we found that more than half either did not report QA/QC details that were required for a "low risk of bias" assessment, or, if they did report QA/QC results, did not interpret or use them adequately to inform the analysis (e.g., reported poor precision but did not discuss how/whether this could affect findings) (see Additional file 1 for details). Similarly, when LaKind et al. applied their study quality assessment tool to epidemiologic literature on BPA and neurodevelopmental and respiratory health, they found that QA/QC issues related to contamination and analyte stability were not well-reported.[12] Of note, several of the studies in our breast cancer review that did not provide adequate QA/QC information had their samples analyzed at the CDC Environmental Health Laboratory. It is helpful to include summaries of QA/QC assessments in published work, even if researchers are using a well-established lab, because this provides a useful standard for comparing QA/QC in other studies.


References

  1. Wild, C.P. (2005). "Complementing the genome with an "exposome": the outstanding challenge of environmental exposure measurement in molecular epidemiology". Cancer Epidemiology, Biomarkers & Prevention 14 (8): 1847–50. doi:10.1158/1055-9965.EPI-05-0456. PMID 16103423. 
  2. Carlin, D.J.; Rider, C.V.; Woychik, R. et al. (2013). "Unraveling the health effects of environmental mixtures: An NIEHS priority". Environmental Health Perspectives 121 (1): A6–8. doi:10.1289/ehp.1206182. PMC PMC3553446. PMID 23409283. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3553446. 
  3. U.S. Environmental Protection Agency (March 2019). "Quality Assurance Handbook and Guidance Documents for Citizen Science Projects". U.S. Environmental Protection Agency. https://www.epa.gov/citizen-science/quality-assurance-handbook-and-guidance-documents-citizen-science-projects. Retrieved 28 May 2019. 
  4. Centers for Disease Control and Prevention (March 2018). "Improving the Collection and Management of Human Samples Used for Measuring Environmental Chemicals and Nutrition Indicators" (PDF). Centers for Disease Control and Prevention. https://www.cdc.gov/biomonitoring/pdf/Human_Sample_Collection-508.pdf. Retrieved 28 May 2019. 
  5. LaKind, J.S.; Sobus, J.R.; Goodman, M. et al. (2014). "A proposal for assessing study quality: Biomonitoring, Environmental Epidemiology, and Short-lived Chemicals (BEES-C) instrument". Environment International 73: 195–207. doi:10.1016/j.envint.2014.07.011. PMC PMC4310547. PMID 25137624. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4310547. 
  6. LaKind, J.S.; O'Mahony, C.; Armstrong, T. et al. (2019). "ExpoQual: Evaluating measured and modeled human exposure data". Environmental Research 171: 302–12. doi:10.1016/j.envres.2019.01.039. PMID 30708234. 
  7. Campbell, J.L.; Rustad, L.E.; Porter, J.H. et al. (2013). "Quantity is Nothing without Quality: Automated QA/QC for Streaming Environmental Sensor Data". BioScience 63 (7): 574–585. doi:10.1525/bio.2013.63.7.10. 
  8. Michener, W.K. (2015). "Ten Simple Rules for Creating a Good Data Management Plan". PLoS Computational Biology 11 (10): e1004525. doi:10.1371/journal.pcbi.1004525. PMC PMC4619636. PMID 26492633. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4619636. 
  9. Calafat, A.M.; Longnecker, M.P.; Koch, H.M. et al. (2015). "Optimal Exposure Biomarkers for Nonpersistent Chemicals in Environmental Epidemiology". Environmental Health Perspectives 123 (7): A166–8. doi:10.1289/ehp.1510041. PMC PMC4492274. PMID 26132373. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4492274. 
  10. Lam, J.; Koustas, E.; Sutton, P. et al. (2014). "The Navigation Guide - Evidence-based medicine meets environmental health: Integration of animal and human evidence for PFOA effects on fetal growth". Environmental Health Perspectives 122 (10): 1040–51. doi:10.1289/ehp.1307923. PMC PMC4181930. PMID 24968389. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4181930. 
  11. Rodgers, K.M.; Udesky, J.O.; Rudel, R.A. et al. (2018). "Environmental chemicals and breast cancer: An updated review of epidemiological literature informed by biological mechanisms". Environmental Research 160: 152–82. doi:10.1016/j.envres.2017.08.045. PMID 28987728. 
  12. LaKind, J.S.; Goodman, M.; Barr, D.B. et al. (2015). "Lessons learned from the application of BEES-C: Systematic assessment of study quality of epidemiologic research on BPA, neurodevelopment, and respiratory health". Environment International 80: 41–71. doi:10.1016/j.envint.2015.03.015. PMID 25884849. 

Notes

This presentation is faithful to the original, with only a few minor changes to presentation, spelling, and grammar. We also added PMCID and DOI when they were missing from the original reference.