Difference between revisions of "User:Shawndouglas/sandbox/sublevel5"

 
{{ombox
| type      = notice
| style     = width: 960px;
| text      = This is sublevel5 of my sandbox, where I play with features and test MediaWiki code. If you wish to leave a comment for me, please see [[User_talk:Shawndouglas|my discussion page]] instead.<p></p>
}}


==Sandbox begins below==
{{Infobox journal article
|name        =
|image        =
|alt          = <!-- Alternative text for images -->
|caption      =
|title_full  = Big data management for healthcare systems: Architecture, requirements, and implementation
|journal      = ''Advances in Bioinformatics''
|authors      = El Aboudi, Naoual; Benhlima, Laila
|affiliations = Mohammed V University
|contact      = Email: nawal dot elaboudi at gmail dot com
|editors      = Fdez-Riverola, Florentino
|pub_year    = 2018
|vol_iss      = '''2018'''
|pages        = 4059018
|doi          = [https://doi.org/10.1155/2018/4059018 10.1155/2018/4059018]
|issn        = 1687-8035
|license      = [http://creativecommons.org/licenses/by/4.0/ Creative Commons Attribution 4.0 International]
|website      = [https://www.hindawi.com/journals/abi/2018/4059018/ https://www.hindawi.com/journals/abi/2018/4059018/]
|download    = [http://downloads.hindawi.com/journals/abi/2018/4059018.pdf http://downloads.hindawi.com/journals/abi/2018/4059018.pdf] (PDF)
}}
{{ombox
| type      = content
| style    = width: 500px;
| text      = This article should not be considered complete until this message box has been removed. This is a work in progress.
}}
==Abstract==
The growing amount of data in the healthcare industry has made the adoption of big data techniques inevitable in order to improve the quality of healthcare delivery. Despite the integration of big data processing approaches and platforms into existing [[Information management|data management]] architectures for healthcare systems, these architectures face difficulties in preventing emergency cases. The main contribution of this paper is an extensible big data architecture, based on both stream computing and batch computing, that further enhances the reliability of healthcare systems by generating real-time alerts and making accurate predictions about patient health condition. Based on the proposed architecture, a prototype implementation has been built for healthcare systems in order to generate real-time alerts. The suggested prototype is based on Spark and MongoDB tools.
 
==Introduction==
The proportion of elderly people in society is growing worldwide<ref name="WHOGlobal11">{{cite web |url=http://www.who.int/ageing/publications/global_health/en/ |title=Global Health and Aging |editor=World Health Organization; National Institute of Aging |publisher=WHO |date=October 2011}}</ref>; this phenomenon, referred to by the World Health Organization as "humanity's aging,"<ref name="WHOGlobal11" /> has many implications for healthcare services, especially in terms of cost. In the face of such a situation, relying on classical systems may result in a decline in quality of life for millions of people. Seeking to overcome this problem, a variety of healthcare systems have been designed. Their common principle is transferring, on a periodic basis, medical parameters such as blood pressure, heart rate, glucose level, body temperature, and ECG signals to an automated system that monitors patients' health condition in real time. Such systems provide quick assistance when needed, since data is analyzed continuously. Automating health monitoring favors a proactive approach that relieves medical facilities by saving costs related to [[Hospital|hospitalization]], and it also enhances healthcare services by reducing waiting times for consultations. Recently, the number of data sources in the healthcare industry has grown rapidly as a result of the widespread use of mobile and wearable sensor technologies, which have flooded the healthcare arena with a huge amount of data. It has therefore become challenging to perform healthcare [[data analysis]] using traditional methods, which are unfit to handle the high volume of diversified medical data. In general, the healthcare domain has four categories of analytics: descriptive, diagnostic, predictive, and prescriptive analytics. A brief description of each is given below.
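The real-time monitoring principle described above can be sketched as a simple rule-based check on streamed vitals. The thresholds, field names, and readings below are illustrative assumptions for the sketch, not clinical values or the paper's actual implementation:

```python
# Minimal sketch of rule-based alerting on periodically streamed vitals.
# All ranges and readings are invented for illustration.

NORMAL_RANGES = {
    "heart_rate": (60, 100),      # beats per minute
    "temperature": (36.1, 37.8),  # degrees Celsius
    "glucose": (70, 140),         # mg/dL
}

def check_vitals(reading):
    """Return a list of alert strings for any out-of-range parameter."""
    alerts = []
    for param, (low, high) in NORMAL_RANGES.items():
        value = reading.get(param)
        if value is not None and not (low <= value <= high):
            alerts.append(f"{param}={value} outside [{low}, {high}]")
    return alerts

# Simulated stream of periodic sensor readings
stream = [
    {"heart_rate": 72, "temperature": 36.8, "glucose": 95},
    {"heart_rate": 130, "temperature": 38.5, "glucose": 95},
]
for reading in stream:
    for alert in check_vitals(reading):
        print("ALERT:", alert)
```

A production system such as the paper's Spark/MongoDB prototype would apply comparable rules continuously over a distributed stream rather than a Python list.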
 
'''Descriptive analytics''' refers to describing current situations and reporting on them. Several techniques are employed to perform this level of analytics. For instance, descriptive statistics tools like histograms and charts are among the techniques used in descriptive analytics.
 
'''Diagnostic analytics''' aims to explain why certain events occurred and what factors triggered them. For example, diagnostic analytics attempts to understand the reasons behind the regular readmission of some patients by using methods such as clustering and decision trees.
 
'''Predictive analytics''' reflects the ability to predict future events; it also helps in identifying trends and determining probabilities of uncertain outcomes. An illustration of its role is to predict whether or not a patient will have complications. Predictive models are often built using machine learning techniques.
 
'''Prescriptive analytics''' proposes suitable actions leading to optimal decision-making. For instance, prescriptive analytics may suggest rejecting a given treatment when there is a high probability of a harmful side effect. Decision trees and Monte Carlo simulation are examples of methods applied to perform prescriptive analytics. Figure 1 illustrates analytics phases for the healthcare domain.<ref name="GandomiBeyond15">{{cite journal |title=Beyond the hype: Big data concepts, methods, and analytics |journal=International Journal of Information Management |author=Gandomi, A.; Haider, M. |volume=35 |issue=2 |pages=137–44 |year=2015 |doi=10.1016/j.ijinfomgt.2014.10.007}}</ref> The integration of big data technologies into healthcare analytics may lead to better performance of medical systems.
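As a toy illustration of the Monte Carlo simulation approach to prescriptive analytics, the following sketch compares two hypothetical treatments (the probabilities and utility weights are invented for illustration) and recommends the one with the higher estimated expected utility:

```python
import random

random.seed(0)  # deterministic for reproducibility

# Hypothetical per-treatment probabilities (illustrative, not clinical data)
treatments = {
    "treatment_A": {"p_success": 0.80, "p_side_effect": 0.30},
    "treatment_B": {"p_success": 0.70, "p_side_effect": 0.05},
}

def simulate(p_success, p_side_effect, trials=100_000):
    """Estimate expected utility: +1 for success, -2 for a harmful side effect."""
    total = 0
    for _ in range(trials):
        total += (random.random() < p_success) - 2 * (random.random() < p_side_effect)
    return total / trials

# Prescriptive step: pick the action with the best simulated outcome
best = max(treatments, key=lambda t: simulate(**treatments[t]))
print("recommended:", best)
```

Here treatment A has the higher success rate, but its frequent harmful side effect makes its expected utility (about 0.2) lower than treatment B's (about 0.6), so the simulation recommends treatment B.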
 
==References==
{{Reflist|colwidth=30em}}
 
==Notes==
This presentation is faithful to the original, with only a few minor changes to presentation. Grammar was cleaned up for smoother reading. In some cases important information was missing from the references, and that information was added.
 
<!--Place all category tags here-->
[[Category:LIMSwiki journal articles (added in 2018)]]
[[Category:LIMSwiki journal articles (all)]]
[[Category:LIMSwiki journal articles on big data]]
[[Category:LIMSwiki journal articles on data management and sharing]]
[[Category:LIMSwiki journal articles on health informatics]]
[[Category:LIMSwiki journal articles on information technology]]

Latest revision as of 18:25, 10 January 2024


The limit of detection (LOD or LoD) is the lowest signal, or the lowest quantity determined (or extracted) from the signal, that can be observed with a sufficient degree of confidence or statistical significance. However, the exact threshold (level of decision) used to decide when a signal significantly emerges above the continuously fluctuating background noise remains arbitrary; it is a matter of policy, and often of debate among scientists, statisticians, and regulators, depending on the stakes in different fields.

==Significance in analytical chemistry==

In analytical chemistry, the detection limit, lower limit of detection, or LOD (limit of detection), also termed analytical sensitivity (not to be confused with statistical sensitivity), is the lowest quantity of a substance that can be distinguished from the absence of that substance (a blank value) with a stated confidence level (generally 99%).[1][2][3] The detection limit is estimated from the mean of the blank, the standard deviation of the blank, the slope (analytical sensitivity) of the calibration plot, and a defined confidence factor (e.g., 3.2 is the most commonly accepted value for this arbitrary factor).[4] Another consideration that affects the detection limit is the adequacy and accuracy of the model used to predict concentration from the raw analytical signal.[5]

As a typical example, consider a calibration plot following a linear equation, taken here as the simplest possible model:

<math>y = b + mx</math>

where <math>y</math> corresponds to the measured signal (e.g., voltage, luminescence, energy), <math>b</math> is the value at which the straight line cuts the ordinate axis, <math>m</math> is the sensitivity of the system (i.e., the slope of the line, or the function relating the measured signal to the quantity to be determined), and <math>x</math> is the value of the quantity (e.g., temperature, concentration, pH) to be determined from the signal.[6] The LOD for <math>x</math> is calculated as the value of <math>x</math> at which <math>y</math> equals the average value of the blanks <math>\bar{y}_{\text{blank}}</math> plus <math>k</math> times its standard deviation <math>\sigma_{\text{blank}}</math> (or, if zero, the standard deviation corresponding to the lowest value measured), where <math>k</math> is the chosen confidence factor (e.g., for a confidence of 95% one can take <math>k = 3.2</math>, determined from the limit of blank).[4]

Thus, in this didactic example:

<math>x_{\text{LOD}} = \frac{\bar{y}_{\text{blank}} + k\,\sigma_{\text{blank}} - b}{m}</math>
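Under this linear-calibration model, the LOD computation can be sketched numerically. The blank replicates, intercept, and slope below are made-up values chosen only to illustrate the arithmetic:

```python
from statistics import mean, stdev

# Hypothetical blank signal replicates and calibration line y = b + m*x
blanks = [0.10, 0.12, 0.09, 0.11, 0.10, 0.13, 0.10]
b, m = 0.08, 2.5   # intercept and slope (sensitivity), illustrative values
k = 3.2            # confidence factor from the limit of blank

y_lod = mean(blanks) + k * stdev(blanks)   # signal at the LOD
x_lod = (y_lod - b) / m                    # quantity at the LOD
print(f"signal LOD = {y_lod:.4f}, quantity LOD = {x_lod:.4f}")
```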

There are a number of concepts derived from the detection limit that are commonly used. These include the instrument detection limit (IDL), the method detection limit (MDL), the practical quantitation limit (PQL), and the limit of quantitation (LOQ). Even when the same terminology is used, there can be differences in the LOD according to nuances of what definition is used and what type of noise contributes to the measurement and calibration.[7]

The figure below illustrates the relationship between the blank, the limit of detection (LOD), and the limit of quantitation (LOQ) by showing the probability density function for normally distributed measurements at the blank, at the LOD (defined as 3 × the standard deviation of the blank), and at the LOQ (defined as 10 × the standard deviation of the blank). (The identical spread along the abscissa of these two functions is problematic.) For a signal at the LOD, the alpha error (probability of a false positive) is small (1%). However, the beta error (probability of a false negative) is 50% for a sample that has a concentration at the LOD (red line). This means a sample could contain an impurity at the LOD, but there is a 50% chance that a measurement would give a result less than the LOD. At the LOQ (blue line), there is minimal chance of a false negative.


==Instrument detection limit==

Most analytical instruments produce a signal even when a blank (matrix without analyte) is analyzed. This signal is referred to as the noise level. The instrument detection limit (IDL) is the analyte concentration that is required to produce a signal greater than three times the standard deviation of the noise level. This may be practically measured by analyzing eight or more standards at the estimated IDL and then calculating the standard deviation from the measured concentrations of those standards.
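A minimal numerical sketch of this IDL estimate, using invented concentrations measured from replicate standards near the estimated IDL:

```python
from statistics import stdev

# Measured concentrations from repeated analysis of a low-level standard
# (illustrative values, not real instrument data)
measured = [0.51, 0.48, 0.55, 0.47, 0.52, 0.50, 0.53, 0.49]

# IDL taken as three standard deviations of the replicate measurements
idl = 3 * stdev(measured)
print(f"IDL estimate: {idl:.3f}")
```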

The detection limit (according to IUPAC) is the smallest concentration, or the smallest absolute amount, of analyte that has a signal statistically significantly larger than the signal arising from the repeated measurements of a reagent blank.

Mathematically, the analyte's signal at the detection limit (<math>S_{dl}</math>) is given by:

<math>S_{dl} = \bar{S}_{\text{reag}} + 3\sigma_{\text{reag}}</math>

where <math>\bar{S}_{\text{reag}}</math> is the mean value of the signal for a reagent blank measured multiple times, and <math>\sigma_{\text{reag}}</math> is the known standard deviation of the reagent blank's signal.

Other approaches for defining the detection limit have also been developed. In atomic absorption spectrometry, the detection limit for a given element is usually determined by analyzing a diluted solution of that element and recording the corresponding absorbance at a given wavelength. The measurement is repeated 10 times. The 3σ of the recorded absorbance signal can be considered the detection limit for the specific element under the experimental conditions: selected wavelength, type of flame or graphite oven, chemical matrix, presence of interfering substances, and the instrument itself.

==Method detection limit==

Often there is more to the analytical method than just performing a reaction or submitting the analyte to direct analysis. Many analytical methods developed in the laboratory, especially those involving the use of a delicate scientific instrument, require sample preparation, or a pretreatment of the samples prior to analysis. For example, it might be necessary to first heat a sample that is to be analyzed for a particular metal with the addition of acid (a digestion process). The sample may also be diluted or concentrated prior to analysis on a given instrument. Additional steps in an analysis method add additional opportunities for error. Since detection limits are defined in terms of error, this will naturally increase the measured detection limit. This "global" detection limit (including all the steps of the analysis method) is called the method detection limit (MDL). The practical way of determining the MDL is to analyze seven samples with concentrations near the expected limit of detection. The standard deviation is then determined. The one-sided Student's t-distribution value is determined and multiplied by the determined standard deviation. For seven samples (with six degrees of freedom), the t value for a 99% confidence level is 3.14. Rather than performing the complete analysis of seven identical samples, if the instrument detection limit is known, the MDL may be estimated by multiplying the instrument detection limit, or lower level of detection, by the dilution factor applied prior to analyzing the sample solution with the instrument. This estimation, however, ignores any uncertainty that arises from performing the sample preparation and will therefore probably underestimate the true MDL.
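The seven-replicate MDL calculation can be sketched as follows; the replicate results are invented for illustration, and 3.143 is the one-sided 99% Student's t value for six degrees of freedom:

```python
from statistics import stdev

# Seven replicate spiked samples carried through the full method
# (illustrative values, not real measurements)
replicates = [1.9, 2.1, 2.0, 2.2, 1.8, 2.1, 2.0]

# One-sided Student's t for 6 degrees of freedom at 99% confidence
t_99 = 3.143

# MDL = t * standard deviation of the replicate results
mdl = t_99 * stdev(replicates)
print(f"MDL estimate: {mdl:.3f}")
```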

==Limit of each model==

The issue of the limit of detection, or limit of quantification, is encountered in all scientific disciplines. This explains the variety of definitions and the diversity of jurisdiction-specific solutions developed to address it. In the simplest cases, such as nuclear and chemical measurements, definitions and approaches have probably received the clearest and simplest solutions. In biochemical tests and biological experiments, which depend on many more intricate factors, the situation involving false positive and false negative responses is more delicate to handle. In many other disciplines, such as geochemistry, seismology, astronomy, dendrochronology, climatology, and the life sciences in general, the problem is wider and deals with extracting a signal out of a background of noise. It involves complex statistical analysis procedures and therefore also depends on the models used,[5] the hypotheses, and the simplifications or approximations to be made to handle and manage uncertainties. When the data resolution is poor and different signals overlap, different deconvolution procedures are applied to extract parameters. The use of different phenomenological, mathematical, and statistical models may also complicate the exact mathematical definition of the limit of detection and how it is calculated. This explains why it is not easy to reach a general consensus, if any, on a precise mathematical definition of the limit of detection. One thing is clear, however: it always requires a sufficient number of data points (or accumulated data) and a rigorous statistical analysis to achieve statistical significance.

==Limit of quantification==

The limit of quantification (LoQ, or LOQ) is the lowest value of a signal (or concentration, activity, response...) that can be quantified with acceptable precision and accuracy.

The LoQ is the limit at which the difference between two distinct signals/values can be discerned with reasonable certainty, i.e., when the signal is statistically different from the background. The LoQ may differ drastically between laboratories, so another detection limit, referred to as the practical quantification limit (PQL), is commonly used.

==References==

  1. IUPAC, Compendium of Chemical Terminology, 2nd ed. (the "Gold Book") (1997). Online corrected version (2006–): "detection limit".
  2. "Guidelines for Data Acquisition and Data Quality Evaluation in Environmental Chemistry". Analytical Chemistry 52 (14): 2242–49. 1980. doi:10.1021/ac50064a004.
  3. Saah, A.J.; Hoover, D.R. (1998). "Sensitivity and specificity revisited: Significance of the terms in analytic and diagnostic language". Ann Dermatol Venereol 125 (4): 291–94. PMID 9747274. https://pubmed.ncbi.nlm.nih.gov/9747274.
  4. "Limit of blank, limit of detection and limit of quantitation". The Clinical Biochemist Reviews 29 (Suppl 1): S49–S52. August 2008. PMC 2556583. PMID 18852857. https://www.ncbi.nlm.nih.gov/pmc/articles/2556583.
  5. "R: 'Detection' limit for each model". search.r-project.org. https://search.r-project.org/CRAN/refmans/bioOED/html/calculate_limit.html.
  6. "Signal enhancement on gold nanoparticle-based lateral flow tests using cellulose nanofibers". Biosensors & Bioelectronics 141: 111407. September 2019. doi:10.1016/j.bios.2019.111407. PMID 31207571. http://ddd.uab.cat/record/218082.
  7. Long, Gary L.; Winefordner, J.D. "Limit of detection: A closer look at the IUPAC definition". Anal. Chem. 55 (7): 712A–724A. doi:10.1021/ac00258a724.

==Further reading==

  • Currie, L.A. (1968). "Limits for qualitative detection and quantitative determination: Application to radiochemistry". Analytical Chemistry 40 (3): 586–93. doi:10.1021/ac60259a007. ISSN 0003-2700.
