{{ombox
| type      = notice
| style    = width: 960px;
| text      = This is sublevel4 of my sandbox, where I play with features and test MediaWiki code. If you wish to leave a comment for me, please see [[User_talk:Shawndouglas|my discussion page]] instead.<p></p>
}}


==Sandbox begins below==
{{Infobox journal article
|name        =
|image        =
|alt          = <!-- Alternative text for images -->
|caption      =
|title_full  = CÆLIS: Software for assimilation, management, and processing data of an atmospheric measurement network
|journal      = ''Geoscientific Instrumentation, Methods and Data Systems''
|authors      = Fuertes, David; Toledano, Carlos; González, Ramiro; Berjón, Alberto; Torres, Benjamin; Cachorro, Victoria E.; de Frutos, Ángel M.
|affiliations = University of Valladolid, GRASP SAS
|contact      = Email: david at goa dot uva dot es
|editors      = Vazquez, Luis
|pub_year    = 2018
|vol_iss      = '''7'''(1)
|pages        = 67–81
|doi          = [https://doi.org/10.5194/gi-7-67-2018 10.5194/gi-7-67-2018]
|issn        = 2193-0864
|license      = [http://creativecommons.org/licenses/by/4.0/ Creative Commons Attribution 4.0 International]
|website      = [https://www.geosci-instrum-method-data-syst.net/7/67/2018/ https://www.geosci-instrum-method-data-syst.net/7/67/2018/]
|download    = [https://www.geosci-instrum-method-data-syst.net/7/67/2018/gi-7-67-2018.pdf https://www.geosci-instrum-method-data-syst.net/7/67/2018/gi-7-67-2018.pdf] (PDF)
}}
{{ombox
| type      = content
| style    = width: 500px;
| text      = This article should not be considered complete until this message box has been removed. This is a work in progress.
}}
==Abstract==
Given the importance of atmospheric aerosols, the number of instruments and measurement networks that focus on their characterization is growing. Many challenges derive from the standardization of protocols, monitoring of instrument status to evaluate network [[Data integrity|data quality]], and manipulation and distribution of large volumes of data (raw and processed). CÆLIS is a software system which aims to simplify the management of a network, providing the scientific community with a new tool for monitoring instruments, processing data in real time, and working with the data. Since 2008, CÆLIS has been successfully applied to the photometer calibration facility managed by the University of Valladolid, Spain, under the framework of the Aerosol Robotic Network (AERONET). Thanks to the use of advanced tools, this facility has been able to analyze a growing number of stations and data in real time, which greatly benefits network management and data quality control. This work describes the system architecture of CÆLIS and gives some examples of applications and data processing.
==Introduction==
Atmospheric aerosols are defined as solid or liquid particles suspended in the atmosphere. Many studies have shown the importance of aerosols, which play an important role in the global energy balance and in human activities. Among their direct impacts, aerosol particles produce radiative forcing in the atmosphere, provide nutrients for oceans, and affect human health. Aerosols generally produce a cooling effect, although an aerosol can also locally warm up the atmosphere depending on its type, height above the surface, and the timescale under consideration. Indirectly, they change the chemical composition of clouds and therefore their radiative properties, lifetime, and precipitation. Improving knowledge about the distribution and composition of aerosols is one of the emerging challenges highlighted by the last IPCC report<ref name="IPCCClimate13">{{cite book |title=Climate Change 2013 – The Physical Science Basis: Working Group I Contribution to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change |author=Intergovernmental Panel on Climate Change |publisher=Cambridge University Press |year=2014 |isbn=9781107415324 |doi=10.1017/CBO9781107415324}}</ref>, which identifies them as the largest source of uncertainty in estimates and interpretations of the Earth’s changing energy budget.
Ground-based and orbital instruments have been applied to monitor aerosol properties. Instruments can also be combined to exploit their synergies. For example, satellites have demonstrated the potential of high spatial coverage and resolution, and standardized ground-based networks have the benefit of high accuracy. A common exercise is to validate satellite data with ground-based networks.
One of these ground-based networks is the Aerosol Robotic Network (AERONET).<ref name="HolbenAERONET98">{{cite journal |title=AERONET—A Federated Instrument Network and Data Archive for Aerosol Characterization |journal=Remote Sensing of Environment |author=Holben, B.N.; Eck, T.F.; Slutsker, I. et al. |volume=66 |issue=1 |pages=1–16 |year=1998 |doi=10.1016/S0034-4257(98)00031-5}}</ref> Led by NASA (National Aeronautics and Space Administration; http://aeronet.gsfc.nasa.gov) and PHOTONS (PHOtométrie pour le Traitement Opérationnel de Normalisation Satellitaire; http://loaphotons.univ-lille1.fr/), AERONET is built as a federation of sub-networks with highly standardized procedures: instrument, calibration, processing, and data distribution. It was created in the 1990s with the objective of global monitoring of aerosol optical properties from the ground, as well as validating satellite retrievals of aerosols. The standard instrument used by the network is the Cimel CE318 photometer. It is an automatic filter radiometer with a two-axis robot and nine spectral channels covering the spectral range of 340 to 1640 nm. It collects direct solar and lunar measurements, sky radiances in the almucantar and principal planes, and hybrid geometrical configurations. Once the data are validated through instrument status and cloud screening, aerosol optical depth (AOD) can be obtained as a direct product at the nine wavelengths. Using inversion algorithms<ref name="DubovikAFlex00">{{cite journal |title=A flexible inversion algorithm for retrieval of aerosol optical properties from Sun and sky radiance measurements |journal=Journal of Geophysical Research: Atmospheres |author=Dubovik, O.; King, M.D. |volume=105 |issue=D16 |pages=20673–20696 |year=2000 |doi=10.1029/2000JD900282}}</ref><ref name="DubovikApp06">{{cite journal |title=Application of spheroid models to account for aerosol particle nonsphericity in remote sensing of desert dust |journal=Journal of Geophysical Research: Atmospheres |author=Dubovik, O.; Sinyuk, A.; Lapyonok, T. et al. |volume=111 |issue=D11 |year=2006 |doi=10.1029/2005JD006619}}</ref>, many other parameters can be retrieved, such as size distribution, complex refractive index, portion of spherical particles, and single-scattering albedo.
The Group of Atmospheric Optics at Valladolid University (GOA), Spain, is devoted to the analysis of atmospheric components by optical methods, mainly using remote sensing techniques such as spectral radiometry and lidar. One of the main tasks of the group has been the management of an AERONET calibration facility since 2006, which is now—together with the University of Lille, France, and the Spanish Meteorological Agency—part of the so-called Aerosol Remote Sensing central facility of the Aerosols, Clouds, and Trace gases Research Infrastructure (ACTRIS). Since 2016, ACTRIS has been included in the road map of the European Strategy Forum on Research Infrastructures (ESFRI). The GOA calibration facility is in charge of the calibration and site monitoring of about 50 AERONET sites in Europe, North Africa, and Central America.
AERONET standards call for annual instrument calibration, maintenance, and weekly checks on the observation data. The calibration process takes about two to three months and includes post-field calibration for sun, moon, and sky channels; maintenance of the instrumentation; and pre-field calibration for the next measurement period. In order to avoid gaps in the data sets during calibration periods, an instrument is frequently swapped out for a freshly calibrated one. The network management determines where each instrument is located, what its exact configuration and calibration coefficients are, and how many days remain until the next calibration is needed. During the regular deployment period, the instrument has to be regularly checked to guarantee the data quality. A routine maintenance protocol is performed by the site manager, but the network is ultimately responsible for data quality. The routine maintenance helps in reducing instrument failure and data errors, but even with the best daily protocol, instrumentation problems may occur. Data monitoring at the calibration center helps in early identification of instrument issues. However, such work cannot be accomplished manually in near-real time (NRT) for a large number of sites.
In this context, it was necessary for the calibration facility at GOA to implement an automatic mechanism (in addition to the standard mechanism of AERONET) to help manage the network and facilitate the weekly data checks needed to guarantee the quality of the data. The motivation of the CÆLIS system is to fulfill these two requirements. The system has to be designed to save all data, [[metadata]], and ancillary data (assimilated from other sources) in order to, on the one hand, support the management, maintenance, and calibration of the network, and on the other hand, process the raw data in NRT with different algorithms and provide network managers, site managers, and ultimately the scientific community with a very powerful and modern tool to analyze data produced at the observation sites. This work shows the fundamentals of the CÆLIS system—developed since 2008—both with respect to the scientific background and the information technology employed. There was no predecessor software at Valladolid, and these tasks were done manually before CÆLIS was developed. The other two AERONET calibration centers, at NASA and the University of Lille, have their own tools. Some ideas implemented in CÆLIS are inspired by these tools.
==General architecture==
CÆLIS has been designed to run on a server which, connected to the internet, allows for external communication via a web [[Interface (computing)|interface]]. The software contains a “daemon” (a background process that offers a service) which is responsible for selecting and launching tasks. These tasks, explained later, are responsible for downloading new data whenever available and processing them. Each task reads the required input information from the database and writes the output there. Some tasks use direct internet access to retrieve data, e.g., downloading ancillary data from an FTP server. All information downloaded and treated by CÆLIS tasks is stored in the database. This allows subsequent actions to retrieve all required information from the database quickly.
External users (organized by role with various privilege levels) can connect through the web interface to see which tasks are being executed and explore the results of finished tasks. All actions required by the system administrators can easily be done through the web interface. Network management is also performed through the web interface, which allows, for example, registering the installation of an instrument at a measurement station. The same information will be used by the system when data from the instruments reach the server, and CÆLIS will compare the received information (instrument number, parameters, location, dates, etc.) with reference registers stored in the database (installation periods, configuration parameters, etc.) to know if the instrument is working properly and using the correct configuration.
External systems, such as measurement stations, can also be connected to the server and submit data. Thanks to the web interface, this can be done using port 80 (standard HTTP), which avoids many problems derived from the security rules of the measurement stations and hosting institutions (some of which are in military areas).
The current system manages 120 users and 80 stations. Each station can send thousands of aerosol observations every year, and the system is constantly growing. A benchmark has been applied to confirm that the current architecture can support a network 100 times bigger, so the database can grow safely in the future.
As shown in Fig. 1, CÆLIS is composed of a database, a processing module, and a web interface. These modules can be deployed independently, even on different computers. The users and the stations interact with the system through the web interface. The database stores the raw data and metadata, as well as the retrieved products, ancillary data, user information, etc. The NRT processing module is composed of the system daemon and a set of processing routines that extract information from the database, calculate products, and store them in the database. The web interface is the platform designed to manage the system, to manage the network, and to provide visual access to the data and metadata, with tables, plots, searching capability, etc. Each of these elements will be explained in detail in the next sections.
[[File:Fig1 Fuertes GIMDS2018 7-1.png|700px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="700px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 1.''' Diagram of CÆLIS architecture. Arrows indicate where the action is initiated (data flow is always bidirectional).</blockquote>
|-
|}
|}
==Database model==
Databases are one of the main concepts developed in the 1980s in computer science. Many different approaches in terms of technology and data models have been developed with varying success. There are many types of databases, classified based on their characteristics. A database management system (DBMS) is software that provides an interface to a database, offering the user advanced features such as concurrency management and a query language. The decision about what kind of database and which specific DBMS software to select is one of the main design decisions, because all further development will be impacted by it.
Relational databases are a traditional and well-known model, and they have been successfully applied to many different fields. With relational databases, the information is organized in tables or relations which represent entity types.<ref name="ChenTheEnt76">{{cite journal |title=The entity-relationship model—toward a unified view of data |journal=ACM Transactions on Database Systems |author=Chen, P.P.-S. |volume=1 |issue=1 |pages=9–36 |year=1976 |doi=10.1145/320434.320440}}</ref> A good database modeler is able to identify those entity types that are relevant, together with the information that describes them. The tables or relations are composed of columns, representing the attributes that describe the entity type, and rows, representing individual entities that are identified by a unique key (one or more attributes that cannot be repeated in different rows). The tables are linked, creating a relational model. The keystone of a database is good design, which needs to take into account the information targeted for modeling as well as the way in which the data are going to be accessed (to optimize performance). Complex models with many groups of entities need to be planned in advance by creating an entity–relationship diagram. This diagram then guides the final implementation of the database, which can be a direct translation of the diagram, with some implementation decisions balancing data redundancy and performance.
The main elements of the entity–relationship diagram of the CÆLIS database are shown in Fig. 2. The central entity is the photometer, which produces raw data. The photometer, with a given hardware configuration and calibration coefficients, is installed at one site of the network. The ancillary data for the site (e.g., meteorological data, ozone column, and surface reflectance) need to be stored. Finally, the measurement stations are supported by institutions, which can also own other instruments.
[[File:Fig2 Fuertes GIMDS2018 7-1.png|700px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="700px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 2.''' Entity–relationship diagram for CÆLIS (extract of the main elements). Entities have been divided into three logic blocks.</blockquote>
|-
|}
|}
Each of these elements in many cases represents a group of entities. For instance, calibration coefficients include the extraterrestrial signal for the different solar spectral channels, radiance calibration coefficients for sky channels, coefficients for temperature correction of the signals, the instrument field of view, etc. Another example is the hardware, which includes the different parts (sensor head, robot, collimator, control box, etc.), the spectral filters with the corresponding filter response, and others.
The lower part of the diagram is closely related to the network management, with an inventory of all hardware parts identified with their serial numbers and related to the institution that owns them. The upper part is related to the raw data production, and its organization is optimized for data extraction (to create products) and is consistent with the physical meaning and relevance of the quantities. The installations are manually introduced by the network managers so that any data file submitted to the system from a measurement station can be validated.
Other tables contain ancillary information that is needed to process data, such as the list of stations (including coordinates), global climatologies for certain atmospheric components (ozone, nitrogen dioxide, etc.), solar and lunar extraterrestrial irradiance spectra, and spectral absorption coefficients for several species (ozone, NO<sub>2</sub>, water vapor, etc.).
Many different DBMSs can be used to implement such a model: OracleDB, SQLite, [[PostgreSQL]], etc. CÆLIS is based on a [[MySQL]] database. MySQL software is widely used by many different communities. The software is therefore very robust, complete, stable, and well documented, and it can be run on many different architectures.
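As a purely illustrative sketch (the table and column names here are hypothetical and simplified, not the actual CÆLIS schema), two of the entities in Fig. 2 could be declared in MySQL as follows. Note the multi-field primary key on the installation table, a design choice that reappears in the discussion of the Propel library in the "Web tool" section.

<pre>
-- Hypothetical, simplified declarations of two Fig. 2 entities in MySQL.
-- These names are illustrative only; the real CÆLIS schema is not reproduced here.
CREATE TABLE photometer (
  ph          INT UNSIGNED NOT NULL,   -- instrument (serial) number
  model       VARCHAR(32)  NOT NULL,
  institution VARCHAR(64)  NOT NULL,   -- owning institution
  PRIMARY KEY (ph)
);

CREATE TABLE installation (
  ph          INT UNSIGNED NOT NULL,   -- instrument deployed at the site
  station     VARCHAR(64)  NOT NULL,   -- measurement site
  date_start  DATE         NOT NULL,
  date_end    DATE         NULL,       -- NULL while the deployment is ongoing
  PRIMARY KEY (ph, date_start),        -- multi-field primary key
  FOREIGN KEY (ph) REFERENCES photometer (ph)
);
</pre>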
The entity–relationship diagram for CÆLIS, illustrated using the model defined by Chen<ref name="ChenTheEnt76" />, is shown in Fig. 2. This diagram shows the fundamental part of the database, called layer 0. On top of that, direct products—obtained with the combination of raw data, calibration coefficients, and ancillary data—are stored. This represents “layer 1” products, physical quantities with their corresponding units and estimated uncertainties (derived from the calibration uncertainties). In our case, these products are basically aerosol optical depth, water vapor content, sky radiances, and degree of linear polarization of the sky light. On top of layer 1, there are more sophisticated products, like those derived from inversion algorithms, as well as any flags or “alarms” that are produced to help in NRT data quality control.
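For orientation, direct-sun AOD retrievals of this kind are generally based on the Beer–Bouguer–Lambert law. In schematic form (generic notation, not necessarily the notation used by CÆLIS):

:<math>\tau_a(\lambda) = \frac{1}{m}\,\ln\!\left(\frac{V_0(\lambda)}{V(\lambda)\,d^2}\right) - \tau_R(\lambda) - \tau_g(\lambda)</math>

where <math>V</math> is the measured signal, <math>V_0</math> the extraterrestrial calibration signal, <math>d</math> the Earth–Sun distance in astronomical units, <math>m</math> the optical air mass, and <math>\tau_R</math> and <math>\tau_g</math> the Rayleigh scattering and gaseous absorption contributions obtained from ancillary data. This is only a schematic form; it assumes, for simplicity, a single air mass factor for all components.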
Layer 2 products use and combine previous layer quantities to retrieve other parameters, but no longer go down to the raw data. For instance, the inversion codes by Dubovik and King<ref name="DubovikAFlex00" /> and Nakajima ''et al.''<ref name="NakajimaUse96">{{cite journal |title=Use of sky brightness measurements from ground for remote sensing of particulate polydispersions |journal=Applied Optics |author=Nakajima, T.; Tonna, G.; Rao, R. et al. |volume=35 |issue=15 |pages=2672-86 |year=1996 |doi=10.1364/AO.35.002672 |pmid=21085415}}</ref> use spectral aerosol optical depth and sky radiances to retrieve aerosol particle size distribution, refractive indices, single-scattering albedo, etc. More advanced products that combine photometer data with other aerosol data (e.g., lidar) also belong to this group, named “layer 2” products. A clear example is the GRASP algorithm<ref name="DubovikGRASP14">{{cite web |url=http://spie.org/newsroom/5558-grasp-a-versatile-algorithm-for-characterizing-the-atmosphere?ArticleID=x109993 |title=GRASP: A versatile algorithm for characterizing the atmosphere |work=SPIE Newsroom |author=Dubovik, O.; Lapyonok, T.; Litvinov, P. et al. |publisher=SPIE |date=19 September 2014 |doi=10.1117/2.1201408.005558}}</ref> (http://www.grasp-open.com/), which is able to digest data from different sensors (satellite and ground-based, active or passive) to provide a wide set of aerosol and surface parameters. The system architecture as described here is shown in Fig. 3.
[[File:Fig3 Fuertes GIMDS2018 7-1.png|495px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="495px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 3.''' Different logic data layers. Each layer is based on the information of the previous layer.</blockquote>
|-
|}
|}
==Processing chain and near-real-time module==
The CÆLIS system provides many different data products. To provide each product, some input data have to be processed in a specific way; this is what we call a “task.” The job is divided into a set of simple tasks. The system works as a state machine: one task cannot start until the previous one is finished, regardless of whether the second task depends on the previous one. When many tasks work sequentially to achieve a common objective, we create a chain of tasks. The daemon running on the server is responsible for coordinating the different tasks, as will be explained in the next section.
The main processing chain in CÆLIS is the set of the tasks that are performed once new photometer data are uploaded into the system, as shown in Fig. 4.
[[File:Fig4 Fuertes GIMDS2018 7-1.png|475px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="475px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 4.''' Different processing chains and their relationships. An action triggers a task and then the actions bubble up.</blockquote>
|-
|}
|}
The pre-filter checks that the file uploaded to the server is a valid data file pertaining to the AERONET instruments managed by CÆLIS. If this is true, the “filter” checks that the basic information (instrument number, coordinates and dates, configuration parameters) is in accordance with the stored information about instrument installations. If any of these parameters do not correspond to their expected values, the data file is put into quarantine and the network managers receive an email notification (“send notification”). If all parameters are correct, the measurements are inserted into the database and the data file is forwarded to any desired destination: our data file repository, an AERONET server at NASA, etc. With the raw measurements inserted in the database, the processing of layer 1 products is triggered: the aerosol optical depth, water vapor, and radiances in several geometries (almucantar, principal plane, etc.). With all raw data and layer 1 products, a set of flags concerning data quality control is produced by the alarms task. These flags are produced in near-real time, as soon as new data are submitted to CÆLIS from a particular site. Since the automated [[quality control]] (QC) analysis needs all available information, the alarms task is always the last one in the processing chain. The QC flags in near-real time are a very important element in the network management. More details are given in the later section "Examples of application."
Any new implementation, for instance a new layer 1 product, needs to be inserted in the processing chain, taking into account its dependency on any other elements in the chain. The last step in a certain task is to trigger the next one in the chain.
Whenever new data are found (photometer, ancillary, other) or new information is inserted by the managers (new calibration, installation, etc.), a processing chain is triggered. The management of all chains in CÆLIS is carried out by the daemon, which is explained in detail in the next section. This kind of task organization is highly modular, so new elements in CÆLIS—either data or different instruments—can be added by creating new chains that can be connected or not to the existing ones.
The near-real-time processing module (see Fig. 1) is composed of a set of programs and libraries that are related to all the above mentioned tasks. These are programmed mainly in C for fast computation with large data sets and interoperability with other technologies, allowing for the use of other languages in the future. A Git repository is used to facilitate version control and deployment of the software.
==Daemon==
The daemon is responsible for organizing the tasks and deciding which process has to be run in each moment. It has to address different challenges, including:
# running scheduled tasks according to their priority;
# knowing which task must trigger other tasks;
# maintaining the sequence; and
# optimizing the server processing capability and running less important tasks when the CPU is idle.
The tasks are stacked in the system. Figure 5 is a representation of the task stack, where green tasks are actions that can be executed right now and red tasks are disabled until their “activation time” arrives.
[[File:Fig5 Fuertes GIMDS2018 7-1.png|700px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="700px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 5.''' Example of CÆLIS task stack. Ordering is related to the next task to be executed. Green indicates tasks that can be already selected to be executed. Red indicates tasks to be executed in the future (when “valid” time arrives). The example is captured at 14:00 UTC on 10 May 2017.</blockquote>
|-
|}
|}
Each task is described by the following information:
# Group: classification of the task
# Name: name of the task
# Object: if it exists, defines the target to which the task will be applied (for example, one file, one particular instrument, one site)
# Date range: if defined, the task can be applied to a specific time range
# Date/time of activation: designates when the task can be run (note that some tasks can be defined to be executed in the future)
# Priority: defines the importance of the task
Frequently, several tasks can be activated at the same moment. In these cases, the priority flag indicates to the system the order in which the tasks should be run. At the same time, the processing chains (shown in Fig. 4) use this value to indicate the order of the tasks to the daemon. When a task is executed, if it is part of a chain, it will insert the next actions into the stack (sorted by priority). This is the procedure that keeps the system alive and always working.
At any moment, the stack contains the tasks that can be executed right away, as well as the tasks that are scheduled to be run in the future. This is the method used by the system to repeat tasks: if a task is intended to be repeated every 15 minutes, once it is executed, the system will remove it from the stack but will add it again with the activation time set to 15 minutes later.
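Purely as an illustration of the mechanics described above (CÆLIS does not necessarily store its stack this way, and the names below are invented), a task stack held as a MySQL table could be queried and maintained roughly as follows:

<pre>
-- Invented, simplified sketch of a task stack held as a MySQL table;
-- the fields mirror the task description listed above.
CREATE TABLE task_stack (
  task_group  VARCHAR(32) NOT NULL,   -- classification of the task
  name        VARCHAR(64) NOT NULL,   -- name of the task
  object      VARCHAR(64) NULL,       -- optional target (file, instrument, site)
  date_start  DATETIME    NULL,       -- optional date range the task applies to
  date_end    DATETIME    NULL,
  valid_from  DATETIME    NOT NULL,   -- activation time: earliest moment the task may run
  priority    INT         NOT NULL,   -- lower value = more important
  PRIMARY KEY (task_group, name, valid_from)
);

-- Pick the next runnable task: activation time already reached, highest priority first.
SELECT *
FROM task_stack
WHERE valid_from <= NOW()
ORDER BY priority, valid_from
LIMIT 1;

-- Re-schedule a repeating task: after execution it is removed and re-inserted
-- with the activation time shifted 15 minutes into the future.
INSERT INTO task_stack (task_group, name, valid_from, priority)
VALUES ('data', 'check-new-data', NOW() + INTERVAL 15 MINUTE, 10);
</pre>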
After a task is executed, the information on the execution is saved into a log. This allows the system administrator to study the behavior of the system, to know what has been executed, to foresee the future use of the system, and to tune the parameters of CÆLIS to balance between NRT actions and the load of the system. Figure 6 shows the log of actions and their statistics. Thanks to the log and system statistics, the system administrators can know how much time a specific task takes every day and how many times each task is executed. This information is crucial for system administrators and developers because they can analyze which tasks take more time and why (in the cases when a defined task is too slow or is called many times, etc.) and create plans to optimize the system.
[[File:Fig6 Fuertes GIMDS2018 7-1.png|930px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="930px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 6.''' Plots represent CÆLIS load from different measurements: minutes of CPU per day, number of tasks executed per day and number of photometer raw data files sent to AERONET and received from stations. Those plots are constructed based on log information. The table at the bottom shows an example of the log. Delay indicates the difference between actual start and valid time therefore indicating NRT capabilities.</blockquote>
|-
|}
|}
In a regular situation, the system works automatically. For instance, when the daemon starts, the common operations are introduced in the stack of tasks. One of these common operations will look for new data and metadata with a certain frequency (e.g., once per hour). Then, if new data are found by the corresponding task, it will add new tasks to process those data, triggering the processing chain. If there are no new data, the task will re-add itself to the stack to look for new data some minutes later.
The system administrators can add tasks manually and they can change the priority of the current tasks in the stack. One of the main manual tasks that the administrator can add is the “stop” action. The stop action has a duration of a few seconds and, once it finishes, it re-enters itself in the stack. This process continues until the administrator erases the task manually from the stack. Depending on the priority assigned to the stop action, the system can be completely blocked or it can allow some tasks with high priority to be undertaken. Another main action is to shut down the system. If the system administrator wants to shut down the daemon, this task should be introduced. This guarantees that the system is turned off when idle and no task is interrupted abruptly.
The system is also prepared for a sudden shutdown (for example, a power outage). Given that the system only removes tasks from the stack when they are finished, once the server is turned back on, the first task to be executed will be the one that could not be completed. The fact that all these scenarios are taken into account by the stack of tasks makes CÆLIS a robust system that is easy to maintain.
The system executes maintenance tasks regularly. For example, a daily backup is performed. This task is scheduled every night thanks to the activation time information. The maintenance tasks can cover many different activities that need to be done regularly in the system. Other examples of maintenance tasks include the optimization of the database and regular rebooting.
The current implementation of the daemon is developed using bash scripts, which allows it to run tasks written in any language. It is planned to improve the current implementation by using Python and to introduce parallelism into the task chain. This has not yet been undertaken because of the relatively low load of the system; processing in sequential mode is still sufficient to provide data in NRT. When more sophisticated algorithms (such as inversion retrieval algorithms) are run, a new implementation of the daemon will be needed. Alternatively, tasks can be launched in a server farm, allowing the system to only organize tasks while keeping its load very low. The tasks are currently implemented mainly in C because of its high performance, but any language that compiles on the server is allowed.
==Web tool==
CÆLIS offers users a web interface (http://www.caelis.uva.es). The web interface is a high-level view of the data model, and thus it shows information in real time, as soon as it is processed. The web system is secured with private access, only for registered users. During the registration process, a “role” is assigned to each user. The roles identify groups of users with different permissions in the system, for example, regular users (site managers or researchers) who can access their data, or a system administrator who can handle the system's task stack or reboot it.
The web interface provides high-level access to the database. It can extract and relate different data and show them all together as an online real-time report. CÆLIS has implemented many different use cases which are sufficient for common user actions. The system offers different tools depending on the role of the user. For example, a site manager can check the performance of one instrument, a network manager can modify the location of an instrument, and a system administrator can check the tasks that the NRT module is executing. The web interface allows the user to explore the database showing tables and customized plots. Some of these use cases are described below in the example section.
The web interface also allows one to query information, as well as insert new information. This is especially interesting because it constitutes the second way to insert data into the system: data inserted by the users (data can also be registered by the tasks controlled by the daemon; see the previous section). In the case of users, since they work via the web interface, the actions can be controlled in two senses: (1) the user has permission to introduce the information, and (2) the information introduced is validated. Moreover, the fact that manual information is registered by means of the web interface allows the system to launch “derived tasks” for a specific action. For example, when a new measurement site is created, the system can add the action of “insert climatology data for the new coordinates.” Everything is triggered automatically and controlled by the logic implemented in the web interface.
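Continuing the invented task stack sketch from the "Daemon" section (illustrative names only, not the CÆLIS implementation), such a derived task could simply be enqueued by the web interface logic when the new site is registered:

<pre>
-- Illustrative only: the web interface enqueues a derived task after a new site is
-- created, reusing the invented task_stack table from the "Daemon" section sketch.
INSERT INTO task_stack (task_group, name, object, valid_from, priority)
VALUES ('assimilation', 'insert-climatology', 'NewSiteName', NOW(), 20);
</pre>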
The web interface has been developed using PHP through the Symfony framework, Bootstrap as the CSS framework, and jQuery as the JavaScript framework (used for asynchronous actions and to make the interface more dynamic). The choice for PDO (PHP Data Objects) is Propel. Each technology selected for development was carefully evaluated:
* '''PHP''' is a widely used language for web development. It shows very good performance, and popular websites have been developed using it. It also has a large community behind it, which offers helpful support. Its ecosystem (libraries implemented to be used with PHP) is one of the best for web computing and includes libraries for graphical representation, mathematics, etc.
* '''Symfony framework''' enables quick development, and it is also designed for heavy development. It allows developers to stay focused on the main issues and reduces the complexity of common tasks: user management, database access, etc. Symfony was selected over other alternatives because it was very popular at the time web interface development began and its community was very active. Moreover, it is easy to use, contains hundreds of libraries, is powerful and flexible, and it performs well.
* '''JavaScript''' is used for asynchronous connections with the server in order to offer a very dynamic interface. JavaScript is undisputedly the best for this purpose.
* '''jQuery''' is used as a JavaScript framework. There are alternatives, but jQuery is very well integrated with Symfony, it is widely used, and it fulfills all system requirements.
* '''Propel''' was selected as the PDO library because it allows one to have primary keys with multiple fields. The CÆLIS database has been deeply optimized, and the use of multi-field primary keys greatly improves performance in comparison with an auto-increment ID (the common alternative). At the time of the decision, only Propel supported this kind of key.
The web interface has been structured and designed to grow, so that the management of other measurement networks or scientific projects can be added. Those projects can share the core of the developed code. This allows one to start projects from existing code instead of starting from scratch, making it quick to add new features. For example, one of these common utilities is the plotting tool. CÆLIS has a very powerful and flexible tool to plot the data. The tool is implemented on the server side using PHP. The benefit of this approach is that, when plotting very large amounts of data, the plotting is still quick since the data are not transferred to the client (the web browser). Instead, the plot is created on the server, and only some tens of kilobytes are transferred to the users. This solution is optimal for treating large amounts of information. A disadvantage of this approach is that the result is less dynamic than an implementation on the client side.
The web interface also implements web services for machine-to-machine communication. These web services allow one to perform common operations such as data transfer from the measurement sites to the server. A great advantage of this approach is that even well-secured external machines can connect to an HTTP server. In some cases, e.g., instruments installed on military bases, special permission is needed to set up internal proxies and allow access to the system, but it is widely accepted that the HTTP protocol on port 80 can be used everywhere. Alternatively, CÆLIS can offer other data transfer options, such as FTP or email, but the most common is to use the web service.
==Examples of application==
In order to illustrate the capabilities of the system, we will now show a set of examples, focusing on the typical needs of the different users: site managers, network managers (calibration center), and researchers.
===Site manager use case===
Site managers are interested in knowing the status of their instruments and retrieving general information about the instruments and their sites. This example will illustrate how a site manager can access all this information.
CÆLIS offers access to all metadata related to each instrument: calibration coefficients, temperature corrections, configuration parameters, filters, etc. The metadata are in general different for each deployment period.
Figure 7 shows the description of the photometer #783 of the AERONET network (registered in CÆLIS and calibrated by GOA). There are three blocks of information: (1) metadata such as calibration coefficients or configuration, (2) network management information such as deployment periods (sites, dates), and (3) administrative information such as hardware inventory of all parts of the instrument.
[[File:Fig7 Fuertes GIMDS2018 7-1.png|620px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="620px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 7.''' Screenshot (cut and abbreviated) of the photometer description for AERONET photometer #783.</blockquote>
|-
|}
|}
The continuity of the data sets is an important issue in AERONET. In order to avoid (or minimize) data gaps, often a calibrated instrument is sent to a site to replace an instrument that needs to be calibrated. Hence a number of instruments are rotating inside the network, from site to site. This fact makes it difficult to monitor which instrument is where. CÆLIS offers the information about each site, with the list of instruments and deployment dates in that particular site. This is all easily accessible to site managers.
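Reusing the illustrative installation table sketched in the "Database model" section (hypothetical names, not the real schema), such a per-site deployment list could be retrieved with a query of roughly this form:

<pre>
-- Illustrative: list the instruments and deployment periods for one site,
-- using the hypothetical installation table from the database-model sketch.
SELECT ph, date_start, date_end
FROM installation
WHERE station = 'Madrid'
ORDER BY date_start;
</pre>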
The illustration in Fig. 8 shows the information for the measurement site located in Madrid, Spain. Some general information is shown at the top of the page, followed by the list of instruments and measurement periods. This information is linked with the instrument information shown in the previous figure, so it is very easy to browse all the information.
[[File:Fig8 Fuertes GIMDS2018 7-1.png|700px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="700px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 8.''' Screenshot of the Madrid site description.</blockquote>
|-
|}
|}
===Network manager use case===
One of the most important applications of CÆLIS is real-time data monitoring. This information is used by the network managers (as well as site managers), and it provides useful information about instrument performance. The biggest benefit of this powerful tool is that it allows identifying problems in the instrumentation as soon as they appear, raising a flag automatically. The network managers at the calibration center can send a warning to the site managers. If site managers can solve problems quickly, this is of great benefit to data quality and continuity, and thus network quality. The calibration center is continuously monitoring this information in order to warn and assist the site managers if a problem is not quickly solved.
When the system receives new data files from a measurement site, it processes the data, generating new products. From raw measurements it calculates the aerosol optical depth, sky radiances, and other products. The last product in the processing chain triggered by the arrival of new data is the alarms. This product studies the new data, metadata, and derived products in order to identify malfunctions in the instrumentation.
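Purely as an illustration of the kind of rule such an alarms task might encode (the table and column names below are invented, not the CÆLIS schema), a check for repeated robot errors could look like this:

<pre>
-- Invented names, for illustration only: flag days at a site with more than
-- five robot ("status b") errors, the kind of anomaly shown later in Fig. 10.
SELECT DATE(measured_at) AS day, COUNT(*) AS robot_errors
FROM instrument_status
WHERE station = 'Valladolid'
  AND status_b <> 'OK'
GROUP BY DATE(measured_at)
HAVING COUNT(*) > 5;
</pre>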
Figure 9 shows a screenshot of the CÆLIS web interface, where we can see the status of a specific site over six days (this is a calibration site, so it has more than one instrument). This page can be customized thanks to filters (sites, instruments, dates, etc.). Finally, useful information can be obtained by simply clicking on specific places. For example, when the photometer number is clicked, instrument status information is shown in graphs (battery voltage, internal temperature, etc.), and when a specific day is selected, the user can explore all information (raw data, products) received and processed for that particular day.
[[File:Fig9 Fuertes GIMDS2018 7-1.png|700px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="700px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 9.''' View of real-time flags (alarms) for a specific day at Valladolid site. A zoomed-in view shows how the signal of a good day appears and how a problem is automatically identified. Specifically, photometer #618 on 10 April 2017 has an almucantar where the sun is not in the center (usually from cable tangling).</blockquote>
|-
|}
|}
Every day, calibration centers need to solve instrument issues and multiple questions, and this is only possible thanks to deep knowledge of the instrumentation. CÆLIS helps with routine problems and provides very useful information about the data contained in the database. In the case of new issues, the system offers a data viewer which allows one to customize the data to be displayed in a very flexible and powerful way.
Figure 10 shows the setup of a specific case in which data from different sources are shown in the same plot in order to help the network manager understand the problem. We can select one or multiple variables (all available raw data and data products) from one or multiple instruments, and display them for a particular date/time range, with full flexibility in plot configuration (colors, axes, etc.). Specifically, the example shows battery voltage and robot errors. The plot clearly indicates that the power supply stopped working; therefore, the battery is losing charge and the robot cannot operate normally, returning robot errors that coincide with the decreasing battery voltage trend.
[[File:Fig10 Fuertes GIMDS2018 7-1.png|800px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="800px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 10.''' An example of data viewer where an instrumental error can be identified. The power supply is disconnected and the internal battery decreases. When energy it is not enough to move the robot, “status b” (robot) errors appear. Each orange vertical line represents a status error in an specific time.</blockquote>
|-
|}
|}
===Research applications===
The following example will show a specific project of the research group with the aim of studying the AERONET database. As part of this project, a relational database which organizes AERONET data was created. By querying this database, the user can answer general questions about AERONET sites, including the following: since when has an AERONET site been active? How much data (and at what quality) does a site have? This project reuses the core of the system (user access, plot tools, etc.) as much as possible, letting the developers create a new tool quickly. CÆLIS has been used as a “framework” for data analysis. The effort required to develop this system is far less than starting from scratch. The features needed in this development are a tool to assimilate the new data and the specific views that show the results. Additionally, CÆLIS can reuse the database added here in other projects.
Figure 11 shows the automated aerosol data analysis of an AERONET site (Palencia, Spain). In it, we can see the data coverage (for level 1.0, 1.5 and 2.0 of the AERONET database), monthly statistics of aerosol optical depth and Ångström exponent, frequency histograms, and an AOD vs. AE scatter plot providing basic aerosol type classification. These plots provide a general overview of the site characteristics in terms of data coverage, aerosol statistics and type classification, which can be used as a first approach in order to select a site for some particular study. Then, based on this general information, the researcher can ask other questions that can be solved by interrogating the database directly.
[[File:Fig11 Fuertes GIMDS2018 7-1.png|698px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="698px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 11.''' Statistical analysis of aerosol optical depth (AOD) and Ångström exponent (AE) derived from AERONET for Palencia site in 2016: '''(a)''' data coverage for level 1.0, 1.5 and 2.0 of the AERONET database; '''(b)''' aerosol type classification based on the AOD vs. AE scatter plot. '''(c)''' AOD (440 nm) monthly statistics using box plot; '''(d)''' frequency histogram for AOD (440 nm); '''(e)''' AE monthly statistics using box plot; '''(f)''' frequency histogram for AE.</blockquote>
|-
|}
|}
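The climatological statistics used as thresholds in the example below (the mean and standard deviation of AOD at 440 nm) could themselves be obtained with a query of roughly the following form. This sketch reuses the <tt>cml-aod</tt> tables from the queries below and is illustrative only, since the full schema is not reproduced here.

<pre>
-- Illustrative sketch: climatological mean and standard deviation of cloud-free
-- AOD at 440 nm for the Palencia site, using the same simplified tables as below.
SELECT AVG(c.`aod`) AS mean_aod, STDDEV(c.`aod`) AS sd_aod
FROM `cml-aod` a
LEFT JOIN `cml-aod-channel` c
  ON a.`ph` = c.`ph` AND a.`date` = c.`date`
WHERE a.`station` = 'Palencia'
  AND c.`wln` = 0.44
  AND a.`cloud-screening-v2` = 'cloud-free';
</pre>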
In order to illustrate how it can be done, the next part of the example will show how to identify special aerosol events at the Palencia AERONET site. For this purpose, we need to explore the CÆLIS database. The starting point will be the following questions: how many days of high turbidity occur at the site? How many of these can be classified as desert dust events? To answer these questions, we will use the previous (overall) climatology and we will make some assumptions. First, we are going to assume that “a high-turbidity event” takes place when the AOD average is larger than the climatological mean plus two standard deviations (SD). Based on that assertion, we can write an SQL query that is launched against the system database and will review every single value and return the result. To create this SQL query, it is important to have accurate knowledge of the database model in order to obtain the expected results within a reasonable time. For our particular case, we will use the <tt>cml-aod</tt> table, which contains the general information about each AOD measurement, and the <tt>cml-aod-channel</tt> table, which contains the AOD information for each specific wavelength channel. First, we check how many days with AOD measurements we have for the Palencia AERONET site:
<tt>SELECT COUNT(a.`date`) FROM `cml-aod` a
WHERE station = 'Palencia'
GROUP BY DATE(a.`date`)

Result = 2730</tt>
Now, we can filter this result by checking on how many days the AOD (440 nm wavelength) has at least 10 observation points greater than 0.31 (climatological average = 0.13 and SD = 0.09):
<tt>SELECT DATE(a.`date`)
FROM `cml-aod` a LEFT JOIN `cml-aod-channel` c
ON a.`ph` = c.`ph` AND a.`date` = c.`date`
WHERE station = 'Palencia'
AND c.`wln` = 0.44 AND c.`aod` > 0.31
AND `cloud-screening-v2` = 'cloud-free'
GROUP BY DATE(a.`date`)
HAVING COUNT(*) > 10
ORDER BY DATE(a.`date`)

Result 285 days: 12 February 2004, 14 February 2004, ..., 14 March 2017</tt>
Finally, we make another assumption: a desert dust event must have a low Ångström exponent value, lower than the average minus two times the standard deviation (climatological average = 1.29, SD = 0.37):
<tt>SELECT DATE(a.`date`)
FROM `cml-aod` a LEFT JOIN `cml-aod-channel` c
ON a.`ph` = c.`ph` AND a.`date` = c.`date`
WHERE station = 'Palencia'
AND c.`wln` = 0.44 AND c.`aod` > 0.31
AND a.`alpha-440-870` < 0.55
GROUP BY DATE(a.`date`)
HAVING COUNT(*) > 10
ORDER BY DATE(a.`date`)</tt>

*Discussion and practical use of [[artificial intelligence]] (AI) in the [[laboratory]] is, perhaps to the surprise of some, not a recent phenomenon. In the mid-1980s, researchers were developing computerized AI systems able "to develop automatic decision rules for follow-up analysis of &#91;[[clinical laboratory]]&#93; tests depending on prior information, thus avoiding the delays of traditional sequential testing and the costs of unnecessary parallel testing."<ref>{{Cite journal |last=Berger-Hershkowitz |first=H. |last2=Neuhauser |first2=D. |date=1987 |title=Artificial intelligence in the clinical laboratory |url=https://www.ccjm.org/content/54/3/165 |journal=Cleveland Clinic Journal of Medicine |volume=54 |issue=3 |pages=165–166 |doi=10.3949/ccjm.54.3.165 |issn=0891-1150 |pmid=3301059}}</ref> In fact, discussion of AI in general was ongoing even in the mid-1950s.<ref name="MinskyHeuristic56">{{cite book |url=https://books.google.com/books?hl=en&lr=&id=fvWNo6_IZGUC&oi=fnd&pg=PA1 |title=Heuristic Aspects of the Artificial Intelligence Problem |author=Minsky, M. |publisher=Ed Services Technical Information Agency |date=17 December 1956 |accessdate=16 February 2023}}</ref><ref>{{Cite journal |last=Minsky |first=Marvin |date=1961-01 |title=Steps toward Artificial Intelligence |url=http://ieeexplore.ieee.org/document/4066245/ |journal=Proceedings of the IRE |volume=49 |issue=1 |pages=8–30 |doi=10.1109/JRPROC.1961.287775 |issn=0096-8390}}</ref>
*Hiring demand for laboratorians with AI experience (2015–18) has historically been higher in non-healthcare industries, such as manufacturing, mining, and agriculture, shedding light on how AI adoption in the clinical setting may be lacking. According to the Brookings Institute, "Even for the relatively-skilled job postings in hospitals, which includes doctors, nurses, medical technicians, research lab workers, and managers, only approximately 1 in 1,250 job postings required AI skills." They add: "AI adoption may be slow because it is not yet useful, or because it may not end up being as useful as we hope. While our view is that AI has great potential in health care, it is still an open question."<ref name=":11">{{Cite web |last=Goldfarb, A.; Teodoridis, F. |date=09 March 2022 |title=Why is AI adoption in health care lagging? |work=Series: The Economics and Regulation of Artificial Intelligence and Emerging Technologies |url=https://www.brookings.edu/research/why-is-ai-adoption-in-health-care-lagging/ |publisher=Brookings Institute |accessdate=17 February 2023}}</ref>
*Today, AI is being practically used in not only clinical diagnostic laboratories but also clinical research labs, life science labs, research and development (R&D) labs, and more. Practical uses of AI can be found in:
:clinical research labs<ref name=":0">{{Cite journal |last=Damiani |first=A. |last2=Masciocchi |first2=C. |last3=Lenkowicz |first3=J. |last4=Capocchiano |first4=N. D. |last5=Boldrini |first5=L. |last6=Tagliaferri |first6=L. |last7=Cesario |first7=A. |last8=Sergi |first8=P. |last9=Marchetti |first9=A. |last10=Luraschi |first10=A. |last11=Patarnello |first11=S. |date=2021-12-07 |title=Building an Artificial Intelligence Laboratory Based on Real World Data: The Experience of Gemelli Generator |url=https://www.frontiersin.org/articles/10.3389/fcomp.2021.768266/full |journal=Frontiers in Computer Science |volume=3 |pages=768266 |doi=10.3389/fcomp.2021.768266 |issn=2624-9898}}</ref>
:hospitals<ref name=":0" /><ref name=":1">{{Cite journal |last=University of California, San Francisco |last2=Adler-Milstein |first2=Julia |last3=Aggarwal |first3=Nakul |last4=University of Wisconsin-Madison |last5=Ahmed |first5=Mahnoor |last6=National Academy of Medicine |last7=Castner |first7=Jessica |last8=Castner Incorporated |last9=Evans |first9=Barbara J. |last10=University of Florida |last11=Gonzalez |first11=Andrew A. |date=2022-09-29 |title=Meeting the Moment: Addressing Barriers and Facilitating Clinical Adoption of Artificial Intelligence in Medical Diagnosis |url=https://nam.edu/meeting-the-moment-addressing-barriers-and-facilitating-clinical-adoption-of-artificial-intelligence-in-medical-diagnosis |journal=NAM Perspectives |volume=22 |issue=9 |doi=10.31478/202209c |pmc=PMC9875857 |pmid=36713769}}</ref>
:medical diagnostics labs<ref name=":1" /><ref name=":12">{{Cite web |last=Government Accountability Office (GAO); National Academy of Medicine (NAM) |date=September 2022 |title=Artificial Intelligence in Health Care: Benefits and Challenges of Machine Learning Technologies for Medical Diagnostics |url=https://www.gao.gov/assets/gao-22-104629.pdf |format=PDF |publisher=Government Accountability Office |accessdate=16 February 2023}}</ref><ref name=":13">{{Cite journal |last=Wen |first=Xiaoxia |last2=Leng |first2=Ping |last3=Wang |first3=Jiasi |last4=Yang |first4=Guishu |last5=Zu |first5=Ruiling |last6=Jia |first6=Xiaojiong |last7=Zhang |first7=Kaijiong |last8=Mengesha |first8=Birga Anteneh |last9=Huang |first9=Jian |last10=Wang |first10=Dongsheng |last11=Luo |first11=Huaichao |date=2022-09-24 |title=Clinlabomics: leveraging clinical laboratory data by data mining strategies |url=https://bmcbioinformatics.biomedcentral.com/articles/10.1186/s12859-022-04926-1 |journal=BMC Bioinformatics |language=en |volume=23 |issue=1 |pages=387 |doi=10.1186/s12859-022-04926-1 |issn=1471-2105 |pmc=PMC9509545 |pmid=36153474}}</ref><ref name=":7">{{Cite journal |last=DeYoung |first=B. |last2=Morales |first2=M. |last3=Giglio |first3=S. |date=2022-08-04 |title=Microbiology 2.0–A “behind the scenes” consideration for artificial intelligence applications for interpretive culture plate reading in routine diagnostic laboratories |url=https://www.frontiersin.org/articles/10.3389/fmicb.2022.976068/full |journal=Frontiers in Microbiology |volume=13 |pages=976068 |doi=10.3389/fmicb.2022.976068 |issn=1664-302X |pmc=PMC9386241 |pmid=35992715}}</ref><ref name=":5">{{Cite web |last=Schut, M. |date=01 December 2022 |title=Get better with bytes |url=https://www.amsterdamumc.org/en/research/news/get-better-with-bytes.htm |publisher=Amsterdam UMC |accessdate=16 February 2023}}</ref><ref name="AlbanoCal19">{{cite web |url=https://physicianslab.com/calculations-to-diagnosis-the-artificial-intelligence-shift-thats-already-happening/ |title=Calculations to Diagnosis: The Artificial Intelligence Shift That’s Already Happening |author=Albano, V.; Morris, C.; Kent, T. |work=Physicians Lab |date=06 December 2019 |accessdate=16 February 2023}}</ref>
:chromatography labs<ref name="AlbanoCal19" />
:biology and life science labs<ref name=":6">{{Cite journal |last=de Ridder |first=Dick |date=2019-01 |title=Artificial intelligence in the lab: ask not what your computer can do for you |url=https://onlinelibrary.wiley.com/doi/10.1111/1751-7915.13317 |journal=Microbial Biotechnology |language=en |volume=12 |issue=1 |pages=38–40 |doi=10.1111/1751-7915.13317 |pmc=PMC6302702 |pmid=30246499}}</ref>
:medical imaging centers<ref name="Brandao-de-ResendeAIWeb22">{{cite web |url=https://siim.org/page/22w_clinical_adoption_of_ai |title=AI Webinar: Clinical Adoption of AI Across Image Producing Specialties |author=Brandao-de-Resende, C.; Bui, M.; Daneshjou, R. et al. |publisher=Society for Imaging Informatics in Medicine |date=11 October 2022}}</ref>
:ophthalmology clinics<ref>{{Cite journal |last=He |first=Mingguang |last2=Li |first2=Zhixi |last3=Liu |first3=Chi |last4=Shi |first4=Danli |last5=Tan |first5=Zachary |date=2020-07 |title=Deployment of Artificial Intelligence in Real-World Practice: Opportunity and Challenge |url=https://journals.lww.com/10.1097/APO.0000000000000301 |journal=Asia-Pacific Journal of Ophthalmology |language=en |volume=9 |issue=4 |pages=299–307 |doi=10.1097/APO.0000000000000301 |issn=2162-0989}}</ref>
:reproduction clinics<ref name=":9">{{Cite journal |last=Trolice |first=Mark P. |last2=Curchoe |first2=Carol |last3=Quaas |first3=Alexander M |date=2021-07 |title=Artificial intelligence—the future is now |url=https://link.springer.com/10.1007/s10815-021-02272-4 |journal=Journal of Assisted Reproduction and Genetics |language=en |volume=38 |issue=7 |pages=1607–1612 |doi=10.1007/s10815-021-02272-4 |issn=1058-0468 |pmc=PMC8260235 |pmid=34231110}}</ref><ref name="ESHREArti22">{{cite web |url=https://www.focusonreproduction.eu/article/ESHRE-News-22AI |title=Annual Meeting 2022: Artificial intelligence in embryology and ART |author=European Society of Human Reproduction and Embryology |work=Focus on Reproduction |date=06 July 2022 |accessdate=16 February 2023}}</ref><ref name="HinckleyApply21">{{cite web |url=https://rscbayarea.com/blog/applying-ai-for-better-ivf-success |title=Applying AI (Artificial Intelligence) in the Lab for Better IVF Success |author=Hinckley, M. |work=Reproductive Science Center Blog |publisher=Reproductive Science Center of the Bay Area |date=17 March 2021 |accessdate=16 February 2023}}</ref>
:digital pathology labs<ref name="YousifArt21">{{cite web |url=https://clinlabint.com/artificial-intelligence-is-the-key-driver-for-digital-pathology-adoption/ |title=Artificial intelligence is the key driver for digital pathology adoption |author=Yousif, M.; McClintock, D.S.; Yao, K. |work=Clinical Laboratory Int |publisher=PanGlobal Media |date=2021 |accessdate=16 February 2023}}</ref>
:material testing labs<ref name=":2">{{Cite journal |last=MacLeod |first=B. P. |last2=Parlane |first2=F. G. L. |last3=Morrissey |first3=T. D. |last4=Häse |first4=F. |last5=Roch |first5=L. M. |last6=Dettelbach |first6=K. E. |last7=Moreira |first7=R. |last8=Yunker |first8=L. P. E. |last9=Rooney |first9=M. B. |last10=Deeth |first10=J. R. |last11=Lai |first11=V. |date=2020-05-15 |title=Self-driving laboratory for accelerated discovery of thin-film materials |url=https://www.science.org/doi/10.1126/sciadv.aaz8867 |journal=Science Advances |language=en |volume=6 |issue=20 |pages=eaaz8867 |doi=10.1126/sciadv.aaz8867 |issn=2375-2548 |pmc=PMC7220369 |pmid=32426501}}</ref><ref name=":3">{{Cite journal |last=Chibani |first=Siwar |last2=Coudert |first2=François-Xavier |date=2020-08-01 |title=Machine learning approaches for the prediction of materials properties |url=http://aip.scitation.org/doi/10.1063/5.0018384 |journal=APL Materials |language=en |volume=8 |issue=8 |pages=080701 |doi=10.1063/5.0018384 |issn=2166-532X}}</ref><ref name="MullinTheLab21">{{Cite journal |last=Mullin, R. |date=28 March 2021 |title=The lab of the future is now |url=http://cen.acs.org/business/informatics/lab-future-ai-automated-synthesis/99/i11 |journal=Chemical & Engineering News |volume=99 |issue=11 |archiveurl=https://web.archive.org/web/20220506192926/http://cen.acs.org/business/informatics/lab-future-ai-automated-synthesis/99/i11 |archivedate=06 May 2022 |accessdate=16 February 2023}}</ref>
:chemical experimentation and molecular discovery labs<ref name="MullinTheLab21" /><ref name=":4">{{Cite journal |last=Burger |first=Benjamin |last2=Maffettone |first2=Phillip M. |last3=Gusev |first3=Vladimir V. |last4=Aitchison |first4=Catherine M. |last5=Bai |first5=Yang |last6=Wang |first6=Xiaoyan |last7=Li |first7=Xiaobo |last8=Alston |first8=Ben M. |last9=Li |first9=Buyi |last10=Clowes |first10=Rob |last11=Rankin |first11=Nicola |date=2020-07-09 |title=A mobile robotic chemist |url=https://www.nature.com/articles/s41586-020-2442-2.epdf?sharing_token=HOkIS6P5VIAo2_l3nRELmdRgN0jAjWel9jnR3ZoTv0Nw4yZPDO1jBpP52iNWHbb8TakOkK906_UHcWPTvNxCmzSMpAYlNAZfh29cFr7WwODI2U6eWv38Yq2K8odHCi-qwHcEDP18OjAmH-0KgsVgL5CpoEaQTCvbmhXDSyoGs6tIMe1nuABTeP58z6Ck3uULcdCtVQ66X244FsI7uH8GnA%3D%3D&tracking_referrer=cen.acs.org |journal=Nature |language=en |volume=583 |issue=7815 |pages=237–241 |doi=10.1038/s41586-020-2442-2 |issn=0028-0836}}</ref><ref name="LemonickExplore20">{{Cite journal |last=Lemonick, S. |date=06 April 2020 |title=Exploring chemical space: Can AI take us where no human has gone before? |url=https://cen.acs.org/physical-chemistry/computational-chemistry/Exploring-chemical-space-AI-take/98/i13 |journal=Chemical & Engineering News |volume=98 |issue=13 |archiveurl=https://web.archive.org/web/20200729004137/https://cen.acs.org/physical-chemistry/computational-chemistry/Exploring-chemical-space-AI-take/98/i13 |archivedate=29 July 2020 |accessdate=16 February 2023}}</ref>
:quantum physics labs<ref name="DoctrowArti19">{{cite web |url=https://www.pnas.org/post/podcast/artificial-intelligence-laboratory |title=Artificial intelligence in the laboratory |author=Doctrow, B. |work=PNAS Science Sessions |date=16 December 2019 |accessdate=16 February 2023}}</ref>


*What's going on in these labs?


:'''Materials science''': The creation of "a modular robotic platform driven by a model-based optimization algorithm capable of autonomously optimizing the optical and electronic properties of thin-film materials by modifying the film composition and processing conditions ..."<ref name=":2" />
:'''Materials science''': "Most of the applications of [machine learning (ML)] in chemical and materials sciences, as we have said, feature supervised learning algorithms. The goal there is to supplement or replace traditional modeling methods, at the quantum chemical or classical level, in order to predict the properties of molecules or materials directly from their structure or their chemical composition ... Our research group was applying the same idea on a narrower range of materials, trying to confirm that for a given chemical composition, geometrical descriptors of a material’s structure could lead to accurate predictions of its mechanical features."<ref name=":3" />
:'''Life science''': "In biological experiments, we generally cannot as easily declare victory, but we can use the systems biology approach of cycling between experimentation and modelling to see which sequences, when tested, are most likely to improve the model. In artificial intelligence, this is called active learning, and it has some similarity to the way in which we as humans learn as infants: we get some help from parents and teachers, but mainly model the world around us by exploring it and interacting with it. Ideally then, we would recreate such an environment for our machine learning algorithms in the laboratory, where we start with an initial ‘infant’ model of a certain regulatory system or protein function and let the computer decide what sequence designs to try out – a deep learning version of the ‘robot scientist’. Microbes are ideal organisms for such an approach, given the ease and speed with which they can be grown and genetically manipulated. Combined with laboratory automation, many microbial experiments can (soon) be performed with minimal human intervention, ranging from strain construction and screening, such as operated by Amyris, Gingko, Transcriptic, etc., to full-genome engineering or even the design of microbial ecologies."<ref name=":6" />
:'''Digital pathology''': "The collaboration combines two AI solutions, VistaPath’s Sentinel, the world’s first automated tissue grossing platform, and Gestalt’s AI Requisition Engine (AIRE), a leading-edge AI algorithm for accessioning, to raise the bar in AI-driven pathology digitization. Designed to make tissue grossing faster and more accurate, VistaPath’s Sentinel uses a high-quality video system to assess specimens and create a gross report 93% faster than human technicians with 43% more accuracy. It not only improves on quality by continuously monitoring the cassette, container, and tissue to reduce mislabeling and specimen mix-up, but also increases traceability by retaining original images for downstream review."<ref>{{Cite web |last=VistaPath |date=28 July 2022 |title=VistaPath Launches New Collaboration with Gestalt Diagnostics to Further Accelerate Pathology Digitization |work=PR Newswire |url=https://www.prnewswire.com/news-releases/vistapath-launches-new-collaboration-with-gestalt-diagnostics-to-further-accelerate-pathology-digitization-301594718.html |publisher=Cision US Inc |accessdate=17 February 2023}}</ref>
:'''Chemistry and molecular science''': "The benefits of combining automated experimentation with a layer of artificial intelligence (AI) have been demonstrated for flow reactors, photovoltaic films, organic synthesis, perovskites and in formulation problems. However, so far no approaches have integrated mobile robotics with AI for chemical experiments. Here, we built Bayesian optimization into a mobile robotic workflow to conduct photocatalysis experiments within a ten-dimensional space."<ref name=":4" />
:'''Chemistry and immunology''': "Chemistry and immunology laboratories are particularly well-suited to leverage machine learning because they generate large, highly structured data sets, Schulz and others wrote in a separate review paper. Labor-intensive processes used for interpretation and quality control of electrophoresis traces and mass spectra could benefit from automation as the technology improves, they said. Clinical chemistry laboratories also generate digital images—such as urine sediment analysis—that may be highly conducive to semiautomated analyses, given advances in computer vision, the paper noted."<ref name=":8">{{Cite web |last=Blum, K. |date=01 January 2023 |title=A Status Report on AI in Laboratory Medicine |work=Clinical Laboratory News |url=https://www.aacc.org/cln/articles/2023/janfeb/a-status-report-on-ai-in-laboratory-medicine |publisher=American Association for Clinical Chemistry |accessdate=17 February 2023}}</ref>
:'''Clinical research''': "... retrospective analysis of existing patient data for descriptive and clustering purposes [and] automation of knowledge extraction, ranging from text mining, patient selection for trials, to generation of new research hypotheses ..."<ref name=":0" />
:'''Clinical research''': "AI ... offers a further layer to the laboratory system by analyzing all experimental data collected by experiment devices, whether it be a sensor or a collaborative robot. From data collected, AI is able to produce hypotheses and predict which combination of materials or temperature is desired for the experiment. In short, this system will allow scientists to be aided by a highly intelligent system which is constantly monitoring and analyzing the experimental output. In this way, AI will help an experiment from its inception to conclusion."<ref>{{Cite web |last=Chubb, P. |date=03 November 2020 |title=How disruptive technology is helping laboratories combat COVID-19 |url=https://datafloq.com/read/disruptive-technologies-lab-help-us-prepare-future-pandemics/ |publisher=Datafloq |accessdate=16 February 2023}}</ref>
:'''Clinical research/medical diagnostics''': "Artificial intelligence (AI) in the laboratory is primarily used to make sense of big data, the almost impossibly large sets of data that biologists and pharmaceutical R&D teams are accustomed to working with. AI algorithms can parse large amounts of data in a short amount of time and turn that data into visualizations that viewers can easily understand. In certain data-intensive fields, such as genomic testing and virus research, AI algorithms are the best way to sort through the data and do some of the pattern recognition work."<ref>{{Cite web |last=Stewart, B. |date=18 March 2021 |title=Using LIMS for Data Visualization |work=CSols Insights |url=https://www.csolsinc.com/insights/published-articles/using-lims-for-data-visualization/ |publisher=CSols, Inc |accessdate=17 February 2023}}</ref>
:'''Medical diagnostics''': Development and implementation of [[Clinical decision support system|clinical decision support systems]]<ref name=":0" /><ref name=":1" />
:'''Medical diagnostics''': "Finally, in the laboratory, AI reduces the number of unnecessary blood samples when diagnosing infection. Instead of the 'gold standard blood sample' that takes 24-72 hours, the algorithm can predict the outcome of the blood sample with almost 80% accuracy based on demographics, vital signs, medications, and laboratory and radiology results. These are all examples of how Artificial Intelligence can be used to test better and faster with information that already exists. This saves time and costs."<ref name=":5" />
:'''Medical diagnostics''': "Chang sees two overarching classes of AI models: those that tackle internal challenges in the lab, such as how to deliver more accurate results to clinicians; and those that seek to identify cohorts of patients and care processes to close quality gaps in health delivery systems. The lab, however, 'isn’t truly an island,' said Michelle Stoffel, MD, PhD, associate chief medical information officer for laboratory medicine and pathology at M Health Fairview and the University of Minnesota in Minneapolis. 'When other healthcare professionals are working with electronic health records or other applications, there could be AI-driven tools, or algorithms used by an institution’s systems that may draw on laboratory data.'"<ref name=":8" />
:'''Medical diagnostics''': AI is used for the formulation of reference ranges, improvement of quality control, and automated interpretation of results. "Continuous monitoring of specimen acceptability, collection and transport can result in the prompt identification and correction of problems, leading to improved patient care and a reduction in unnecessary redraws and delays in reporting results."<ref name=":13" />
:'''Reproduction science''': "The field of AI is the marriage of humans and computers while reproductive medicine combines clinical medicine and the scientific laboratory of embryology. The application of AI has the potential to disconnect healthcare professionals from patients through algorithms, automated communication, and clinical imaging. However, in the embryology laboratory, AI, with its focus on gametes and embryos, can avoid the same risk of distancing from the patient. Areas of application of AI in the laboratory would be to enhance and automate embryo ranking through analysis of images, the ultimate goal being to predict successful implantation. Might such a trend obviate the need for embryo morphological assessment, time-lapse imaging and preimplantation genetic testing for aneuploidy (PGT-A), including mosaicism. Additionally, AI could assist with automation through analysis of testicular sperm samples searching for viable gametes, embryo grading uniformity."<ref name=":9" />
:'''Chromatography-heavy sciences''': " A great example of this is AI in the Liquid Chromatography Mass Spectrometry (LC-MS) field. LC-MS is a great tool used to measure various compounds in the human body, including everything from hormone levels to trace metals. One of the ways AI has already integrated with LC-MS is how it cuts down on the rate limiting steps of LC-MS, which more often than not are sample prep and LC separations. One system that Physicians Lab has made use of is parallel processing using SCIEX MPX 2.0 High Throughput System. This system can couple parallel runs with one LCMS instrument, resulting in twice the speed with no loss to accuracy. It can do this by staggering two runs either using the same method, or different methods entirely. What really makes this system great is its ability to automatically detect carryover and inject solvent blanks to clean the instrument. The system will then continue its analyzing, while automatically reinjecting samples that may be affected by the carryover. It will also flag high concentration without user input, allowing for easy detection of possibly faulty samples. This allows it to operate without users from startup to shut down. Some of the other ways that it can be used to increase efficiency are by using integrated network features to work on anything from streamlining management to increased throughput."<ref name="AlbanoCal19" />
:'''Most any lab''': "Predictive analytics, for example, is one tool that the Pistoia Alliance is using to better understand laboratory instruments and how they might fail over time... With the right data management strategies and careful consideration of metadata, how to best store data so that it can be used in future AI and ML workflows is essential to the pursuit of AI in the laboratory. Utilizing technologies such as LIMS and ELN enables lab users to catalogue data, providing context and instrument parameters that can then be fed into AI or ML systems. Without the correct data or with mismatched data types, AI and ML will not be possible, or at the very least, could provide undue bias trying to compare data from disparate sources."<ref>{{Cite web |date=29 January 2021 |title=Data Analytics |work=Scientific Computing World - Building a Smart Laboratory 2020 |url=https://www.scientific-computing.com/feature/data-analytics-0 |publisher=Europa Science Ltd |accessdate=17 February 2023}}</ref>
:'''Most any lab''': "When the actionable items are automatically created by Optima, the 'engine' starts working. An extremely sophisticated algorithm is able to assign the tasks to the resources, both laboratory personnel and instruments, according to the system configuration. Optima, thanks to a large amount of time dedicated to research the best way to automate this critical process, is able to automate most of the lab resource scheduling."<ref>{{Cite web |last=Optima Team |date=15 December 2020 |title=The concept of machine learning applied to lab resources scheduling |work=Optima Blog |url=https://www.optima.life/blog/the-concept-of-machine-learning-applied-to-lab-resources-scheduling/ |publisher=Optima PLC Tracking Tools S.L |accessdate=17 February 2023}}</ref>


*A number of challenges exist in the realm of effectively and securely implementing AI in the laboratory. These include:


:Ethical and privacy challenges<ref name=":0" /><ref name=":8" /><ref name=":10" />
:Algorithmic limitations<ref name=":11" />
:Data access limitations, including "where to get it, how to share it, and how to know when you have enough to train a machine-learning system that will produce good results"<ref name=":11" /><ref name=":8" /><ref name=":14">{{Cite web |last=Sherwood, L. |date=10 February 2022 |title=SLAS 2022: Barriers remain to AI adoption in life sciences |work=LabPulse.com Showcasts |url=https://www.labpulse.com/showcasts/slas/2022/article/15300130/slas-2022-barriers-remain-to-ai-adoption-in-life-sciences |publisher=Science and Medicine Group |accessdate=17 February 2023}}</ref><ref name=":15">{{Cite journal |last=Bellini |first=Claudia |last2=Padoan |first2=Andrea |last3=Carobene |first3=Anna |last4=Guerranti |first4=Roberto |date=2022-11-25 |title=A survey on Artificial Intelligence and Big Data utilisation in Italian clinical laboratories |url=https://www.degruyter.com/document/doi/10.1515/cclm-2022-0680/html |journal=Clinical Chemistry and Laboratory Medicine (CCLM) |language=en |volume=60 |issue=12 |pages=2017–2026 |doi=10.1515/cclm-2022-0680 |issn=1434-6621}}</ref>
:Data integration and transformation issues<ref name=":0" /><ref name=":15" />
:Regulatory barriers<ref name=":11" /><ref name=":12" />
:Misaligned incentives<ref name=":11" />
:Lack of knowledgeable/skilled talent<ref name=":0" /><ref name=":8" /><ref name=":14" /><ref name=":15" />
:Cost of skilled talent and infrastructure for maintaining and updating AI systems<ref name=":8" />
:Legacy systems running outdated technologies<ref name=":14" />
:Lack of IT systems or specialized software systems<ref name=":15" />
:Lack of standardized, best practices-based methods of validating algorithms<ref name=":8" />
:Failure to demonstrate real-world performance<ref name=":12" />
:Failure to meet the needs of the professionals using it<ref name=":12" />


*Given those challenges, some considerations should be made when implementing AI-based components in the laboratory. Examples include:


 
:'''Clinical diagnostics''': "From an industry and regulatory perspective, however, only the intended uses supported from the media manufacturer can be supported from AI applications, unless otherwise justified and substantive evidence is presented for additional claims support. This means strict adherence to specimen type and incubation conditions. Considering that the media was initially developed for human assessment using the well-trained microbiologist eye, and not an advanced imaging system with or without AI, this paradigm should shift to allow advancements in technology to challenge the status-quo of decreasing media read-times especially, as decreased read-times assist with laboratory turnaround times and thus patient management. Perhaps with an increasing body of evidence to support any proposed indications for use, either regulatory positions should be challenged, or manufacturers of media and industry AI-development specialists should work together to advance the field with new indications for use.
:While the use of AI in the laboratory setting can be highly beneficial there are still some issues to be addressed. The first being phenotypically distinct single organism polymorphisms that may be interpreted by AI as separate organisms, as may also be the case for a human assessment, as well as small colony variant categorization. As detailed earlier, the broader the inputs, the greater the generalization of the model, and the higher the likelihood of algorithm accuracy. In that respect, understanding and planning around these design constraints is critical for ultimate deployment of algorithms. Additionally, expecting an AI system to correctly categorize “contamination” is a difficult task as often this again seemingly innocuous decision is dependent on years of experience and understanding the specimen type and the full clinical picture with detailed clinical histories. In this respect, a fully integrated AI-LIS system where all data is available may assist, but it is currently not possible to gather this granular detail needed to make this assessment reliable."<ref name=":7" />
 
:'''Clinical diagnostics and pathology''': "Well, if I’ve learned anything in my research into this topic, it’s that AI implementation needs to be a two-way street. First, any company who is active in this space must reach out to pathologists and laboratory medicine professionals to understand their daily workflows, needs, and pain points in as much detail as possible. Second, pathologists, laboratory medicine professionals, and educators must all play their important part – willingly offering their time and expertise when it is sought or proactively getting involved. And finally, it’s clear that there is an imbalanced focus on certain issues – with privacy, respect, and sustainability falling by the wayside."<ref name=":10">{{Cite web |last=Lee, G.F. |date=10 October 2022 |title=The Robot May See You Now: It’s time to stop and think about the ethics of artificial intelligence |work=The Pathologist |url=https://thepathologist.com/outside-the-lab/the-robot-may-see-you-now |accessdate=17 February 2023}}</ref>
:'''Healthcare''': "While we are encouraged by the promise shown by AI in healthcare, and more broadly welcome the use of digital technologies in improving clinical outcomes and health system productivity, we also recognize that caution must be exercised when introducing any new healthcare technology. Working with colleagues across the NHS Transformation Directorate, as well as the wider AI community, we have been developing a framework to evaluate AI-enabled solutions in the health and care policy context. The aim of the framework is several-fold but is, at its core, a tool with which to highlight to healthcare commissioners, end users, patients and members of the public the considerations to be mindful when introducing AI to healthcare settings."<ref>{{Cite journal |last=Chada |first=Bharadwaj V |last2=Summers |first2=Leanne |date=2022-10-10 |title=AI in the NHS: a framework for adoption |url=https://www.rcpjournals.org/lookup/doi/10.7861/fhj.2022-0068 |journal=Future Healthcare Journal |language=en |pages=fhj.2022–0068 |doi=10.7861/fhj.2022-0068 |issn=2514-6645 |pmc=PMC9761451 |pmid=36561823}}</ref>
 
:'''Most any lab''': A code of AI ethics should address objectivity, privacy, transparency, accountability, and sustainability in any AI implementation.<ref name=":10" />
:'''Most any lab''': "Another approach is to implement an AI program alongside a manual process, assessing its performance along the way, as a means to ease into using the program. 'I think one of the most impactful things that laboratorians can do today is to help make sure that the lab data that they’re generating is as robust as possible, because these AI tools rely on new training sets, and their performance is really only going to be as good as the training data sets they’re given,' Stoffel said."<ref name=":8" />
 
COUNT(∗) > 10 ORDER BY DATE (a. ‘date‘)

Result = 65</tt>


These are very strong conditions, which identify the most intense dust event days over the site. Finally, we will show for one selected year (2016), the number of dust event days per month as identified by our assumptions:


<tt>SELECT MONTH (‘date‘), COUNT(∗) FROM
(SELECT DATE (a. ‘date‘) AS ‘date‘
FROM ‘cml-aod‘ a LEFT JOIN
‘cml-aod-channel‘ c ON
a. ‘ph‘ = c. ‘ph‘
AND a. ‘date‘ = c. ‘date‘ WHERE
station = 'Palencia' AND YEAR (a. ‘date‘) = 2016
AND c. ‘wln‘ = 0.44 AND c. ‘aod‘ > 0.31
AND a. ‘alpha-440-870‘ < 0.55
AND cloud-screening-v2 = 'cloud-free'
GROUP BY DATE (a. ‘date‘) HAVING
COUNT(∗) > 10
ORDER BY DATE (‘date‘)) dd GROUP BY
MONTH (‘date‘)</tt>

The result is shown in Fig. 12, where we can see the two peaks of occurrence of Saharan dust episodes over Spain, i.e., February–March (early spring) and May–September, basically the summer months.
 
 
[[File:Fig12 Fuertes GIMDS2018 7-1.png|521px]]
{{clear}}
{|
| STYLE="vertical-align:top;"|
{| border="0" cellpadding="5" cellspacing="0" width="521px"
|-
  | style="background-color:white; padding-left:10px; padding-right:10px;"| <blockquote>'''Figure 12.''' Number of strong Saharan dust event days for each month of the year 2016 over the Palencia AERONET site (Spain) derived from a query to the CÆLIS relational database.</blockquote>
|-
|}
|}
 
This example shows the flexibility and power of a relational database for data analysis. Using SQL queries, very complex, customized questions can be asked, and the data can be easily extracted from the database.
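
For illustration, a similar query can summarize the aerosol load itself rather than count event days. The following sketch is not taken from CÆLIS; it only reuses the table and column names that appear in the query above, and would return the monthly mean AOD at 440 nm over Palencia for 2016:

<pre>
-- Illustrative sketch only: monthly mean AOD at 440 nm over Palencia in 2016,
-- reusing the table and column names from the query above.
SELECT MONTH(a.`date`) AS `month`,
       AVG(c.`aod`)    AS `mean_aod_440`,
       COUNT(*)        AS `n_obs`
FROM `cml-aod` a LEFT JOIN `cml-aod-channel` c
  ON a.`ph` = c.`ph` AND a.`date` = c.`date`
WHERE station = 'Palencia'
  AND YEAR(a.`date`) = 2016
  AND c.`wln` = 0.44
  AND `cloud-screening-v2` = 'cloud-free'
GROUP BY MONTH(a.`date`)
ORDER BY `month`;
</pre>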
 
==Summary and conclusions==
Atmospheric aerosol particles are one of the most important contributors to the climate forcing uncertainty. They are currently extensively measured from ground and space, with very different techniques. It is therefore important to develop tools using modern technologies to monitor (quality control), process, analyze, and combine those data.
 
This paper has described the CÆLIS software tool, which has been developed for the management of the photometers that are calibrated and monitored by the calibration facility at the University of Valladolid, Spain, as part of AERONET. CÆLIS is intended to provide management for the photometer network, archive the data, and allow data analysis and research. Prior to the development of CÆLIS, these tasks were done manually. The use of this kind of advanced system has reduced the number of human errors and allowed more in-depth and exhaustive analysis. Thanks to CÆLIS, we are currently receiving and analyzing data from 80 sites, with a quality control system that provides flagging of the data in real time. This greatly benefits network management and allows immediate response to instrument malfunction.
 
The core of the CÆLIS system is a relational database. It stores user information (with associated privileges), data, metadata, etc. Around this database, different modules offer different services: a web interface to explore the database and an NRT module to perform processing. All of this software can be reused to extend the system, for instance to other instrument types.
 
The construction of the database requires a balance between normalization and redundancy. The current system has three different layers of data. Layer 0 contains the raw data and the network management information, layer 1 contains direct products, and layer 2 contains advanced derived products that can be calculated. Each layer is based on the information of the previous one. A keystone of the system is to have a correct model of the first layer, i.e., normalized and without redundancy. This helps maintain the congruence of the system. Based on these data, other products can be developed. Depending on the use of these products, some redundancy may be necessary. For instance, pre-calculated products can allow for fast visualization in the web interface, which would be too slow if done on the fly.
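
As a minimal sketch of this idea (the table name, column types, and the exact aggregation below are illustrative assumptions rather than the actual CÆLIS schema), a derived table of pre-calculated daily means could be filled from the lower layer and then read directly by the web interface:

<pre>
-- Illustrative sketch of a pre-calculated (derived) product table; names and types are assumed.
CREATE TABLE `daily-aod-mean` (
  `station` VARCHAR(64)  NOT NULL,
  `date`    DATE         NOT NULL,
  `wln`     DECIMAL(4,2) NOT NULL,
  `aod`     DOUBLE,
  PRIMARY KEY (`station`, `date`, `wln`)
);

-- An automated task would re-run this aggregation whenever new data arrive,
-- keeping the redundant layer congruent with the layer below it.
INSERT INTO `daily-aod-mean` (`station`, `date`, `wln`, `aod`)
SELECT station, DATE(a.`date`), c.`wln`, AVG(c.`aod`)
FROM `cml-aod` a LEFT JOIN `cml-aod-channel` c
  ON a.`ph` = c.`ph` AND a.`date` = c.`date`
WHERE c.`wln` IS NOT NULL
GROUP BY station, DATE(a.`date`), c.`wln`;
</pre>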
 
The existence of redundancy implies that automated tasks are needed to maintain congruence. This is done by the NRT module, which organizes the actions into separate tasks. The NRT module is always running, thanks to a daemon based on a stack of tasks organized by priority, which decides at every moment what must be done.
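
The paper does not detail the internal structure of this stack, but the idea can be sketched with a hypothetical task table in the same relational database (all names below are assumptions made for illustration): tasks are queued with a priority, and on each iteration the daemon picks the most urgent pending one.

<pre>
-- Hypothetical task queue for the NRT daemon; table and column names are illustrative.
CREATE TABLE `nrt-task` (
  `id`       INT AUTO_INCREMENT PRIMARY KEY,
  `type`     VARCHAR(64) NOT NULL,   -- e.g., 'ingest-raw', 'layer1-products', 'layer2-products'
  `priority` INT         NOT NULL,   -- lower value = more urgent
  `status`   ENUM('pending','running','done') NOT NULL DEFAULT 'pending',
  `created`  DATETIME    NOT NULL
);

-- On each iteration, the daemon would select the next task to execute:
SELECT `id`, `type`
FROM `nrt-task`
WHERE `status` = 'pending'
ORDER BY `priority` ASC, `created` ASC
LIMIT 1;
</pre>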
 
Users (site managers, calibration centers, researchers, etc.) can use the web interface for quick access and visualization of data. The relational database is shown to be an appropriate tool for research because it allows one to perform queries and extract data in a fast and very flexible way.
 
==Acknowledgements==
The authors gratefully acknowledge the effort of NASA to maintain the AERONET program. This research has received funding from the European Union’s Horizon 2020 Research and Innovation Programme under grant agreement no. 654109 (ACTRIS-2). The funding by MINECO (CTM2015-66742-R) and Junta de Castilla y León (VA100P17) is also acknowledged. We thank all the users of CÆLIS for their feedback, especially Emilio Cuevas, Carmen Guirado, and Roberto Román.
 
===Data availability===
All the research data used in this paper are publicly available on the AERONET website (https://aeronet.gsfc.nasa.gov/).
 
===Competing interests===  
The authors declare that they have no conflict of interest.


==References==
{{Reflist|colwidth=30em}}
==Notes==
This presentation is faithful to the original, with only a few minor changes to presentation. Grammar was cleaned up for smoother reading. In some cases important information was missing from the references, and that information was added. The original article lists references alphabetically, but this version—by design—lists them in order of appearance. 
<!--Place all category tags here-->
[[Category:LIMSwiki journal articles (added in 2018)‎]]
[[Category:LIMSwiki journal articles (all)‎]]
[[Category:LIMSwiki journal articles on data management and sharing]]
[[Category:LIMSwiki journal articles on environmental informatics]]
[[Category:LIMSwiki journal articles on software]]
