Revision as of 23:17, 4 February 2021
Title: Considerations in the Automation of Laboratory Procedures
Author for citation: Joe Liscouski
License for content: Creative Commons Attribution 4.0 International
Publication date: January 2021
This article should be considered a work in progress and incomplete until this notice is removed.
Introduction
Scientists have been dealing with the issue of laboratory automation for decades, and during that time the meaning of those words has expanded from the basics of connecting an instrument to a computer, to the possibility of a fully integrated informatics infrastructure beginning with sample preparation and continuing on to the laboratory information management system (LIMS), electronic laboratory notebook (ELN), and beyond. Throughout this evolution there has been one underlying concern: how do we go about doing this?
The answer to that question has changed from a focus on hardware and programming, to today’s need for a lab-wide informatics strategy. We’ve moved from the bits and bytes of assembly language programming to managing terabytes of files and data structures.
The high-end of the problem—the large informatics database systems—has received significant industry-wide attention in the last decade. The stuff on the lab bench, while the target of a lot of individual products, has been less organized and more experimental. Failed or incompletely met promises have to yield to planned successes. How we do it needs to change. This document is about the considerations required when making that change. The haphazard "let's try this" method has to give way to more engineered solutions and a realistic appraisal of the human issues, as well as the underlying technology management and planning.
Why is this important? Whether you are conducting intense laboratory experiments to produce data and information or making chocolate chip cookies in the kitchen, two things remain important: productivity and the quality of the products. In either case, if the productivity isn’t high enough, you won’t be able to justify your work; if the quality isn’t there, no one will want what you produce. Conducting laboratory work and making cookies have a lot in common. Your laboratories exist to answer questions. What happens if I do this? What is the purity of this material? What is the structure of this compound? The field of laboratories asking these questions is extensive, basically covering the entire array of lab bench and scientific work, including chemistry, life sciences, physics, and electronics labs. The more efficiently we answer those questions, the more likely it will be that these labs will continue operating and that you’ll achieve the goals your organization has set. At some point, it comes down to performance against goals and the return on the investment organizations make in lab operations.
In addition to product quality and productivity, there are a number of other points that favor automation over manual implementations of lab processes. They include:
- lower costs per test;
- better control over expenditures;
- a stronger basis for better workflow planning;
- reproducibility;
- predictability; and
- tighter adherence to procedures, i.e., consistency.
Lists similar to the one above can be found in justifications for lab automation (and cookie production) without further comment. It’s just assumed that everyone agrees and that the reasoning is obvious. Since we are going to use those items to justify the cost and effort that goes into automation, we should take a closer look at them.
Let’s begin with reproducibility, predictability, and consistency, very similar concerns that reflect automation’s ability to produce the same product with the desired characteristics over and over. For data and information, that means that the same analysis on the same materials will yield the same results, that all the steps are documented, and that the process is under control. The variability that creeps into the execution of a process by people is eliminated. That variability in human labor can result from the quality of training, equipment setup and calibration, readings from analog devices (e.g., meters, pipette meniscus, charts, etc.); the list of potential issues is long.
Concerns with reproducibility, predictability, and consistency are common to production environments, general lab work, manufacturing, and even food service. There are several pizza restaurants in our area using one of two methods of making the pies. Both start the preparation the same way, spreading dough and adding cheese and toppings, but the differences are in how they are cooked. One method uses standard ovens (e.g., gas, wood, or electric heating); the pizza goes in, the cook watches it, and then removes it when the cooking is completed. This leads to a lot of variability in the product, some a function of the cook’s attention, some depending on requests for over- or under-cooking the crust. Some is based on "have it your way" customization. The second method uses a metal conveyor belt to move the pie through an oven. The oven temperature is set, as is the speed of the belt, and as long as the settings are the same, you get a reproducible, consistent product order after order. It’s a matter of priorities: manual versus automated, consistent product quality versus how the cook feels that day. In the end, reducing variability and being able to demonstrate consistent, accurate results gives people confidence in your product.
Lower costs per test, better control over expenditures, and better workflow planning also benefit from automation. Automated processes are more cost-efficient since sample throughput is higher and labor costs are reduced. The cost per test and material usage are predictable since variability in the components used in testing is reduced or eliminated, and workflow planning is improved: since the time per test is known, work can be better scheduled. Additionally, process scale-up should be easier if there is high demand for particular procedures. However, there is a lot of work that has to be considered before automation is realizable, and that is where this discussion is headed.
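The cost-per-test argument above can be made concrete with a simple model: automation trades a higher fixed cost (equipment, validation, maintenance contracts) for a lower incremental cost per test. The sketch below is illustrative only; all dollar figures and volumes are assumptions invented for the example, not values from this article.

```python
# A minimal cost-per-test model. All figures are hypothetical assumptions
# chosen for illustration, not data from the article.

def cost_per_test(fixed_cost_per_year: float,
                  variable_cost_per_test: float,
                  tests_per_year: int) -> float:
    """Total annual cost divided by annual test volume."""
    total = fixed_cost_per_year + variable_cost_per_test * tests_per_year
    return total / tests_per_year

# Manual process: low fixed cost, high labor/material cost per test.
manual = cost_per_test(20_000, 45.0, 5_000)

# Automated process: high fixed cost, low incremental cost per test.
automated = cost_per_test(120_000, 8.0, 5_000)

print(f"manual:    ${manual:.2f} per test")
print(f"automated: ${automated:.2f} per test")
```

At the assumed 5,000 tests per year the automated process is already cheaper per test, and because its variable cost is lower, the gap widens as volume grows; at low volumes the fixed cost dominates and manual operation wins, which is why the per-procedure commitment discussed later matters.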
How does this discussion relate to previous work?
This work follows on the heels of two previous works:
- Computerized Systems in the Modern Laboratory: A Practical Guide (2015): This book presents the range of informatics technologies, their relationship to each other, and the role they play in laboratory work. It differentiates a LIMS from an ELN and scientific data management system (SDMS) for example, contrasting their use and how they would function in different lab working environments. In addition, it covers topics such as support and regulatory issues.
- A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work (2018): This webinar series complements the above text. It begins by introducing the major topics in informatics (e.g., LIMS, ELN, etc.) and then discusses their use from a strategic viewpoint. Where and how do you start planning? What is your return on investment? What should get implemented first, and then what are my options? The series then moves on to developing an information management strategy for the lab, taking into account budgets, support, ease of implementation, and the nature of your lab’s work.
The material in this write-up picks up where the last part of the webinar series ends. The last session covers lab processes, and this piece picks up that thread and goes into more depth concerning a basic issue: how do you move from manual methods to automated systems?
Productivity has always been an issue in laboratory work. Until the 1950s, a lab had little choice but to add more people if more work needed to be done. Since then, new technologies have afforded wider options, including new instrument technologies. The execution of the work was still done by people, but the tools were better. Now we have other options. We just have to figure out when, if, and how to use them.
Before we get too far into this...
With elements such as productivity, return on investment (ROI), data quality, and data integrity as driving factors in this work, you shouldn’t be surprised if a lot of the material reads like a discussion of manufacturing methodologies; we’ve already seen some examples. We are talking about scientific work, but the same things that drive the elements noted in labs have very close parallels in product manufacturing. The work we are describing here will be referenced as "scientific manufacturing," manufacturing or production in support of scientific programs.[a]
The key points of a productivity conversation in lab and material production environments are almost exact overlays; the only significant difference is that the results of the efforts are data and information in one case and a physical item you might sell in the other. Product quality and integrity are valued considerations in both. For scientists, this may require an adjustment to their perspectives when dealing with automation. On the plus side, the lessons learned in product manufacturing can be applied to lab bench work, making the path to implementation a bit easier while providing a framework for understanding what a successful automation effort looks like. People with backgrounds in product manufacturing can be a useful resource in the lab, with a bit of an adjustment in perspective on their part.
Transitioning from typical lab operations to automated systems
Transitioning a lab from its current state of operations to one that incorporates automation can raise a number of questions, as well as people’s anxiety levels. Several questions should be considered to set expectations for automated systems, how they will impact jobs, and how new technologies will be introduced. They include:
- What will happen to people’s jobs as a result of automation?
- What is the role of artificial intelligence (AI) and machine learning (ML) in automation?
- Where do we find the resources to carry out automation projects/programs?
- What equipment would we need for automated processes, and will it be different from what we currently have?
- What role does a laboratory execution system (LES) play in laboratory automation?
- How do we go about planning for automation?
What will happen to people’s jobs as a result of automation?
Stories are appearing in print, online, and in television news reporting about the potential for automation to replace human effort in the labor force. It seems like an all-or-none situation: either people will continue working in their occupations or automation (e.g., mechanical, software, AI, etc.) will replace them. The storyline is that people are expensive and that automated work can be less costly in the long run. If commercial manufacturing is a guide, automation is a preferred option from both a productivity and an ROI perspective. In order to make productivity gains from automation similar to those seen in commercial manufacturing, there are some basic requirements and conditions that have to be met:
- The process has to be well documented and understood, down to the execution of each step without variation, while error detection and recovery have to be designed in.
- The process has to remain static and be expected to continue over enough execution cycles to make it economically attractive to design, build, and maintain.
- Automation-compatible equipment has to be available. Custom-built components are going to be expensive and could represent a barrier to successful implementation.
- There has to be a driving need to justify the cost of automation; economics, the volume of work that has to be addressed, working with hazardous materials, and lack of educated workers are just a few of the factors that would need to be considered.
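The second condition above, that the process must run over enough execution cycles to be economically attractive, is essentially a break-even calculation: the up-front automation investment must be recovered through per-run savings. A minimal sketch, with every figure an assumption for illustration:

```python
# Break-even volume for an automation project. The dollar figures below
# are hypothetical assumptions; the break-even logic is the point.
import math

automation_investment = 250_000.0  # design, equipment, validation (assumed)
manual_cost_per_run = 60.0         # labor + materials per execution (assumed)
automated_cost_per_run = 12.0      # consumables + maintenance share (assumed)

savings_per_run = manual_cost_per_run - automated_cost_per_run
break_even_runs = math.ceil(automation_investment / savings_per_run)

print(f"break-even after {break_even_runs} executions")
```

If the procedure is expected to change (violating the "static process" condition) before the break-even volume is reached, the investment is hard to justify; this is one quantitative way to screen candidate processes for automation.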
There are places in laboratory work where production-scale automation has been successfully implemented; life sciences applications for processes based on microplate technologies are one example. When we look at the broad scope of lab work across disciplines, most lab processes don’t lend themselves to that level of automation, at least not yet. We’ll get into this in more detail later. But that brings us back to the starting point: what happens to people's jobs?
In the early stages of manufacturing automation, as well as in fields such as mining where work was labor-intensive and repetitive, people did lose jobs when new methods of production were introduced. That shift from a human workforce to automated task execution is expanding as system designers probe markets from retail to transportation.[1] Lower-skilled occupations gave way first, and automation efforts are now moving up the skills ladder; the most recent example is automated driving, a technology that has yet to be fully embraced but is moving in that direction. The problem that leaves us with is providing displaced workers with a means of employment that gives them at least a living income, along with the purpose, dignity, and self-worth that they’d like to have. This is going to require significant education, and people are going to have to come to grips with the realization that education never stops.
Due to the push for increased productivity, lab work has seen some similar developments in automation. The development of automated pipettes, titration stations, auto-injectors, computer-assisted instrumentation, and automation built to support microplate technologies represents just a few places where specific tasks have been addressed. However, these developments haven’t moved people out of the workplace as has happened in manufacturing, mining, etc. In some cases they’ve changed the work, replacing repetitive, time-consuming tasks with equipment that allows lab personnel to take on different tasks. In other cases the technology addresses work that couldn’t be performed in a cost-effective manner with human effort; without automation, that work might just not be feasible due to the volume of work (whose delivery might be limited by the availability of the right people, equipment, and facilities) or the need to work with hazardous materials. Automation may prevent the need for hiring new people while giving those currently working more challenging tasks.
As noted in the previous paragraph, much of the automation in lab work is at the task level: equipment designed to carry out a specific function such as Karl Fischer titrations. Some equipment designed around microplate formats can function at both the task level and as part of a user-integrated robotics system. This gives the planner useful options for introducing automation, making it easier for personnel to get accustomed to automation before moving into scientific manufacturing.
Overall, laboratory people shouldn’t be losing their jobs as a result of lab automation, but they do have to be open to changes in their jobs, and that could require an investment in their education. Take someone whose current job is to carry out a lab procedure, someone who understands all aspects of the work, including troubleshooting equipment, reagents, and any special problems that may crop up. Someone else may have developed the procedure, but that person is the expert in its execution.
First of all, you need these experts to help plan and test the automated systems if you decide to pursue such a project. They are also the best people to educate as automated systems managers; they know how the process is supposed to work and should be in a position to detect problems. If the system crashes, you’ll need someone who can cover the work while problems are being addressed. Secondly, if lab personnel get the idea that they are watching their replacement being installed, they may leave before the automated systems are ready. In the event of a delay, you’ll have a backlog and no one to handle it.
Beyond that, people will be freed from the routine of carrying out processes and be able to take on work that had been put on a back burner. As we move toward automated systems, jobs will change by expanding to accommodate typical lab work, as well as the management, planning, maintenance, and evolution of laboratory automation and computing.
Automation in lab work is not an "all or none" situation. Processes can be structured so that the routine work is done by systems, and the analyst can spend time reviewing the results, looking for anomalies and interesting patterns, while being able to make decisions about the need for and nature of follow-on efforts.
What is the role of AI and ML in automation?
When we discuss automation here, we are referring to basic robotics and programming. AI may, and likely will, play a role in the work, but first we have to get the foundations right before we consider the next step; we need to put in the human intelligence first. Part of the issue with AI is that we don’t know what it is.
Science fiction aside, many of today's applications of AI have a limited role in lab work today. Here are some examples:
- Having a system that can bring up all relevant information on a research question—a sort of super Google—or a variation of IBM’s Watson could have significant benefits.
- Analyzing complex data or large volumes of data could be beneficial, e.g., the analysis of radio astronomy data to find fast radio bursts (FRB). After discovering 21 FRB signals upon analyzing five hours of data, researchers at Green Bank Telescope used AI to analyze 400 terabytes of older data and detected another 100.[2]
- "[A] team at Glasgow University has paired a machine-learning system with a robot that can run and analyze its own chemical reaction. The result is a system that can figure out every reaction that's possible from a given set of starting materials."[3]
- HelixAI is using Amazon's Alexa as a digital assistant for laboratory work.[4]
Note that the points above are research-based applications, not routine production environments where regulatory issues are important. While there are research applications that might be more forgiving of AI systems because the results are evaluated by human intelligence, and problematic results can be made subject to further verification, data entry systems such as voice entry have to be carefully tested and the results of that data entry verified and shown to be correct.
Pharma IQ continues to publish material on advanced topics in laboratory informatics, including articles on how labs are benefiting from new technologies[5] and survey reports such as AI 2020: The Future of Drug Discovery. In that report they note[6]:
- "94% of pharma professionals expect that intelligent technologies will have a noticeable impact on the pharmaceutical industry over the next two years."
- "Almost one fifth of pharma professionals believe that we are on the cusp of a revolution."
- "Intelligent automation and predictive analytics are expected to have the most significant impact on the industry."
- "However, a lack of understanding and awareness about the benefits of AI-led technologies remain a hindrance to their implementation."
Note that these are expectations, not a reflection of current reality. That same report makes comments about the impact of AI on headcount disruption, asking, "Do you expect intelligent enterprise technologies[b] to significantly cut and/or create jobs in pharma through 2020?" Among the responses, 47 percent said they expected those technologies to do both, 40 percent said it will create new job opportunities, and 13 percent said there will be no dramatic change, with zero percent saying they expected solely job losses.[6]
While there are high levels of expectation and hope for results, we need to approach the idea of AI in labs with some caution. We read about examples based on machine learning (ML), for example using computer systems to recognize cats in photos, to recognize faces in a crowd, etc. We don’t know how they accomplish their tasks, and we can’t analyze their algorithms and decision-making. That leaves us with trying to test quality in, which at best is an uncertain process with qualified results ("it has worked so far"). One problem with testing AI systems based on ML is that they are going to continually evolve, so testing may affect the ML processes by introducing a bias. It may also require continued, redundant testing, because something we thought was evaluated may have been changed by the “experiences” the AI based its learning on. As one example, could the AI modify the science through process changes without our knowing, because it didn’t understand the science or the goals of the work?
AI is a black box with ever-changing contents. That shouldn’t be taken as a condemnation of AI in the lab, but rather as a challenge to human intelligence in evaluating, proving, and applying the technology. That application includes defining the operating boundaries of an AI system. Rather than creating a master AI for a complete process, we may elect to divide the AI’s area of operation into multiple, independent segments, with segment integration occurring in later stages once we are confident in their ability to work and show clear evidence of systems stability. In all of this we need to remember that our goal is the production of high-quality data and information in a controlled, predictable environment, not gee-wiz technology. One place where AI (or clever programming) could be of use is in better workflow planning, which takes into account current workloads and assignments, factors in the inevitable panic-level testing need, and, perhaps in a QC/production environment, anticipates changes in analysis requirements based on changes in production operations.
Throughout this section I've treated “AI” as “artificial intelligence,” its common meaning. There may be a better way of looking at it for lab use, as noted in this excerpt from the October 2018 issue of Wired magazine[7]:
Augmented intelligence. Not “artificial,” but how Doug Engelbart[c] envisioned our relationship with computers: AI doesn’t replace humans. It offers idiot-savant assistants that enable us to become the best humans we can be.
Augmented intelligence (AuI) is a better term for what we might experience in lab work, at least in the near future. It suggests something that is both more realistic and attainable, with the synergism that would make it, and automation, attractive to lab management and personnel: a tool they can work with to improve lab operations, one that doesn’t carry the specter of something going on that they don’t understand or control. OPUS/SEARCH from Bruker might be just such an entry in this category.[8] AuI may serve as a first-pass filter for large data sets, as in the radio astronomy and chemistry examples cited earlier, reducing those sets of data and information to smaller collections that human intelligence can and should evaluate. However, that does put a burden on the AuI to avoid excessive false positives or negatives, something that can be adjusted over time.
Beyond that, there is the possibility of more cooperative work between people and AuI systems. An article in Scientific American titled “My Boss the Robot”[9] describes the advantage of a human-robot team, with the robot doing the heavy work and the human, under the robot's guidance, doing the work he was more adept at, versus a team of experts given the same task. The task, welding a Humvee frame, was completed by the human-machine pair in 10 hours at a cost of $1,150; the team of experts took 89 hours at a cost of $7,075. That might translate into laboratory terms by having a robot do routine, highly repetitive tasks while the analyst oversees the operation and does higher-level analysis of the results.
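The time and cost ratios implied by the Scientific American example can be computed directly from the figures it reports; a trivial sketch:

```python
# Figures reported in the "My Boss the Robot" example: an expert team
# versus a human-robot pair welding a Humvee frame.
team_hours, team_cost = 89, 7_075
pair_hours, pair_cost = 10, 1_150

time_ratio = team_hours / team_hours * (team_hours / pair_hours)
cost_ratio = team_cost / pair_cost

print(f"human-robot pair was {team_hours / pair_hours:.1f}x faster")
print(f"human-robot pair was {cost_ratio:.1f}x cheaper")
```

Roughly a ninefold speedup and a sixfold cost reduction, which is the scale of gain the article suggests cooperative human-machine work can deliver on well-bounded tasks.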
Certainly, AI/AuI is going to change over time as programming and software technology becomes more sophisticated and capable; today’s example of AuI might be seen as tomorrow’s clever software. However, a lot depends on the experience of the user.
There is something important to ask about laboratory technology development, and AI in particular: is the direction of development going to be the result of someone’s innovation that people look at and embrace, or will it be the result of a deliberate choice of lab people saying “this is where we need to go, build systems that will get us there”? The difference is important, and lab managers and personnel need to be in control of the planning and implementation of systems.
Where do we find the resources to carry out automation projects/programs?
Given the potential scope of work, you may need people with skills in programming, robotics, instrumentation, and possibly mechanical or electrical engineering if off-the-shelf components aren’t available. The biggest need is for people who can do the planning and optimization required as you move from manual to semi- or fully automated systems, particularly specialists in process engineering who can organize and plan the work, including the process controls and provision for statistical process control.
We need to develop people who are well versed in laboratory work and the technologies that can be applied to that work, as assets in laboratory automation development and planning. In the past, this role has been filled by lab personnel with an interest in the subject, IT people willing to extend their responsibilities, and/or outside consultants. A 2017 report by Salesforce Research states that "77% of IT leaders believe IT functions as an extension/partner of business units rather than as a separate function."[10] The report makes no mention of laboratory work or manufacturing aside from their being functions within the businesses surveyed. Unless a particular effort is made, IT personnel rarely have the backgrounds needed to meet the needs of lab work. In many cases, they will try to fit lab needs into software they are already familiar with, rather than extend their backgrounds into new computational environments. Office and pure database applications are easily handled, but when we get to the lab bench, it's another matter entirely.
The field is getting complex enough that we need people whose responsibilities span both science and technology. This subject is discussed in the webinar series A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work, Part 5 "Supporting Laboratory Systems."
What equipment would we need for automated processes, and will it be different from what we currently have?
This is an interesting issue and it directly addresses the commitment labs have to automation, particularly robotics. In the early days of lab automation when Zymark (Zymate and Benchmate), Perkin Elmer, and Hewlett Packard (ORCA) were the major players in the market, the robot had to adapt to equipment that was designed for human use: standard laboratory equipment. They did that through special modifications and the use of different grippers to handle test tubes, beakers, and flasks. While some companies wanted to test the use of robotics in the lab, they didn’t want to invest in equipment that could only be used with robots; they wanted lab workers to pick up where the robots left off in case the robots didn’t work.
Since then, equipment has evolved to support automation more directly. In some cases it is a device (e.g., a balance, pH meter, etc.) that has front panel human operator capability and rear connectors for computer communications. Liquid handling systems have seen the most advancement through the adoption of microplate formats and equipment designed to work with them. However, the key point is standardization of the sample containers. Vials and microplates lend themselves to a variety of automation devices, from sample processing to auto-injectors/samplers. The issue is getting the samples into those formats.
One point that labs, in any scientific discipline, have to come to grips with is the commitment to automation. That commitment isn’t going to be made on a lab-wide basis, but on a procedure-by-procedure basis. Full automation may not be appropriate for all lab work, whereas partial automation may be a better choice, and in some cases no automation may be required (we’ll get into that later). The point that needs to be addressed is the choice of equipment. In most cases, equipment is designed for use by people, with options for automation and electronic communications. However, if you want to maximize throughput, you may have to follow examples from manufacturing and commit to equipment that is only used by automation. That will mean a redesign of the equipment, a shared risk for both the vendors and the users. The upside is that equipment can be specifically designed for a task, be more efficient, have the links needed for integration, use less material, and, more likely, take up less space. One example is the microplate, allowing for tens, hundreds, or thousands (depending on the plate used) of sample cells in a small space. What used to take many cubic feet of space as test tubes (the precursor to using microplates) is now a couple of cubic inches, using much less material and working space. Note, however, that while microplates are used by lab personnel, their use in automated systems provides greater efficiency and productivity.
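The density gain from microplates is easy to quantify. Standard ANSI/SLAS microplates share a common footprint of roughly 128 mm by 85 mm, with 96-, 384-, and 1536-well formats in wide use; the bench area assumed per test tube below is a made-up figure for illustration only.

```python
# Bench-area-per-sample comparison: microplates vs. test tubes.
# Microplate footprint and well counts follow the ANSI/SLAS standard;
# the 20 mm x 20 mm rack spacing per test tube is an assumption.
PLATE_AREA_MM2 = 128 * 85   # one standard microplate footprint
TUBE_AREA_MM2 = 20 * 20     # bench area per test tube (assumed)

for wells in (96, 384, 1536):
    area_per_sample = PLATE_AREA_MM2 / wells
    density_gain = TUBE_AREA_MM2 / area_per_sample
    print(f"{wells:>4}-well plate: {area_per_sample:6.1f} mm^2 per sample "
          f"(~{density_gain:.0f}x denser than tubes)")
```

Even under these rough assumptions, a 1536-well plate packs samples tens of times more densely than racked tubes, which is why the standardized microplate footprint has become the anchor for so much laboratory automation equipment.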
The idea of equipment used only in an automated process isn’t new. The development and commercialization of segmented flow analyzers—initially by Technicon in the form of the AutoAnalyzers for general use, and the SMA (Sequential Multiple Analyzer) and SMAC (Sequential Multiple Analyzer with Computer) in clinical markets—improved a lab's ability to process samples. Those early systems were phased out in favor of newer equipment that consumed less material. Products like these are now provided by Seal Analytical[11] for environmental work and by Bran+Luebbe (a division of SPX Process Equipment in Germany).[12]
The issue in committing to automated equipment is that vendors and users will have to agree on equipment specifications and use them within procedures. One place this has been done successfully is in clinical chemistry labs. What other industry workflows could benefit? Do the vendors lead or do the users drive the issue? Vendors need to be convinced that there is a viable market for product before making an investment, and users need to be equally convinced that they will succeed in applying those products. In short, procedures that are important to a particular industry have to be identified, and both users and vendors have to come together to develop automated procedure and equipment specifications for products. This has been done successfully in clinical chemistry markets to the extent that equipment is marketed for use as validated for particular procedures.
What role does a LES play in laboratory automation?
Before ELNs settled into their current role in laboratory work, the initial implementations differed considerably from what we have now. LabTech Notebook, released in 1986 (and discontinued in 2004), provided communications between computers and devices that used RS-232 serial communications. In the early 2000s, SmartLab from Velquest was the first commercial product to carry the "electronic laboratory notebook" identifier. That product became a stand-alone entry in the laboratory execution system (LES) market; since its release, the same conceptual functionality has been incorporated into LIMS and ELNs that fit the more current expectation of an ELN.
At its core, an LES is a set of scripted test procedures that an analyst follows to carry out a laboratory method, essentially functioning as the programmed execution of a lab process. Each step in a process is described, followed exactly, and provision is made within the script for data collection. In addition, the LES can or will (depending on the implementation; "can" in the case of SmartLab) check that the analyst is qualified to carry out the work and that the equipment and reagents are current, calibrated, and suitable for use. The systems can also have access to help files that an analyst can reference if there are questions about how to carry out a step or resolve issues. Beyond that, the software can work with lab instruments and automatically acquire data, either through direct interfaces (e.g., balances, pH meters, etc.) or by parsing PDF files of instrument reports.
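To make the mechanics concrete, here is a minimal sketch of how an LES-style scripted procedure might be modeled in software. All of the names (`Step`, `Instrument`, `run_procedure`) are hypothetical illustrations, not the API of SmartLab or any real product; the point is simply the pattern of pre-checks (analyst qualification, instrument calibration) followed by logged, step-by-step execution with data capture.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Step:
    instruction: str            # text the analyst (or robot) follows
    records_data: bool = False  # does this step capture a measurement?

@dataclass
class Instrument:
    name: str
    calibration_due: date       # next calibration deadline

def run_procedure(steps, analyst_certs, required_cert, instruments, today, read_value):
    """Execute a scripted procedure, logging each step as it is done.

    Refuses to start if the analyst lacks the required certification or
    any instrument's calibration has lapsed -- the kinds of pre-checks
    an LES performs before work begins.
    """
    if required_cert not in analyst_certs:
        raise PermissionError(f"Analyst not certified for {required_cert}")
    for inst in instruments:
        if inst.calibration_due < today:
            raise RuntimeError(f"{inst.name} calibration expired")

    log = []
    for i, step in enumerate(steps, start=1):
        entry = {"step": i, "instruction": step.instruction}
        if step.records_data:
            # acquire data: manual entry, a direct instrument interface,
            # or a parsed instrument report
            entry["value"] = read_value(i)
        log.append(entry)
    return log
```

The returned `log` is what makes such systems attractive to regulated labs: every step is recorded as it happens, producing the documented evidence of proper execution discussed below.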
There are two reasons that these systems are attractive. First, they provide for a rigorous execution of a process with each step being logged as it is done. Second, that log provides a regulatory inspector with documented evidence that the work was done properly, making it easier for the lab to meet any regulatory burden.
Since the initial development of SmartLab, that product has changed ownership and is currently in the hands of Dassault Systèmes as part of the BIOVIA product line. As noted above, LIMS and ELN vendors have incorporated similar functionality into their products. Using those features requires “scripting” (in reality, software development), but it provides access to the database structures within those products. The SmartLab software needed programmed interfaces to other vendors' LIMS and ELNs to gain access to the same information.
What does this have to do with automation?
When we think about automated systems, particularly full automation with robotic support, it is a programmed process from start to finish. The samples are introduced at the start, and the process continues until the final data/information is reported and stored. These can be large-scale systems using microplate formats, including tape-based systems from Douglas Scientific[13], programmable autosamplers such as those from Agilent[14], or systems built around robotic arms from a variety of vendors that move samples from one station to another.
Both LES and the automation noted in the previous paragraph have the following point in common: there is a strict process that must be followed, with no provision for variation. The difference is that in one case that process is implemented completely through the use of computers, as well as electronic and mechanical equipment. In the other case, the process is being carried out by lab personnel using computers, as well as electronic and mechanical lab equipment. In essence, people take the place of mechanical robots, which conjures up all kinds of images going back to the 1927 film Metropolis.[d] Though the LES represents a step toward more sophisticated automation, both methods still require:
- programming, including “scripting” (the LES methods are a script that has to be followed);
- validated, proven processes; and
- qualified staff, though the qualifications differ. (In both cases they have to be fully qualified to carry out the process in question. However in the full automation case, they will require more education on running, managing, and troubleshooting the systems.)
In the case of full automation, there has to be sufficient justification for automating the process, including a sufficient sample load. The LES-human implementation can be run for a single sample if needed, and the operating personnel can be trained on multiple procedures, switching tasks as needed. Electro-mechanical automation would require a change in programming, verification that the system is operating properly, and possibly equipment reconfiguration. Which method is better for a particular lab depends on trade-offs between sample load, throughput requirements, cost, and flexibility. People are adaptable, easily moving between tasks, whereas equipment has to be adapted to a task.
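One common way to frame the "sufficient sample load" question is a simple break-even calculation: how many samples must the automated system process before its capital cost is recovered by a lower per-sample running cost? The sketch below illustrates the arithmetic with hypothetical figures; the dollar amounts are assumptions for illustration only, and a real justification would also weigh throughput, flexibility, and staffing.

```python
def break_even_samples(capital_cost, auto_cost_per_sample, manual_cost_per_sample):
    """Number of samples at which an automated system's capital outlay
    is recovered by its lower per-sample running cost."""
    saving = manual_cost_per_sample - auto_cost_per_sample
    if saving <= 0:
        raise ValueError("Automation must lower the per-sample cost to ever break even")
    return capital_cost / saving

# Hypothetical figures: a $150,000 system, $2/sample automated vs. $12/sample manual.
n = break_even_samples(150_000, 2.0, 12.0)  # -> 15,000 samples
```

A lab processing far fewer samples than the break-even count over the equipment's useful life would likely be better served by the LES-human approach, which carries no such threshold.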
How do we go about planning for automation?
There are three forms of automation to be considered:
- No automation – Instead, the lab relies on lab personnel to carry out all steps of a procedure.
- Partial automation – Automated equipment is used to carry out steps in a procedure. Given the current state of laboratory systems, this is the most prevalent form, since most lab equipment has computer components in it to facilitate its use.
- Full automation – The entire process is automated. The definition of “entire” is open to each lab’s interpretation and may vary from one process to another. For example, some samples may need some handling before they are suitable for use in a procedure. That might be a selection process from a freezer, grinding materials prior to a solvent extraction, and so on, representing cases where the equipment available isn’t suitable for automated equipment interaction. One goal is to minimize this effort, since it can put a limit on the productivity of the entire process. This is also an area where negotiation between the lab and the sample submitter can be useful. Take plastic pellets, for example, which often need to be ground into a coarse powder before they can be analyzed; having the submitter provide them in this form will reduce the time and cost of the analysis. Standardizing on the sample container can also facilitate the analysis (having the lab provide the submitter with standard sample vials using barcodes or RFID chips can streamline the process).
One common point that these three forms share is a well-described method (procedure, process) that needs to be addressed. That method should be fully developed, tested, and validated. This is the reference point for evaluating any form of automation (Figure 1).
The documentation for the chosen method should include the bulleted list of items from Figure 1, as they describe the science aspects of the method. The last four points are important. The method should be validated since the manual procedure is a reference point for determining if the automated system is producing useful results. The reproducibility metric offers a means of evaluating at least one expected improvement in an automated system; you’d expect less variability in the results. This requires a set of reference sample materials that can be repeatedly evaluated to compare the manual and automated systems, and to periodically test the methods in use to ensure that there aren’t any trends developing that would compromise the method’s use. Basically, this amounts to statistical quality control on the processes.
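That statistical quality control can be sketched very simply. The code below computes a relative standard deviation as the reproducibility metric and Shewhart-style control limits from repeated runs of a reference sample material; the replicate numbers are illustrative values invented for this sketch, not real data, and a production lab would use a proper SQC package rather than this minimal version.

```python
import statistics

def rsd(values):
    """Relative standard deviation (%), a simple reproducibility metric."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def control_limits(values, k=3.0):
    """Shewhart-style control limits (mean +/- k*sigma) derived from
    repeated measurements of a reference sample material."""
    m = statistics.mean(values)
    s = statistics.stdev(values)
    return m - k * s, m + k * s

def out_of_control(history, new_value, k=3.0):
    """True if a new reference-sample result falls outside the limits,
    a sign that the method may be drifting."""
    lo, hi = control_limits(history, k)
    return not (lo <= new_value <= hi)

# Illustrative (invented) replicate results for one reference material:
manual    = [10.1, 9.8, 10.4, 9.6, 10.3, 9.9]       # manual method
automated = [10.0, 10.1, 9.95, 10.05, 10.0, 10.02]  # automated method
```

Comparing `rsd(manual)` with `rsd(automated)` quantifies the expected reduction in variability, and running `out_of_control` on each periodic reference-sample result flags trends that would compromise the method’s continued use.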
The next step is to decide what improvements you are looking for in an automated system: increased throughput, lower cost of operation, the ability to off-load human work, reduced variability, etc. In short, what are your goals?
That brings us to the matter of project planning. We’re not going to go into a lot of depth in this piece about project planning, as there are a number of references[e] on the subject, including material produced by the former Institute for Laboratory Automation.[f] There are some aspects of the subject that we do need to touch on, however, and they include:
- justifying the project and setting expectations and goals;
- analyzing the process;
- scheduling; and
- budgeting.
Justification, expectations, and goals
Basically, why are you doing this, and what do you expect to gain? What arguments will you use to justify the work and expense involved in the project? How will you determine if the project is successful?
Fundamentally, automation efforts are about productivity and the bulleted items noted in the introduction of this piece, repeated below with additional commentary:
- Lower costs per test, and better control over expenditure: These can result from a reduction in labor and materials costs, including more predictable and consistent reagent usage per test.
- Stronger basis for better workflow planning: Informatics systems can provide better management over workloads and resource allocation, while key performance indicators can show where bottlenecks are occurring or if samples are taking too long to process. These can be triggers for procedure automation to improve throughput.
- Reproducibility: The test results from automated procedures can be expected to be more reproducible by eliminating the variability that is typical of steps executed by people. Small variation in dispensing reagents, for example, could be eliminated.
- Predictability: The time to completion for a given test is more predictable in automated programs; once the process starts, it keeps going without the interruptions that can be found in human-centered activities.
- Tighter adherence to procedures: Automated procedures have no choice but to be consistent in procedure execution; that is what programming and automation are about.
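Several of the goals above, particularly workflow planning and bottleneck detection, can be monitored with simple key performance indicators. The sketch below computes turnaround-time (TAT) figures from received/reported timestamps; the function names and the target threshold are hypothetical, and real informatics systems would draw these values from a LIMS database rather than a Python list.

```python
from datetime import datetime, timedelta

def turnaround_hours(received, reported):
    """Turnaround time for one sample, in hours."""
    return (reported - received).total_seconds() / 3600.0

def tat_kpis(samples, target_hours):
    """Summarize turnaround across (received, reported) timestamp pairs:
    average TAT and the percentage of samples exceeding the lab's target,
    a possible trigger for considering procedure automation."""
    tats = [turnaround_hours(r, d) for r, d in samples]
    avg = sum(tats) / len(tats)
    late = sum(1 for t in tats if t > target_hours) / len(tats)
    return {"avg_tat_h": avg, "pct_over_target": 100.0 * late}
```

A rising `pct_over_target` on a periodic report is exactly the kind of signal the workflow-planning bullet describes: evidence that a bottleneck exists and that throughput improvements, possibly via automation, are worth investigating.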
Footnotes
- ↑ The term "scientific manufacturing" was first mentioned to the author by Mr. Alberto Correia, then of Cambridge Biomedical, Boston, MA.
- ↑ Intelligent enterprise technologies referenced in the report include robotic process automation, machine learning, artificial intelligence, the Internet of Things, predictive analysis, and cognitive computing.
- ↑ Doug Engelbart founded the field of human-computer interaction, is credited with the invention of the computer mouse, and gave the “Mother of All Demos” in 1968.
- ↑ See Metropolis (1927 film) on Wikipedia.
- ↑ See for example https://www.projectmanager.com/project-planning; the simplest thing to do is put “project planning” in a search engine and browse the results for something interesting.
- ↑ See for example https://theinformationdrivenlaboratory.wordpress.com/category/resources/; note that any references to the ILA should be ignored as the original site is gone, with the domain name perhaps having been leased by another organization that has no affiliation with the original Institute for Laboratory Automation.
About the author
Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked or consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on helping companies establish planning programs for lab systems, develop effective support groups, and apply automation and information technologies in research and quality control environments.
References
- ↑ Frey, C.B.; Osborne, M.A. (17 September 2013). "The Future of Employment: How Susceptible Are Jobs to Computerisation?" (PDF). Oxford Martin School, University of Oxford. https://www.oxfordmartin.ox.ac.uk/downloads/academic/The_Future_of_Employment.pdf. Retrieved 04 February 2021.
- ↑ Hsu, J. (24 September 2018). "Is it aliens? Scientists detect more mysterious radio signals from distant galaxy". NBC News MACH. https://www.nbcnews.com/mach/science/it-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586. Retrieved 04 February 2021.
- ↑ Timmer, J. (18 July 2018). "AI plus a chemistry robot finds all the reactions that will work". Ars Technica. https://arstechnica.com/science/2018/07/ai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work/5/. Retrieved 04 February 2021.
- ↑ "HelixAI - Voice Powered Digital Laboratory Assistants for Scientific Laboratories". HelixAI. http://www.askhelix.io/. Retrieved 04 February 2021.
- ↑ PharmaIQ News (20 August 2018). "Automation, IoT and the future of smarter research environments". PharmaIQ. https://www.pharma-iq.com/pre-clinical-discovery-and-development/news/automation-iot-and-the-future-of-smarter-research-environments. Retrieved 04 February 2021.
- ↑ 6.0 6.1 PharmaIQ (14 November 2017). "The Future of Drug Discovery: AI 2020". PharmaIQ. https://www.pharma-iq.com/pre-clinical-discovery-and-development/whitepapers/the-future-of-drug-discovery-ai-2020. Retrieved 04 February 2021.
- ↑ Rossetto, L. (2018). "Fight the Dour". Wired (October): 826–7. https://www.magzter.com/stories/Science/WIRED/Fight-The-Dour.
- ↑ "OPUS Package: SEARCH & IDENT". Bruker Corporation. https://www.bruker.com/en/products-and-solutions/infrared-and-raman/opus-spectroscopy-software/search-identify.html. Retrieved 04 February 2021.
- ↑ Bourne, D. (2013). "My Boss the Robot". Scientific American 308 (5): 38–41. doi:10.1038/scientificamerican0513-38. PMID 23627215.
- ↑ SalesForce Research (2017). "Second Annual State of IT" (PDF). SalesForce. https://a.sfdcstatic.com/content/dam/www/ocms/assets/pdf/misc/2017-state-of-it-report-salesforce.pdf. Retrieved 04 February 2021.
- ↑ "Seal Analytical - Products". Seal Analytical. https://seal-analytical.com/Products/tabid/55/language/en-US/Default.aspx. Retrieved 04 February 2021.
- ↑ "Bran+Luebbe". SPX FLOW, Inc. https://www.spxflow.com/bran-luebbe/. Retrieved 04 February 2021.
- ↑ "Array Tape Advanced Consumable". Douglas Scientific. https://www.douglasscientific.com/Products/ArrayTape.aspx. Retrieved 04 February 2021.
- ↑ "Agilent 1200 Series Standard and Preparative Autosamplers - User Manual" (PDF). Agilent Technologies. November 2008. https://www.agilent.com/cs/library/usermanuals/Public/G1329-90012_StandPrepSamplers_ebook.pdf. Retrieved 04 February 2021.