Considerations in the Automation of Laboratory Procedures

Title: Considerations in the Automation of Laboratory Procedures

Author for citation: Joe Liscouski

License for content: Creative Commons Attribution 4.0 International

Publication date: January 2021

Introduction

Scientists have been dealing with the issue of laboratory automation for decades, and during that time the meaning of those words has expanded from the basics of connecting an instrument to a computer, to the possibility of a fully integrated informatics infrastructure beginning with sample preparation and continuing on to the laboratory information management system (LIMS), electronic laboratory notebook (ELN), and beyond. Throughout this evolution there has been one underlying concern: how do we go about doing this?

The answer to that question has changed from a focus on hardware and programming, to today’s need for a lab-wide informatics strategy. We’ve moved from the bits and bytes of assembly language programming to managing terabytes of files and data structures.

The high end of the problem—the large informatics database systems—has received significant industry-wide attention in the last decade. The stuff on the lab bench, while the target of a lot of individual products, has been less organized and more experimental. Failed or incompletely met promises have to yield to planned successes. How we do it needs to change. This document is about the considerations required when making that change. The haphazard "let's try this" method has to give way to more engineered solutions and a realistic appraisal of the human issues, as well as the underlying technology management and planning.

Why is this important? Whether you are conducting intense laboratory experiments to produce data and information or making chocolate chip cookies in the kitchen, two things remain important: productivity and the quality of the products. In either case, if the productivity isn’t high enough, you won’t be able to justify your work; if the quality isn’t there, no one will want what you produce. Conducting laboratory work and making cookies have a lot in common. Your laboratories exist to answer questions. What happens if I do this? What is the purity of this material? What is the structure of this compound? The range of laboratories asking such questions is extensive, covering essentially the entire array of lab bench and scientific work, including chemistry, life sciences, physics, and electronics labs. The more efficiently we answer those questions, the more likely it is that these labs will continue operating and that you’ll achieve the goals your organization has set. At some point, it comes down to performance against goals and the return on the investment organizations make in lab operations.

In addition to product quality and productivity, there are a number of other points that favor automation over manual implementations of lab processes. They include:

  • lower costs per test;
  • better control over expenditures;
  • a stronger basis for better workflow planning;
  • reproducibility;
  • predictability; and
  • tighter adherence to procedures, i.e., consistency.

Lists like the one above frequently appear in justifications for lab automation (and cookie production) without further comment; it’s simply assumed that everyone agrees and that the reasoning is obvious. Since we are going to use those items to justify the cost and effort that goes into automation, we should take a closer look at them.

Let’s begin with reproducibility, predictability, and consistency: three closely related concerns that reflect automation’s ability to produce the same product with the desired characteristics over and over. For data and information, that means the same analysis on the same materials will yield the same results, that all the steps are documented, and that the process is under control. The variability that creeps into the execution of a process by people is eliminated. That variability in human labor can result from the quality of training, equipment setup and calibration, readings from analog devices (e.g., meters, pipette meniscus, charts, etc.), and more; the list of potential issues is a long one.

Concerns with reproducibility, predictability, and consistency are common to production environments, general lab work, manufacturing, and even food service. There are several pizza restaurants in our area using one of two methods of making the pies. Both start the preparation the same way, spreading dough and adding cheese and toppings, but they differ in how the pies are cooked. One method uses standard ovens (e.g., gas, wood, or electric heating); the pizza goes in, the cook watches it, and then removes it when the cooking is completed. This leads to a lot of variability in the product, some a function of the cook’s attention, some depending on requests for over- or under-cooking the crust, and some based on "have it your way" customization. The second method uses a metal conveyor belt to move the pie through an oven. The oven temperature is set, as is the speed of the belt, and as long as the settings are the same, you get a reproducible, consistent product order after order. It’s a matter of priorities: manual versus automated, consistent product quality versus how the cook feels that day. In the end, reducing variability and being able to demonstrate consistent, accurate results gives people confidence in your product.

Lower costs per test, better control over expenditures, and better workflow planning also follow from automation. Automated processes are more cost-efficient because sample throughput is higher and labor cost is reduced. The cost per test and material usage are predictable, since variability in the components used in testing is reduced or eliminated. Workflow planning improves as well: because the time per test is known, work can be better scheduled. Additionally, process scale-up should be easier if there is high demand for particular procedures. However, there is a lot of work to consider before automation is realizable, and that is where this discussion is headed.
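
To make the cost-per-test point concrete, here is a minimal sketch in Python. The figures are entirely hypothetical (they do not come from the text above); the point is that automation trades higher fixed costs for lower per-test labor and consumable costs, with annual test volume deciding which approach wins.

    # Minimal sketch: cost per test for manual vs. automated workflows.
    # All figures are hypothetical placeholders, not benchmarks.

    def cost_per_test(fixed_annual_cost, labor_per_test, consumables_per_test,
                      tests_per_year):
        """Amortize fixed costs over annual volume, then add per-test costs."""
        return (fixed_annual_cost / tests_per_year
                + labor_per_test + consumables_per_test)

    # Manual process: low fixed cost, high labor per test.
    manual = cost_per_test(5_000, labor_per_test=30.0,
                           consumables_per_test=8.0, tests_per_year=2_000)

    # Automated process: high fixed cost (instrument amortization, maintenance),
    # lower labor and tighter consumable usage per test.
    automated = cost_per_test(40_000, labor_per_test=4.0,
                              consumables_per_test=6.0, tests_per_year=2_000)

    print(f"manual: ${manual:.2f}/test, automated: ${automated:.2f}/test")
    # At 2,000 tests/year this prints $40.50 vs. $30.00; rerunning at lower
    # volumes shows where the automated option stops paying off.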

How does this discussion relate to previous work?

This work follows on the heels of the author’s previous work:

  • A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work (2018): This webinar series complements the author’s earlier writings. It begins by introducing the major topics in informatics (e.g., LIMS, ELN, etc.) and then discusses their use from a strategic viewpoint. Where and how do you start planning? What is your return on investment? What should get implemented first, and what options follow from that? The series then moves on to developing an information management strategy for the lab, taking into account budgets, support, ease of implementation, and the nature of your lab’s work.

The material in this write-up picks up where the webinar series ends. The final session covers lab processes, and this piece picks up that thread, going into more depth on a basic issue: how do you move from manual methods to automated systems?

Productivity has always been an issue in laboratory work. Until the 1950s, a lab had little choice but to add more people if more work needed to be done. Since then, new technologies, including new instrument technologies, have afforded wider options. The execution of the work was still done by people, but the tools were better. Now we have other options; we just have to figure out when, if, and how to use them.

Before we get too far into this...

With elements such as productivity, return on investment (ROI), data quality, and data integrity as driving factors in this work, you shouldn’t be surprised if a lot of the material reads like a discussion of manufacturing methodologies; we’ve already seen some examples. We are talking about scientific work, but the factors that drive those elements in labs have very close parallels in product manufacturing. The work we are describing here will be referred to as "scientific manufacturing": manufacturing or production in support of scientific programs.[a]

The key points of a productivity conversation in both lab and material production environments are almost exact overlays; the only significant difference is that the results of the efforts are data and information in one case and a physical item you might sell in the other. Product quality and integrity are valued considerations in both. For scientists, this may require an adjustment to their perspectives when dealing with automation. On the plus side, the lessons learned in product manufacturing can be applied to lab bench work, making the path to implementation a bit easier while providing a framework for understanding what a successful automation effort looks like. People with backgrounds in product manufacturing can be a useful resource in the lab, with a bit of an adjustment in perspective on their part.

Transitioning from typical lab operations to automated systems

Transitioning a lab from its current state of operations to one that incorporates automation can raise a number of questions, as well as people’s anxiety levels. Several questions should be considered to set expectations for automated systems, how they will impact jobs, and how new technologies will be introduced. They include:

  • What will happen to people’s jobs as a result of automation?
  • What is the role of artificial intelligence (AI) and machine learning (ML) in automation?
  • Where do we find the resources to carry out automation projects/programs?
  • What equipment would we need for automated processes, and will it be different from what we currently have?
  • What role does a laboratory execution system (LES) play in laboratory automation?
  • How do we go about planning for automation?

What will happen to people’s jobs as a result of automation?

Stories are appearing in print, online, and in television news reporting about the potential for automation to replace human effort in the labor force. It seems like an all-or-none situation: either people will continue working in their occupations, or automation (e.g., mechanical, software, AI, etc.) will replace them. The storyline is that people are expensive and automated work can be less costly in the long run. If commercial manufacturing is a guide, automation is a preferred option from both a productivity and an ROI perspective. In order to realize productivity gains from automation similar to those seen in commercial manufacturing, some basic requirements and conditions have to be met:

  • The process has to be well documented and understood, down to the execution of each step without variation, while error detection and recovery have to be designed in (a minimal sketch of this point follows the list below).
  • The process has to remain static and be expected to continue over enough execution cycles to make it economically attractive to design, build, and maintain.
  • Automation-compatible equipment has to be available. Custom-built components are going to be expensive and could represent a barrier to successful implementation.
  • There has to be a driving need to justify the cost of automation; economics, the volume of work that has to be addressed, working with hazardous materials, and a lack of educated workers are just a few of the factors that would need to be considered.
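
To illustrate the first point, the following minimal Python sketch shows one way error detection and recovery might be designed into a single process step. The step, the tolerance check, and the retry policy are hypothetical illustrations, not a real instrument interface.

    # Minimal sketch: a process step with built-in error detection and bounded
    # recovery. Names and values are hypothetical, not a vendor API.

    import time

    class StepFailure(Exception):
        """Raised when a step's post-condition check keeps failing."""

    def run_step(action, check, retries=2, delay_s=1.0):
        """Execute one step, verify the result, and retry a bounded number of times."""
        for attempt in range(retries + 1):
            result = action()
            if check(result):
                return result          # post-condition met; step succeeded
            if attempt < retries:
                time.sleep(delay_s)    # brief pause, then attempt recovery
        raise StepFailure("step failed after retries; flag for human review")

    # Hypothetical dispense step: measured volume must land within tolerance.
    def dispense():
        return 99.7                    # stand-in for a liquid handler call (µL)

    volume = run_step(dispense, check=lambda v: abs(v - 100.0) <= 0.5)
    print(f"dispensed {volume} µL, within tolerance")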

There are places in laboratory work where production-scale automation has been successfully implemented; life sciences applications for processes based on microplate technologies are one example. When we look at the broad scope of lab work across disciplines, most lab processes don’t lend themselves to that level of automation, at least not yet. We’ll get into this in more detail later. But that brings us back to the starting point: what happens to people's jobs?

In the early stages of manufacturing automation, as well as in fields such as mining where work was labor-intensive and repetitive, people did lose jobs when new methods of production were introduced. That shift from a human workforce to automated task execution is expanding as system designers probe markets from retail to transportation.[1] Lower-skilled occupations gave way first, and we now face automation efforts that are moving up the skills ladder; the most recent example is automated driving, a technology that has yet to be fully embraced but is moving in that direction. The problem that leaves us with is providing displaced workers with a means of employment that gives them at least a living income, along with the purpose, dignity, and self-worth they’d like to have. This is going to require significant education, and people are going to have to come to grips with the realization that education never stops.

Due to the push for increased productivity, lab work has seen some similar developments in automation. The development of automated pipettes, titration stations, auto-injectors, computer-assisted instrumentation, and automation built to support microplate technologies represents just a few places where specific tasks have been addressed. However, these developments haven’t moved people out of the workplace as has happened in manufacturing, mining, and other industries. In some cases they’ve changed the work, replacing repetitive, time-consuming tasks with equipment that allows lab personnel to take on different tasks. In other cases the technology addresses work that couldn’t be performed in a cost-effective manner with human effort; without automation, that work might simply not be feasible, due to the volume of work (whose delivery might be limited by the availability of the right people, equipment, and facilities) or the need to work with hazardous materials. Automation may prevent the need for hiring new people while giving those currently working more challenging tasks.

As noted in the previous paragraph, much of the automation in lab work is at the task level: equipment designed to carry out a specific function such as Karl Fischer titrations. Some equipment designed around microplate formats can function at both the task level and as part of a user-integrated robotics system. This gives the planner useful options for introducing automation, making it easier for personnel to get accustomed to automation before moving into scientific manufacturing.

Overall, laboratory people shouldn’t be losing their jobs as a result of lab automation, but they do have to be open to changes in their jobs, and that could require an investment in their education. Take someone whose current job is to carry out a lab procedure, someone who understands all aspects of the work, including troubleshooting equipment, reagents, and any special problems that may crop up. Someone else may have developed the procedure, but that person is the expert in its execution.

First, you need these experts to help plan and test the automated systems if you decide to undertake such a project. They are also the best people to educate as automated systems managers; they know how the process is supposed to work and should be in a position to detect problems. If the system crashes, you’ll need someone who can cover the work while problems are being addressed. Second, if lab personnel get the idea that they are watching their replacement being installed, they may leave before the automated systems are ready. In the event of a delay, you’ll have a backlog and no one to handle it.

Beyond that, people will be freed from the routine of carrying out processes and will be able to take on work that had been put on the back burner. As we move toward automated systems, jobs will expand to accommodate typical lab work, as well as the management, planning, maintenance, and evolution of laboratory automation and computing.

Automation in lab work is not an "all or none" situation. Processes can be structured so that the routine work is done by systems, and the analyst can spend time reviewing the results, looking for anomalies and interesting patterns, while being able to make decisions about the need for and nature of follow-on efforts.

What is the role of AI and ML in automation?

When we discuss automation here, what we are referencing is basic robotics and programming. AI may, and likely will, play a role in the work, but we have to get the foundations right before we consider the next step; we need to put in the human intelligence first. Part of the issue with AI is that we don’t have a settled definition of what it is.

Science fiction aside, many of today's applications of AI have only a limited role in lab work. Here are some examples:

  • Having a system that can bring up all relevant information on a research question—a sort of super Google, or a variation of IBM’s Watson—could have significant benefits.
  • Analyzing complex data or large volumes of data could be beneficial, e.g., the analysis of radio astronomy data to find fast radio bursts (FRBs). After discovering 21 FRB signals upon analyzing five hours of data, researchers at the Green Bank Telescope used AI to analyze 400 terabytes of older data and detected another 100.[2]
  • "[A] team at Glasgow University has paired a machine-learning system with a robot that can run and analyze its own chemical reaction. The result is a system that can figure out every reaction that's possible from a given set of starting materials."[3]
  • HelixAI is using Amazon's Alexa as a digital assistant for laboratory work.[4]

Note that the points above are research-based applications, not routine production environments where regulatory issues are important. Research applications might be more forgiving of AI systems because the results are evaluated by human intelligence, and problematic results can be subjected to further verification. Data entry systems such as voice entry, by contrast, have to be carefully tested, and the results of that data entry verified and shown to be correct.

Pharma IQ continues to publish material on advanced topics in laboratory informatics, including articles on how labs are benefiting from new technologies[5] and survey reports such as AI 2020: The Future of Drug Discovery. In that report they note[6]:

  • "94% of pharma professionals expect that intelligent technologies will have a noticeable impact on the pharmaceutical industry over the next two years."
  • "Almost one fifth of pharma professionals believe that we are on the cusp of a revolution."
  • "Intelligent automation and predictive analytics are expected to have the most significant impact on the industry."
  • "However, a lack of understanding and awareness about the benefits of AI-led technologies remain a hindrance to their implementation."

Note that these are expectations, not a reflection of current reality. That same report makes comments about the impact of AI on headcount disruption, asking, "Do you expect intelligent enterprise technologies[b] to significantly cut and/or create jobs in pharma through 2020?" Among the responses, 47 percent said they expected those technologies to do both, 40 percent said it will create new job opportunities, and 13 percent said there will be no dramatic change, with zero percent saying they expected solely job losses.[6]

While there are high levels of expectation and hope for results, we need to approach the idea of AI in labs with some caution. We read about examples based on machine learning (ML), such as using computer systems to recognize cats in photos or faces in a crowd. We don’t know how these systems accomplish their tasks, and we can’t analyze their algorithms and decision-making. That leaves us trying to test quality in, which at best is an uncertain process with qualified results ("it has worked so far"). One problem with testing AI systems based on ML is that they continually evolve, so testing may affect the ML processes by introducing a bias. It may also force continued, redundant testing, because something we thought was evaluated has since been changed by the “experiences” the AI based its learning on. As one example, could the AI modify the science through process changes without our knowing, because it didn’t understand the science or the goals of the work?
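
One pragmatic response, sketched below in Python, is to hold back a frozen set of validated cases and re-score it after every model update; if an evolving ML component drifts on cases it once handled correctly, the release is held for human review. The model interface, cases, and threshold here are hypothetical stand-ins.

    # Minimal sketch: guarding an evolving ML component with a frozen
    # regression set. The same held-out cases are re-scored after every
    # model update so that drift shows up before release.

    FROZEN_SET = [                      # fixed at initial validation
        ("sample_001", "pass"),
        ("sample_002", "fail"),
        ("sample_003", "pass"),
    ]

    def regression_check(model, min_accuracy=0.95):
        """Re-score the frozen cases; fail the release if accuracy drops."""
        hits = sum(1 for x, expected in FROZEN_SET if model(x) == expected)
        return hits / len(FROZEN_SET) >= min_accuracy

    # Stand-in for the retrained classifier under evaluation.
    model = lambda x: "fail" if x == "sample_002" else "pass"

    if not regression_check(model):
        raise RuntimeError("model drifted on frozen cases; hold for review")
    print("frozen-set check passed; model cleared for release")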

AI is a black box with ever-changing contents. That shouldn’t be taken as a condemnation of AI in the lab, but rather as a challenge to human intelligence in evaluating, proving, and applying the technology. That application includes defining the operating boundaries of an AI system. Rather than creating a master AI for a complete process, we may elect to divide the AI’s area of operation into multiple, independent segments, with segment integration occurring in later stages once we are confident in their ability to work and have clear evidence of system stability. In all of this we need to remember that our goal is the production of high-quality data and information in a controlled, predictable environment, not gee-whiz technology. One place where AI (or clever programming) could be of use is in better workflow planning, which takes into account current workloads and assignments, factors in the inevitable panic-level testing needs, and, perhaps in a QC/production environment, anticipates changes in analysis requirements based on changes in production operations.
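
A first step toward that kind of workflow planning needn’t involve AI at all. The minimal Python sketch below orders a work queue so that panic-level testing preempts routine work; the test names, priorities, and durations are hypothetical.

    # Minimal sketch: a priority-aware work queue of the kind the text
    # imagines for workflow planning. Lower number = more urgent.

    import heapq

    queue = [
        (2, 1.5, "routine purity assay"),
        (0, 0.5, "panic-level stability check"),   # urgent production issue
        (1, 2.0, "method verification run"),
    ]
    heapq.heapify(queue)

    elapsed = 0.0
    while queue:
        priority, hours, name = heapq.heappop(queue)
        elapsed += hours
        print(f"t+{elapsed:4.1f} h  done: {name} (priority {priority})")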

Throughout this section I've treated “AI” as “artificial intelligence,” its common meaning. There may be a better way of looking at it for lab use, as noted in this excerpt from the October 2018 issue of Wired magazine[7]:

Augmented intelligence. Not “artificial,” but how Doug Engelbart[c] envisioned our relationship with computers: AI doesn’t replace humans. It offers idiot-savant assistants that enable us to become the best humans we can be.

Augmented intelligence (AuI) is a better term for what we might experience in lab work, at least in the near future. It suggests something that is both more realistic and attainable, with a synergism that would make it, and automation, attractive to lab management and personnel: a tool they can work with to improve lab operations, without the specter of something going on that they don’t understand or control. OPUS/SEARCH from Bruker might be just such an entry in this category.[8] AuI may serve as a first-pass filter for large data sets, as in the radio astronomy and chemistry examples noted earlier, reducing those sets of data and information to smaller collections that human intelligence can and should evaluate. However, that does put a burden on the AuI to avoid excessive false positives or negatives, something that can be adjusted over time.
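
The first-pass filter idea can be made concrete with a short Python sketch: score every record and pass only those above a threshold to a human reviewer. The scoring function here is a hypothetical stand-in for a trained model, and the threshold is the knob that trades false positives (wasted review time) against false negatives (missed findings).

    # Minimal sketch: an AuI-style first-pass filter over a large data set.
    # Only records scoring above the threshold reach a human reviewer.

    def first_pass_filter(records, score, threshold):
        """Return the records interesting enough to merit human review."""
        return [r for r in records if score(r) >= threshold]

    records = [{"id": i, "signal": s}
               for i, s in enumerate([0.1, 0.7, 0.4, 0.9])]
    score = lambda r: r["signal"]      # stand-in for a model's confidence

    for_review = first_pass_filter(records, score, threshold=0.6)
    print(f"{len(for_review)} of {len(records)} records flagged for review")
    # Lowering the threshold cuts false negatives but adds review workload.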

Beyond that, there is the possibility of more cooperative work between people and AuI systems. An article in Scientific American titled “My Boss the Robot”[9] describes the advantage of a human-robot team, with the robot doing the heavy work and the human, under the robot’s guidance, doing the work he was more adept at, versus a team of experts with the same task. The task, welding a Humvee frame, was completed by the human-machine pair in 10 hours at a cost of $1,150; the team of experts took 89 hours at a cost of $7,075. That might translate into laboratory terms by having a robot do routine, highly repetitive tasks while the analyst oversees the operation and does higher-level analysis of the results.


Footnotes

  a. The term "scientific manufacturing" was first mentioned to the author by Mr. Alberto Correia, then of Cambridge Biomedical, Boston, MA.
  b. Intelligent enterprise technologies referenced in the report include robotic process automation, machine learning, artificial intelligence, the internet of things, predictive analysis, and cognitive computing.
  c. Doug Engelbart founded the field of human-computer interaction and is credited with inventing the computer mouse and presenting the “Mother of All Demos” in 1968.

About the author

Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial), LIMS, robotics, and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked and consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on helping companies establish planning programs for lab systems, developing effective support groups, and applying automation and information technologies in research and quality control environments.

References

  1. Frey, C.B.; Osborne, M.A. (17 September 2013). "The Future of Employment: How Susceptible Are Jobs to Computerisation?" (PDF). Oxford Martin School, University of Oxford. https://www.oxfordmartin.ox.ac.uk/downloads/academic/The_Future_of_Employment.pdf. Retrieved 04 February 2021. 
  2. Hsu, J. (24 September 2018). "Is it aliens? Scientists detect more mysterious radio signals from distant galaxy". NBC News MACH. https://www.nbcnews.com/mach/science/it-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586. Retrieved 04 February 2021. 
  3. Timmer, J. (18 July 2018). "AI plus a chemistry robot finds all the reactions that will work". Ars Technica. https://arstechnica.com/science/2018/07/ai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work/5/. Retrieved 04 February 2021. 
  4. "HelixAI - Voice Powered Digital Laboratory Assistants for Scientific Laboratories". HelixAI. http://www.askhelix.io/. Retrieved 04 February 2021. 
  5. PharmaIQ News (20 August 2018). "Automation, IoT and the future of smarter research environments". PharmaIQ. https://www.pharma-iq.com/pre-clinical-discovery-and-development/news/automation-iot-and-the-future-of-smarter-research-environments. Retrieved 04 February 2021. 
  6. PharmaIQ (14 November 2017). "The Future of Drug Discovery: AI 2020". PharmaIQ. https://www.pharma-iq.com/pre-clinical-discovery-and-development/whitepapers/the-future-of-drug-discovery-ai-2020. Retrieved 04 February 2021. 
  7. Wired (October 2018). October 2018 issue. Condé Nast. 
  8. "OPUS Package: SEARCH & IDENT". Bruker Corporation. https://www.bruker.com/en/products-and-solutions/infrared-and-raman/opus-spectroscopy-software/search-identify.html. Retrieved 04 February 2021. 
  9. Bourne, D. (2013). "My Boss the Robot". Scientific American 308 (5): 38–41. doi:10.1038/scientificamerican0513-38. PMID 23627215.