Journal:Public health informatics, human factors, and the end-users

Full article title Public health informatics, human factors, and the end-users
Journal Health Services Research and Managerial Epidemiology
Author(s) Matthews, Sarah D.; Proctor, Michael D.
Author affiliation(s) University of Central Florida
Primary contact Email: sarah dot matthews at knights dot ucf dot edu
Year published 2021
Volume and issue 8
Article # 23333928211012226
DOI 10.1177/23333928211012226
ISSN 2333-3928
Distribution license Creative Commons Attribution-NonCommercial 4.0 International


There is an unspoken assumption in public health informatics that “if you build it, they will come.” In this commentary, we argue that building it is not enough. Without an end-user focus on human factors issues, identified and resolved prior to implementation, “they may come, but they won’t stay!” We argue that to engage public health professionals with new innovative technology, one must continually ask during the development process, “who are we building this product for, and do we have the right information to back up our theories on implementation and use?” With the myriad of public health informatics tools introduced amid the COVID-19 pandemic, there are now many choices. For those languishing, we note that this question may not have been sufficiently pursued, resulting in situations where “they may come, but they won’t stay!”

Keywords: public health informatics, public health professionals, Technology Acceptance Model, Health Belief Model, human factors research

Public health informatics evolution and COVID-19

Over two decades ago, Yasnoff et al.[1] defined the discipline of public health informatics (PHI) as the “systematic application of information and computer science and technology to public health practice, research, and learning.”[1] The prevailing impression at that time was that all stakeholders must be engaged in coordinated activities related to PHI, but the public health workforce was deemed not to have the training and experience to make decisions about IT.[2] If public health professionals were left out of PHI decision making, is it any surprise that this practice resulted in high-risk technology failures in some cases and slow adoption of the technology in others?[2] Besides bringing public health professionals into the decision-making circle, what can be done to help PHI flourish?

In the last two decades, we have seen a myriad of innovative technologies applied to public health practice in areas such as disease surveillance, immunization registries, electronic health record (EHR) integration, vital statistics, etc. Public health organizations such as the Association of State and Territorial Health Officials (ASTHO), Public Health Informatics Institute (PHII), and Centers for Disease Control and Prevention (CDC) recognized that the decentralized United States public health system is enormous in scale and immense in diversity, making it difficult to implement innovative technology successfully.[3][4][5] Complicating successful implementations are informatics factors and organizational culture barriers that do not allow for evidence-based public health to be implemented in practice.[2][6]

To help public health leaders understand these digital technologies and make informed decisions on technology integration, ASTHO, PHII, and the CDC have created several reports and frameworks.[7][8][9] These documents are filled with standards; discussion of security, confidentiality, and privacy; system architectures and infrastructure; training; and workforce development. They are detailed about the technology and its implementation, with recommendations such as overhauling computer systems, changing operability, and upgrading hardware and software. These guides and frameworks primarily define the workforce in the domain of technical skills. One example is the CDC’s roadmap for public health informatics and data modernization, which identifies the need for a future workforce with stronger skills in data science, analytics, modeling, and informatics.

Clearly, improved technical skills are a necessary condition, but we argue that they alone are insufficient to the challenge at hand. Kaplan and Harris-Salamone[10] report that across industries (including healthcare) there is a failure rate of 40% or greater for generic IT projects. These failures are largely attributed to budget overruns, timeline overruns, under-delivery of value, and termination of projects before completion. They also emphasize the three major reasons for project success: user involvement, proactive executive management support, and a clear requirements statement.[10] Reviewing the CDC’s roadmap reveals insufficient emphasis on, if not a complete omission of, these important human factors concepts, such as the perceptions of the public health professional workforce, their attitude, and their motivation for accepting and using new technology.[7]

With a transdisciplinary approach, we studied public health professionals across the United States to understand their technology use behavior and their health behavior toward the use of an agent-based online personalized intelligent tutoring system.[11] Thus, we believe our findings can be extrapolated and should be applied with prior evidence-based interventions to increase innovative technology project success and retention for public health practice applications.

Human factors research in public health practice

In our study, we reaffirmed findings of prior studies that the biggest barriers to the user were time constraints and technology barriers such as firewalls not allowing cloud-based applications, slow loading, system compatibility, specific state requirements, and interoperability across devices.[10][11] But by combining the theoretical frameworks of the Public Health Services Health Belief Model (HBM)[12] and Davis’ Technology Acceptance Model (TAM)[13], we also discovered less emphasized insights.

The HBM hypothesizes that health-related action depends upon three factors occurring simultaneously[14][15]:

  1. the existence of sufficient motivation to make the health issue relevant;
  2. the belief that someone is susceptible to a serious health problem or the sequelae of that illness or condition (i.e., perceived threat); and
  3. the belief that following the health recommendation/regime would be beneficial in reducing the perceived threat.

The HBM is composed of four constructs: perceived susceptibility, perceived severity/seriousness, perceived benefits to taking action, and perceived barriers to taking action. These constructs are applied to the individual’s cues to action.[14][15] Our results revealed that public health professionals were sufficiently motivated by the health-related state posed, believed that their community was susceptible to a serious health problem and its sequelae, and believed that using the technology would be beneficial in reducing the threat of illness to the community. The most influential construct in the HBM, however, was the various cues to action. Thus, when developing new technology, public health professionals must believe that use of the new technology will improve their confidence in the work they do. Technology influencers were others from the public health domain, including colleagues. Finally, we note that, to achieve success, the technology must be taught in a self-paced environment.
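As a rough illustration of how the four HBM constructs are operationalized, they are typically measured with Likert-scale survey items and summarized per construct. The sketch below is a minimal, hypothetical example; the item groupings and response values are invented for illustration and are not taken from the study.

```python
# Minimal, hypothetical sketch of scoring the four HBM constructs from
# Likert-scale survey items (1 = strongly disagree ... 5 = strongly agree).
# The item responses below are invented for illustration only.
from statistics import mean

responses = {
    "perceived_susceptibility": [4, 5, 4],
    "perceived_severity":       [5, 4, 4],
    "perceived_benefits":       [5, 5, 4],
    "perceived_barriers":       [2, 3, 2],  # high scores here work against adoption
}

# Average the items within each construct to get a construct score.
construct_scores = {c: mean(items) for c, items in responses.items()}

for construct, score in sorted(construct_scores.items(),
                               key=lambda kv: kv[1], reverse=True):
    print(f"{construct}: {score:.2f}")
```

A real instrument would use validated item wording and more respondents; the point of the sketch is only that each construct yields a comparable numeric score that can be ranked against the others.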

TAM is another important technique widely used in industries outside health care, explaining 30% to 40% of IT acceptance.[16][17] TAM is composed of four concepts: attitude, perceived ease of use, perceived usefulness, and intention for use. Of these concepts, attitude and perceived ease of use were found to be the most influential for actual use. End users’ attitude is measured by whether they think it is a good idea to use the technology, like the idea of using it, and find use of the technology a pleasant experience. In our study, the most influential concept toward use of new technology was perceived ease of use. Perceived ease of use is measured by how easy the technology is to operate, how well the technology does what is expected, how clear and understandable it is to use, how flexible it is to interact with, and how easy it is to become skillful at using the technology.

What does this mean and how do we apply it? Consider two COVID-19 case management tool examples: one using the MITRE Sara Alert product and the other Microsoft ARIAS/Dynamics. MITRE boasts of its development in partnership with key public health partner organizations, with much focus on the technical and functionality aspects of the tool.[18] Currently, eight states have implemented the system to help with contact tracing efforts for COVID-19, reducing the staffing and resources needed to conduct active monitoring.[9] However, there remains the burden of considerable workload in enrolling contacts, directly monitoring non-participatory contacts, and following up on non-responders, as well as duplicative data entry into existing state data systems.[19] Customization is limited, which creates operational issues across states. Cases are purged two weeks after isolation and quarantine orders are closed, leaving states to develop a process to export data to retain for historical metrics.[18]

Microsoft’s ARIAS/Dynamics has been implemented by nine states.[9] The Oregon Health Authority acknowledges in its contact tracing training that the software requires technical skills and access to equipment. Additionally, because of the limitation of the English-language-only option, it is co-creating a system that serves other demographics in the state.[20] The system also requires the Firefox or Chrome browser, as it is not fully supported by Internet Explorer or Safari, the two browsers most frequently used in governmental public health.[20] In the limited documentation on these systems, there is no mention of technology acceptance, usability, or ease of use. There is no published literature on how suitable the end-users feel the technology is for their jobs. This lack of human factors research leaves one to believe that the implementation of these novel technologies is merely reactionary, and after the COVID-19 response, the investment in these automated systems will likely be left to waste like so many other IT projects.

IT projects in the public health domain cannot continue to slight human factors; they should be proactive, focusing not only on the technology aspects of a project but also on using the aforementioned techniques to shape one’s approach to human factors implementation. Public health informatics leaders cannot continue to account for public health professionals only in the workforce development sections of their implementation agenda. These end-users must be included in the structural research prior to implementation. Human factors research theories and concepts must be included in the frameworks and guides; otherwise, these innovative approaches will likely continue to lead to the abysmally high percentage of technology failures.


Although we critique the current process of public health informatics implementation, we do believe that the myriad of projects introduced amid the COVID-19 response can be sustained and accepted after the response. As such, we recommend the following:

  1. Developers, public health informatics leaders, and scientists must collaborate, including human factors researchers and the public health end-user in discussions.
  2. Those stakeholders should collect data with theoretically informed and empirically validated tools on the end-user’s perceptions, attitude, and motivation for using the new technology and their acceptance of its use.[11]
  3. Those stakeholders must also document enhancements of, fixes to, barriers to, and best practices for using the technology during implementation.
  4. Finally, they must review and analyze relevant data to help create clear technology requirements statements for future projects.



Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Article processing charges were provided in part by the UCF College of Graduate Studies Open Access Publishing Fund.

Conflict of interest

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.


  1. 1.0 1.1 Yasnoff, William A.; OʼCarroll, Patrick W.; Koo, Denise; Linkins, Robert W.; Kilbourne, Edwin M. (2000). "Public Health Informatics: Improving and Transforming Public Health in the Information Age" (in en). Journal of Public Health Management and Practice 6 (6): 67–75. doi:10.1097/00124784-200006060-00010. ISSN 1078-4659. 
  2. 2.0 2.1 2.2 Yasnoff, W. A.; Overhage, J. M.; Humphreys, B. L.; LaVenture, M. (1 November 2001). "A National Agenda for Public Health Informatics: Summarized Recommendations from the 2001 AMIA Spring Congress" (in en). Journal of the American Medical Informatics Association 8 (6): 535–545. doi:10.1136/jamia.2001.0080535. ISSN 1067-5027. PMC 130064. PMID 11687561. 
  3. Beck, Angela J.; Boulton, Matthew L.; Coronado, Fátima (1 November 2014). "Enumeration of the Governmental Public Health Workforce, 2014" (in en). American Journal of Preventive Medicine 47 (5): S306–S313. doi:10.1016/j.amepre.2014.07.018. PMC 6944190. PMID 25439250. 
  4. Hilliard, Tracy M.; Boulton, Matthew L. (1 May 2012). "Public Health Workforce Research in Review" (in en). American Journal of Preventive Medicine 42 (5): S17–S28. doi:10.1016/j.amepre.2012.01.031. 
  5. Tao, Donghua; Evashwick, Connie J.; Grivna, Michal; Harrison, Roger (19 February 2018). "Educating the Public Health Workforce: A Scoping Review". Frontiers in Public Health 6: 27. doi:10.3389/fpubh.2018.00027. ISSN 2296-2565. PMC 5826052. PMID 29515988. 
  6. Novick, Lloyd F. (1 September 2020). "JPHMP and The Guide to Community Preventive Services" (in en). Journal of Public Health Management and Practice 26 (5): 399–400. doi:10.1097/PHH.0000000000001217. ISSN 1078-4659. 
  7. 7.0 7.1 Centers for Disease Control and Prevention (9 April 2021). "Data Modernization Initiative". Public Health Surveillance and Data. Centers for Disease Control and Prevention. 
  8. ASTHO (July 2020). "Issue Guide: COVID-19 Case Investigation and Contact Tracing - Considerations for Using Digital Technologies" (PDF). ASTHO. 
  9. 9.0 9.1 9.2 Public Health Informatics Institute (June 2020). "Digital Tools to Support Contact Tracing: Tool Assessment Report" (PDF). Public Health Informatics Institute. 
  10. 10.0 10.1 10.2 Kaplan, B.; Harris-Salamone, K. D. (1 May 2009). "Health IT Success and Failure: Recommendations from Literature and an AMIA Workshop" (in en). Journal of the American Medical Informatics Association 16 (3): 291–299. doi:10.1197/jamia.M2997. ISSN 1067-5027. PMC 2732244. PMID 19261935. 
  11. 11.0 11.1 11.2 Matthews, S.D.; Proctor, M.D. (2021). "Can Public Health Workforce Competency and Capacity Be Built through an Agent-Based Online, Personalized Intelligent Tutoring System?". Educational Technology & Society 24 (1): 29–43. 
  12. Rosenstock, Irwin M. (1 December 1974). "Historical Origins of the Health Belief Model" (in en). Health Education Monographs 2 (4): 328–335. doi:10.1177/109019817400200403. ISSN 0073-1455. 
  13. Davis, Fred D. (1 September 1989). "Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology". MIS Quarterly 13 (3): 319–340. doi:10.2307/249008. 
  14. 14.0 14.1 Rosenstock, Irwin M.; Strecher, Victor J.; Becker, Marshall H. (1 June 1988). "Social Learning Theory and the Health Belief Model" (in en). Health Education Quarterly 15 (2): 175–183. doi:10.1177/109019818801500203. ISSN 0195-8402. 
  15. 15.0 15.1 University of Twente (2004). "3.2 Health Belief Model". Communication Theories. University of Twente. pp. 37–40. 
  16. Holden, Richard J.; Karsh, Ben-Tzion (1 February 2010). "The Technology Acceptance Model: Its past and its future in health care" (in en). Journal of Biomedical Informatics 43 (1): 159–172. doi:10.1016/j.jbi.2009.07.002. PMC 2814963. PMID 19615467. 
  17. Legris, Paul; Ingham, John; Collerette, Pierre (1 January 2003). "Why do people use information technology? A critical review of the technology acceptance model" (in en). Information & Management 40 (3): 191–204. doi:10.1016/S0378-7206(01)00143-4. 
  18. 18.0 18.1 MITRE Corporation (2020). "Sara Alert". MITRE Partnership Network. 
  19. Krueger, Anna; Gunn, Jayleen K. L.; Watson, Joanna; Smith, Andrew E.; Lincoln, Rebecca; Huston, Sara L.; Dirlikov, Emilio; Robinson, Sara (7 August 2020). "Characteristics and Outcomes of Contacts of COVID-19 Patients Monitored Using an Automated Symptom Monitoring Tool — Maine, May–June 2020". MMWR. Morbidity and Mortality Weekly Report 69 (31): 1026–1030. doi:10.15585/mmwr.mm6931e2. ISSN 0149-2195. PMC 7454893. PMID 32759918. 
  20. 20.0 20.1 Oregon Health Authority (24 August 2020). "ARIAS System Training" (PDF). Oregon Health Authority. 


This presentation is faithful to the original, with only a few minor changes to presentation. Some grammar and paragraph spacing was updated for improved readability. In some cases important information was missing from the references, and that information was added.