Journal:Technology transfer and true transformation: Implications for open data

Full article title Technology transfer and true transformation: Implications for open data
Journal Data Science Journal
Author(s) Bezuidenhout, Louise
Author affiliation(s) University of Oxford
Primary contact Email: louise dot bezuidenhout at insis dot ox dot ac dot uk
Year published 2017
Volume and issue 16
Page(s) 26
DOI 10.5334/dsj-2017-026
ISSN 1683-1470
Distribution license Creative Commons Attribution 4.0 International
Website https://datascience.codata.org/articles/10.5334/dsj-2017-026/
Download https://datascience.codata.org/articles/10.5334/dsj-2017-026/galley/678/download/ (PDF)

Abstract

When considering the “openness” of data, it is unsurprising that most conversations focus on the online environment—how data are collated, moved, and recombined for multiple purposes. Nonetheless, it is important to recognize that these online movements are only part of the data lifecycle. Indeed, considering where and how data are created—namely, the research setting—is of key importance to open data initiatives. In particular, such insights offer key understandings of how and why scientists engage in practices of openness, and how data transition from personal control to public ownership.

This paper examines research settings in low/middle-income countries (LMIC) to better understand how resource limitations influence open data buy-in. Using empirical fieldwork in Kenyan and South African laboratories, it draws attention to some key issues currently overlooked in open data discussions. First, many of the hesitations raised by the scientists about sharing data were as much tied to the speed of their research as to any other factor. Thus, it would seem that the longer it takes for individual scientists to create data, the more hesitant they are about sharing it. Second, the pace of research is a multifaceted bind involving many different challenges relating to laboratory equipment and infrastructure. Indeed, it is unlikely that one single solution (such as equipment donation) will ameliorate these “binds of pace.” Third, these “binds of pace” were used by the scientists to construct “narratives of exclusion” through which they remove themselves from responsibility for data sharing.

Using an adapted model of technology first proposed by Elihu Gerson, the paper then offers key ways in which these critical “binds of pace” can be addressed in open data discourse. In particular, it calls for an expanded understanding of laboratory equipment and research speed to include all aspects of the research environment. It also advocates for better engagement with LMIC scientists regarding these challenges and the adoption of frugal/responsible design principles in future open data initiatives.

Keywords: technology, low/middle-income countries, data sharing, research, pace

Introduction

The issue of increasing the openness of data online is a global priority. Indeed, open data is increasingly featuring on agendas of both high- and low/middle-income country development plans.[1] Nevertheless, data sharing in low/middle-income countries (LMICs) is challenged by a number of widely-recognized issues. These include a lack of resources for sharing activities[2] as well as for research activities more generally. Strategically increasing research capacity in LMICs—and thus the ability of LMIC researchers to participate in the open data movement—is intrinsically tied (at least in part) to the need for increasing the availability of laboratory and ICT equipment.

Unpacking the links between laboratory equipment and open data

It is recognized that the lack of up-to-date laboratory equipment hampers not only the ability to conduct certain types of research, but also has an overall impact on the pace and efficiency of research. How best to address this lack of physical research resources is becoming a topic for directed intervention, and a number of different organizations have been set up to address issues relating to equipment provision. These include databases of equipment[a], equipment donation schemes[b], and equipment collaborations, as well as increased equipment budgets in many funded grants.[c]

Despite the value of these initiatives, a coordinated and sustained approach to research equipment in LMICs remains elusive for two key reasons. First, a lack of empirical evidence detailing the contextual heterogeneity of LMIC research environments challenges targeted interventions. Second, the absence of LMIC scientists in more general discussions on scientific research practices makes it difficult to pinpoint key issues that may be prevalent within these research settings. Thus, capacity building initiatives are often challenged by the absence of a clear picture of what equipment is needed and how it is best deployed in LMIC regions. It is therefore highly possible that other interventions are critically needed if this resource shortfall is to be effectively addressed.

The challenges of increasing research capacity through equipment-related interventions have far-reaching implications for LMIC research. In this special edition, and in related papers[3][4][5], we argue for a stronger connection between the discussions of open data and the research environment in which data are generated. The physical—as well as the social and regulatory aspects of research environments—influences how scientists are able to create, curate, and disseminate data, and thus the ability of scientists to contribute and re-use data online. Moreover—and often overlooked—the characteristics and challenges of personal research environments can influence the importance that scientists attach to the open data movement.[3][4][5]

Nonetheless, many discussions on open data lack robust discussion of the influence of the physical research environment on data engagement activities. This paper examines this issue in more detail through four interlinking questions. First, to what extent do issues relating to technology affect the pace of research in these laboratories? Second, could these issues of pace be ameliorated by the directed provision of more equipment—particularly high-level, specialized machinery? Third, how can reflecting on issues to do with technology contribute towards more inclusive discussion surrounding open data? Finally, how can a better understanding of research technologies enable more contextually-sensitive discussions about data engagement?

In order to unpack these questions in detail, the paper discusses qualitative fieldwork conducted in four African laboratories between 2014 and 2015. This fieldwork was designed to investigate data engagement activities among scientists working in resource-limited environments. From these interviews, the paper highlights how issues of data engagement and issues of equipment provision were inextricably intertwined and often interdependent. If these issues are to be effectively addressed in open data discussions, the paper suggests that an expanded definition of “research technologies” is necessary. Using a model proposed by Elihu Gerson, the paper then offers key ways in which the critical issues of technological contextuality can be effectively implemented into open data discourse.

It's not just the equipment

When considering laboratory equipment and research it is tempting to make the assumption that more—and newer—equipment leads to more productive research that is conducted at a faster speed with increased outputs (such as data). Indeed, such assumptions drive many of the equipment-focused initiatives mentioned above. Similarly, it is tempting to extend such assumptions to open data conversations. If more equipment will facilitate the faster production of increased amounts of data, the argument would go, then scientists will be more able (and willing) to share their data online.

While these arguments make a compelling case, examination of the current status quo indicates a need for caution. Indeed, if the causal links between equipment provision, increased research pace, and improved open outputs were that straightforward, data sharing should be markedly increased by the provision of (any) laboratory equipment. Such questions motivated a period of embedded fieldwork in Kenya and South Africa between 2014 and 2015. I wanted to examine how scientists in low-resourced research settings engaged in open data activities and discussions—and whether their physical laboratory environment had any influence over this engagement.[d] Over the course of the year I spent three to six weeks in four different chemistry laboratories and conducted 56 semi-structured interviews with researchers and postgraduate students to find out what was working in their research environments, and what challenged their ability to generate, curate, store, share, and re-use data online.

Upon analyzing the interviews, the issue of pace in research was unavoidable. Indeed, it was everywhere. Concerns about the slowness of research, and the pressure to speed it up, pervaded how the scientists talked about their research, valued their data, identified threats to their sovereignty and acquisition of credit, positioned themselves within the scientific community, and evaluated the international community’s efforts to assist them. These issues have been discussed in other papers[3][4][5] and will not be covered here. Instead, this paper takes a step back to ask why there was this overwhelming awareness of pace in these laboratories: what aspects of the laboratory equipment played key roles in controlling the pace of research, and consequently the engagement of scientists in open data activities?

The equipment is ...

The laboratories that I visited were not members of high-profile consortia or integrated into well-funded foreign research networks. Rather, they were good examples of home-grown science. They produced high-quality research but were dependent on funding from multiple national and international sources. Moreover, their facilities—and the budget to maintain or upgrade them—were provided by their host institutions. This created a bind for the researchers, as the facilities provided were often minimal and/or badly maintained, and their institutions did not have large amounts of “core funding” for upgrades. As one Kenyan participant said:

We get no funding from the government. We get paid from the government, we get bills of power and water by the government but otherwise, other than that, the materials that we need for research we have to source from funding agencies. (KY1:8)

Similarly, as most of the funding for their research came from project-specific grants, the researchers had few opportunities to secure money for standard laboratory equipment or general laboratory maintenance. A participant in South Africa put it eloquently when talking about her research:

[it] is a challenge because the university doesn’t offer a start-up fund for equipment. … I would need to pay bit by bit and one by one. When I have funding then buy one piece of equipment and maybe after five years I would have my lab. (SA2:11)

Moreover, even when the money was there, many of the participants said that they experienced problems accessing it, or using it to address the challenges that they identified in their daily research environment. This is evident in a quote by another South African participant who said:

It’s really bad – the bureaucracy of it. It’s how the money is transferred, technical services, procurement, all those … but those are like “grand problems” that you can’t solve. (SA2:6)

Thus, a lot of the discussions I had about research and data engagement became discussions about equipment and research environments. The researchers I interviewed highlighted a number of key issues that affected the pace of their research in comparison (in their opinion) to well-resourced laboratories. In particular, their statements related to the “un-usability” of the equipment available to them. These statements are broadly grouped under the headings below.

... not there ...

One of the most common complaints I heard in all four laboratories was that the equipment available for research curtailed the types of research that could be done by the researchers. While this is, of course, an issue for scientists around the world, for many of the researchers that I interviewed this was almost a deal-breaking aspect of their research plans. As one Kenyan participant observed:

the lack of equipment limits the extent to which you can do research – and even the type of research that you want to do. And you ask yourself, ok, so I want to do this kind of research but do I have the machinery? (KY2:3)

Similarly, a participant from the other Kenyan site said:

[o]ur labs are not even there for synthesis – synthetic work – the environment is not there. So when it comes to that I either have to skip it or I have to go to a lab that has such facilities. (KY1:3)

These constraints not only shaped the research being conducted in these environments, but they also necessitated that a number of researchers change the direction of their research in order to fit in with the equipment available. Particularly in Kenya, there were a number of lecturers and professors who had done postgraduate training in the U.K. or U.S. but were unable to capitalize on that research experience back home. This was described by one Kenyan professor who said:

the kind of research which is taking place here is a bit different from what I was doing – like in the UK I was doing synthetic organic chemistry. And the kind of equipment and the rest, it was purely on silicone chemistry and the reagents and the rest I couldn’t get them here. So what I had to do was to look for things which are relevant for this institution. (KY1:1)

In addition to shaping the types—and thus the broad pace—of research, the lack of equipment also had an impact on the daily pace of research activities in the laboratory. This is evident in the exchange below, where the participant (a postgraduate student) explains day-to-day practices within the laboratory. In particular, he highlights how sharing basic equipment plays a highly influential role on how much he can work on a day-to-day basis, and thus how much data he can produce. As there were six postgraduate students sharing one evaporator, one can only imagine their frustration.

Participant: The solvents and reagents we have all, but the equipment–some equipments are missing. But we do the best we can.
LB: And with so many in the lab there must be high competition to use the equipment.

Footnotes

  a. Such as the EPSRC’s database https://equipment.data.ac.uk/ (discussed later)
  b. Such as Seeding Labs (discussed later)
  c. For example, see http://www.esrc.ac.uk/funding/guidance-for-applicants/changes-to-equipment-funding/
  d. A full description of the methodology is given in the appendix.

References

  1. Schwegmann, C. (February 2013). "Open Data in Developing Countries" (PDF). EPSI Platform. https://www.europeandataportal.eu/sites/default/files/2013_open_data_in_developing_countries.pdf. Retrieved 02 May 2017. 
  2. Bull, S. (October 2016). "Ensuring Global Equity in Open Research". Wellcome Trust. doi:10.6084/m9.figshare.4055181. https://figshare.com/articles/Review_Ensuring_global_equity_in_open_research/4055181. Retrieved 02 May 2017. 
  3. Bezuidenhout, L.; Kelly, A.H.; Leonelli, S.; Rappert, B. (2016). "‘$100 Is Not Much To You’: Open Science and neglected accessibilities for scientific research in Africa". Critical Public Health 27 (1): 39–49. doi:10.1080/09581596.2016.1252832. 
  4. Bezuidenhout, L.; Rappert, B. (2016). "What hinders data sharing in African science?". Fourth CODESRIA Conference on Electronic Publishing: 1–13. http://www.codesria.org/spip.php?article2564&lang=en. 
  5. Bezuidenhout, L.; Leonelli, S.; Kelly, A.H.; Rappert, B. (2017). "Beyond the digital divide: Towards a situated approach to open data". Science and Public Policy 44 (4): 464–75. doi:10.1093/scipol/scw036. 

Notes

This presentation is faithful to the original, with only a few minor changes to presentation. In some cases important information was missing from the references, and that information was added. The original article lists references alphabetically, but this version—by design—lists them in order of appearance. Footnotes have been changed from numbers to letters as citations are currently using numbers. "Bezuidenhout et al forthcoming" (from the original) has since been published, and this version includes the updated citation.