Ten simple rules for cultivating open science and collaborative R&D

Full article title: Ten simple rules for cultivating open science and collaborative R&D
Journal: PLOS Computational Biology
Author(s): Masum, Hassan; Rao, Aarthi; Good, Benjamin M.; Todd, Matthew H.; Edwards, Aled M.; Chan, Leslie; Bunin, Barry A.; Su, Andrew I.; Thomas, Zakir; Bourne, Philip E.
Author affiliation(s): Waterloo Institute for Complexity and Innovation, Results for Development Institute, Scripps Research Institute, University of Sydney, University of Toronto, Collaborative Drug Discovery, Scripps Research Institute, Council of Scientific and Industrial Research, University of California San Diego
Primary contact: Email: hassan dot masum at utoronto dot ca
Year published: 2013
Volume and issue: 9(9)
Page(s): e1003244
DOI: 10.1371/journal.pcbi.1003244
ISSN: 1553-7358
Distribution license: Creative Commons Attribution 4.0 International


How can we address the complexity and cost of applying science to societal challenges?

Open science and collaborative R&D may help.[1][2][3] Open science has been described as "a research accelerator."[4] Open science implies open access[5] but goes beyond it: "Imagine a connected online web of scientific knowledge that integrates and connects data, computer code, chains of scientific reasoning, descriptions of open problems, and beyond ... tightly integrated with a scientific social web that directs scientists' attention where it is most valuable, releasing enormous collaborative potential."[1]

Open science and collaborative approaches are often described as "open source," by analogy with open-source software such as the operating system Linux, which powers Google and Amazon: collaboratively created software that is free to use and adapt, and popular for internet infrastructure and scientific research.[6][7] However, this usage of "open source" is ambiguous. Some people use "open source" when a project's results are free to use; others, when a project's process is highly collaborative.[4]

It is clearer to classify open source and open science within a broader class of collaborative R&D, which can be defined as scalable collaboration (usually enabled by information technology) across organizational boundaries to solve R&D challenges.[8]

Many approaches to open science and collaborative R&D have been tried.[1][9] The Gene Wiki has created over 10,000 Wikipedia articles, and aims to provide one for every notable human gene.[10] The crowdsourcing platform InnoCentive has reportedly facilitated solutions to roughly half of the thousands of technical problems posed on the site, including many in the life sciences, such as the $1 million ALS Biomarker Prize.[11] Other examples include prizes (X-Prize[12]), scientific games (FoldIt[13]), and licensing schemes inspired by open-source software (BIOS[14]).

Collaborative R&D approaches vary in openness.[15] In some approaches, the R&D process and outputs are open to all — for example, open-science projects like the Gene Wiki described above. In other approaches, which demonstrate what might be called controlled collaboration, there are strong controls on who contributes and who benefits — for example, computational platforms like Collaborative Drug Discovery or InnoCentive that support both commercial and nonprofit research.[9][11]

Collaborative approaches can unleash innovation from unforeseen sources, as with crowdsourcing health technologies.[11][12][13][16] They may help in global challenges like drug development[17], as with India's OSDD (Open Source Drug Discovery) project that recruited over 7,000 volunteers[16] and an open-source drug synthesis project that improved an existing drug without increasing its cost.[18]

If you want to apply open science and collaborative R&D, what principles are useful? We suggest ten simple rules for cultivating open science and collaborative R&D. We also offer eight conversational interviews exploring life experiences that led to these rules (see Box 1 at the end).

Rule 1: Get the incentives right — Learn from the past

Why should contributors take part in your project? Learn from incentives that have worked in mass collaborations and open-source software, such as reputation building, enjoyment, cooperatively solving interesting problems that are too hard to do alone, and jointly developing tools that benefit all developers.[6][7][19] Organizational incentives can include lowering costs, tapping external innovation, implementing novel business models such as selling complementary services, and jointly competing for public admiration or grant funding. Altruism can motivate collaboration, but frequently it is not the main reason.[9] With this in mind, align individual incentives with collective benefit.[1] Look to past and present precompetitive collaborations for ways to address intellectual property and competitive concerns.[3] Share attribution with contributors so they can advance their goals and demonstrate their capabilities.

Rule 2: Make your controlled collaborations win-win-win

Completely open science may seem unsuitable to you if, for example, you are engaged in market-driven R&D that must recoup investments. There are ways to benefit from open science and collaborative methods while retaining appropriate controls and the opportunity to provide public benefit. You, your partners, and the public can all benefit — a win-win-win situation. You might use computational platforms to supercharge information sharing with selected partners, including public-benefit initiatives that match your mission.[9] You might use crowdsourcing to overcome roadblocks by opening up chosen parts of your R&D process to new innovators.[11] Or you might make selected data or software tools public, exporting them to the open-source realm to gain from goodwill or quality improvement.[3] Sharing can make both business and social sense, whether in implementing open standards, collaborating precompetitively, or reducing duplication of effort.[20] Keep an eye open for opportunities to "do well by doing good" by structuring initiatives for private and public benefit.[21] Collaborative approaches can benefit both public and private sectors in collaborating across competitive boundaries, connecting problems with problem solvers, and cultivating a knowledge commons.[1][9]

Rule 3: Understand what works — And what doesn't

You can save yourself frustration by not using an unsuitable collaborative method, be it a wiki without an audience or a crowdsourced research challenge without focus.[8] Consider questions like these: Have you learned from others who have tried the method? Do you understand when the method fails, and what is necessary for it to work? Is there a good match between the method and your goals? Are you contributing your experiences and interesting failures back to the community, thus demonstrating thought leadership? If you are interested in more effective knowledge sharing, consider low-budget opportunities such as starting an online Q&A site about open science or collaborative R&D using a platform like StackExchange. There are also opportunities to help evaluate what really works: moving beyond anecdotal evidence to case studies and metrics.

Rule 4: Lead as a coach, not a CEO

The command-and-control style doesn't work well with contributors from diverse organizations, many of whom may be volunteers.[22] And as has been said of Linus Torvalds, the creator of the open-source operating system Linux, "Linus doesn't scale": leaders of mass collaborations can become bottlenecks unless they encourage distributed workflows and leadership.[7] Be flexible about management (but strict about quality). Check your ego at the door — you're playing a team game and will be stronger when others want to contribute. Participants will feel more motivated if their contribution enriches a joint resource rather than just the leader. Can you give up exclusive ownership and credit to achieve with others what you cannot achieve alone?

Rule 5: Diversify your contributors

A powerful aspect of collaborative R&D is the potential diversity of the community — including students[16], patients[23], gamers[13], and researchers from lesser-known countries or institutions. You can use open science to attract diverse contributors by lowering barriers to participation, publicly tackling audacious challenges (see Rule 8), and making collaboration fun. Consider open licensing terms and joint or public ownership of selected outcomes to broaden your participant base.[14][15][21][24] Encourage all community members to find ways to contribute that suit their abilities and inclinations. Can you reach past your usual partners, and make it easy for others to get up to speed with what you're doing? Are there opportunities for "citizen science," perhaps through organizing many microcontributions?[1][10]

Rule 6: Diversify your customers

Can you engage the broadest possible base as beneficiaries? The science that you do in the open spreads its benefits widely, and that can attract unexpected accolades and collaborators.[1][4] Productively involving stakeholders can inform your research — for example, through participatory research strategies involving the people your efforts are meant to help.[25] Contributing to collaborative initiatives targeting human development challenges can motivate your team, and potentially lead to innovations that are transferable to for-profit markets. Neglected disease R&D is a case in point: it seems particularly suitable for collaborative pilot projects, given its lower profits, humanitarian appeal, and need for new methods.[26] If your work is commercially driven, consider humanitarian licensing approaches that encourage others to develop nonprofit applications for poorer populations.[2][21]

Rule 7: Don't reinvent the wheel

The more you can use what already exists, the greater your effectiveness will be. Are there lab and computational resources that could be used when otherwise idle? Can you find people already working on elements of your problem, and organize their collective work? Before starting a new initiative, have you explored and considered joining existing ones? Piggybacking on active efforts eases prototyping and gathering enthusiastic initial users. Build on the cumulative stockpile of past open initiatives (see Rules 1 and 3).

Rule 8: Think big

For projects hoping to harness the power of mass collaboration, a major challenge can be attracting a large community of contributors. Many of the best mass collaborations are oriented around seemingly audacious goals, such as "build a free encyclopedia of all the world's knowledge" (Wikipedia), "develop a review article for every human gene" (Gene Wiki), and "build a new operating system" (Linux). Establishing a driving, high-level purpose will help spread the idea of your project and motivate people to come have a look and see what they can do. Be ready to scale with success.

Rule 9: Encourage supportive policies and tools

Can you cultivate open science and collaborative R&D by helping to make them part of "standard operating procedure"? For example, can you encourage institutional data sharing?[24] Can you build a profiling platform of collaborative initiatives, summarizing what they have achieved and what types of collaborators they are seeking? Do you have opportunities to adopt appropriate policies in your own organization or field? A case study to learn from is the spread of open access from wishful thinking to widespread fact.[5]

Rule 10: Grow the commons

As intellectual property debates illustrate, there are legitimate differences of opinion on how best to motivate innovators' investments to generate new knowledge.[21][26] But in the long run, sharing more knowledge and tools boosts both for-profit and nonprofit research.[2][3] This growing shared resource of knowledge and tools — "the commons" — is the product of centuries of striving. It depends on cumulative win-win-win collaborations spanning organizations, nations, and generations. Can you find ways to advance your interests while remaining part of this larger narrative?[1][5][19][27]

Supporting information

Box 1. Conversations on Open Science and Collaborative R&D
Many commentators have considered challenges in translating open science and collaborative methods to biomedical research.[2][3][4][9][17][20][24][26][28][29] How can protecting intellectual property be balanced with freeing researchers to build on previous knowledge? If R&D results are collaboratively created and freely available, who will take responsibility for costly clinical trials and quality control? What will be the Linux of open-source R&D?

To explore such challenges and convey life experiences in biomedical open science and collaborative R&D, we offer eight conversational interviews by the first author of this article as supplementary material. The conversations were conducted on behalf of the Results for Development Institute and are with:

  • Alph Bingham, cofounder of InnoCentive; doi:10.1371/journal.pcbi.1003244.s001 (Text S1) (PDF)
  • Barry Bunin, CEO of Collaborative Drug Discovery; doi:10.1371/journal.pcbi.1003244.s002 (Text S2) (PDF)
  • Leslie Chan, open access pioneer and director of Bioline International; doi:10.1371/journal.pcbi.1003244.s003 (Text S3) (PDF)
  • Aled Edwards, director of the Structural Genomics Consortium; doi:10.1371/journal.pcbi.1003244.s004 (Text S4) (PDF)
  • Benjamin Good, coleader of the Gene Wiki initiative; doi:10.1371/journal.pcbi.1003244.s005 (Text S5) (PDF)
  • Bernard Munos, pharmaceutical innovation thought leader; doi:10.1371/journal.pcbi.1003244.s006 (Text S6) (PDF)
  • Zakir Thomas, director of India's Open Source Drug Discovery (OSDD) project; doi:10.1371/journal.pcbi.1003244.s007 (Text S7) (PDF)
  • Matt Todd, open science and drug development pioneer; doi:10.1371/journal.pcbi.1003244.s008 (Text S8) (PDF)


Acknowledgements

We thank Jean Arkedis, Robert Hecht, and Paul Wilson for comments on early versions of this article. Our thanks also go to all the colleagues and pioneers who have shared their wisdom on making collaborative R&D work.


Funding

This article was made possible by support to HM and AR from a grant by the Bill & Melinda Gates Foundation to the Results for Development Institute. The funders had no role in the preparation of the manuscript.

Competing interests

The authors have declared that no competing interests exist.


References

  1. Nielsen, M. (2011). Reinventing Discovery: The New Era of Networked Science. Princeton University Press. 272 pp. ISBN 9780691148908.
  2. National Research Council; Uhlir, P.F., ed. (2011). Designing the Microbial Research Commons: Proceedings of an International Symposium. The National Academies Press. 216 pp. ISBN 9780309219792.
  3. Institute of Medicine; Olson, S.; Berger, A.C. (2011). Establishing Precompetitive Collaborations to Stimulate Genomics-Driven Product Development: Workshop Summary. The National Academies Press. 74 pp. ISBN 9780309161824.
  4. Woelfle, M.; Olliaro, P.; Todd, M.H. (2011). "Open science is a research accelerator". Nature Chemistry 3 (10): 745–8. doi:10.1038/nchem.1149. PMID 21941234.
  5. "PLOS Collections: Open Access Collection". Public Library of Science. 2013. Archived from the original on 20 April 2013. Retrieved 25 April 2013.
  6. Prlić, A.; Procter, J.B. (2012). "Ten simple rules for the open development of scientific software". PLOS Computational Biology 8 (12): e1002802. doi:10.1371/journal.pcbi.1002802. PMC 3516539. PMID 23236269.
  7. Fogel, K. (2013). Producing Open Source Software: How to Run a Successful Free Software Project. Retrieved 25 April 2013.
  8. "Collaborative Health R&D Primer". Global Health R&D Policy Assessment Center, Results for Development Institute. 2013. Archived from the original on 15 January 2013. Retrieved 25 April 2013.
  9. Ekins, S.; Hupcey, M.A.Z.; Williams, A.J., ed. (2011). Collaborative Computational Technologies for Biomedical Research. John Wiley & Sons, Inc. 576 pp. ISBN 9780470638033.
  10. Good, B.M.; Clarke, E.L.; de Alfaro, L.; Su, A.I. (2012). "The Gene Wiki in 2011: Community intelligence applied to human gene annotation". Nucleic Acids Research 40 (D1): D1255–61. doi:10.1093/nar/gkr925. PMC 3245148. PMID 22075991.
  11. Bingham, A.; Spradlin, D. (2011). The Open Innovation Marketplace: Creating Value in the Challenge Driven Enterprise. FT Press. 272 pp. ISBN 9780132311830.
  12. Wilson, P.; Palriwala, A. (2011). "Prizes for Global Health Technologies". Global Health R&D Policy Assessment Center, Results for Development Institute. Archived from the original on 7 November 2012. Retrieved 25 April 2013.
  13. Good, B.M.; Su, A.I. (2011). "Games with a scientific purpose". Genome Biology 12 (12): 135. doi:10.1186/gb-2011-12-12-135. PMID 22204700.
  14. Jefferson, R. (2006). "Science as social enterprise: The CAMBIA BiOS Initiative". Innovations: Technology, Governance, Globalization 1 (4): 13–44. doi:10.1162/itgg.2006.1.4.13.
  15. "HowOpenIsIt?". Public Library of Science. 2013. Archived from the original on 1 March 2013. Retrieved 25 April 2013.
  16. Vashisht, R.; Mondal, A.K.; Jain, A. et al. (2012). "Crowd sourcing a new paradigm for interactome driven drug target identification in Mycobacterium tuberculosis". PLOS One 7 (7): e39808. doi:10.1371/journal.pone.0039808. PMC 3395720. PMID 22808064.
  17. Munos, B.H.; Chin, W.W. (2011). "How to revive breakthrough innovation in the pharmaceutical industry". Science Translational Medicine 3 (89): 89cm16. doi:10.1126/scitranslmed.3002273. PMID 21715677.
  18. Woelfle, M.; Seerden, J.P.; de Gooijer, J. et al. (2011). "Resolution of praziquantel". PLOS Neglected Tropical Diseases 5 (9): e1260. doi:10.1371/journal.pntd.0001260. PMC 3176743. PMID 21949890.
  19. Benkler, Y. (2011). The Penguin and the Leviathan: How Cooperation Triumphs over Self-Interest. Crown Business. 272 pp. ISBN 9780385525763.
  20. Norman, T.C.; Bountra, C.; Edwards, A.M. et al. (2011). "Leveraging crowdsourcing to facilitate the discovery of new medicines". Science Translational Medicine 3 (88): 88mr1. doi:10.1126/scitranslmed.3002678. PMID 21697527.
  21. Krattiger, A.; Mahoney, R.T.; Nelsen, L. et al., ed. (2007). Intellectual Property Management in Health and Agricultural Innovation: A Handbook of Best Practices. Vol. 1. MIHR-USA. ISBN 9781424320264.
  22. Vicens, Q.; Bourne, P.E. (2007). "Ten simple rules for a successful collaboration". PLOS Computational Biology 3 (3): e44. doi:10.1371/journal.pcbi.0030044. PMC 1847992. PMID 17397252.
  23. Wicks, P.; Vaughan, T.E.; Massagli, M.P.; Heywood, J. (2011). "Accelerated clinical discovery using self-reported patient data collected online and a patient-matching algorithm". Nature Biotechnology 29 (5): 411–4. doi:10.1038/nbt.1837. PMID 21516084.
  24. Dyke, S.O.; Hubbard, T.J. (2011). "Developing and implementing an institute-wide data sharing policy". Genome Medicine 3 (9): 60. doi:10.1186/gm276. PMC 3239235. PMID 21955348.
  25. Holland, J.; Chambers, R. (2013). Who Counts?: The Power of Participatory Statistics. Practical Action. 220 pp. ISBN 9781853397721.
  26. Masum, H.; Harris, R. (2011). "Open Source for Neglected Diseases: Magic Bullet or Mirage?". Global Health R&D Policy Assessment Center, Results for Development Institute. Archived from the original on 6 January 2013. Retrieved 25 April 2013.
  27. Masum, H.; Tovey, M. (2006). "Given enough minds...: Bridging the ingenuity gap". First Monday 11 (7). doi:10.5210/fm.v11i7.1370.
  28. Marden, E. (2010). "Open source drug development: A path to more accessible drugs and diagnostics?". Minnesota Journal of Law, Science & Technology 11 (1): 217–266.
  29. Årdal, C.; Røttingen, J.A. (2012). "Open source drug discovery in practice: A case study". PLOS Neglected Tropical Diseases 6 (9): e1827. doi:10.1371/journal.pntd.0001827. PMC 3447952. PMID 23029588.


This presentation is faithful to the original, with only a few minor changes to presentation. In some cases important information was missing from the references, and that information was added. In a few cases, the URLs from 2013 were dead; they were updated with current URLs, and, when applicable, archived URLs from the Internet Archive. Box 1, which in the original appeared at top, has been combined with the supporting information at the bottom.