| Full article title | ROBOT: A Tool for Automating Ontology Workflows |
|---|---|
| Journal | BMC Bioinformatics |
| Author(s) | Jackson, Rebecca C.; Balhoff, James P.; Douglass, Eric; Harris, Nomi L.; Mungall, Christopher J.; Overton, James A. |
| Author affiliation(s) | Knocean, Inc.; University of North Carolina; Lawrence Berkeley National Laboratory |
| Primary contact | Email: via SpringerLink |
| Year published | 2019 |
| Volume and issue | 20 |
| Page(s) | 407 |
| DOI | 10.1186/s12859-019-3002-3 |
| ISSN | 1471-2105 |
| Distribution license | Creative Commons Attribution 4.0 International |
| Website | https://link.springer.com/article/10.1186/s12859-019-3002-3 |
| Download | https://link.springer.com/content/pdf/10.1186/s12859-019-3002-3.pdf (PDF) |
Abstract
Background: Ontologies are invaluable in the life sciences, but building and maintaining ontologies often requires a challenging number of distinct tasks such as running automated reasoners and quality control checks, extracting dependencies and application-specific subsets, generating standard reports, and generating release files in multiple formats. Similar to more general software development, automation is the key to executing and managing these tasks effectively and to releasing more robust products in standard forms.
For ontologies using the Web Ontology Language (OWL), the OWL API (application programming interface) Java library is the foundation for a range of software tools, including the Protégé ontology editor. In the Open Biological and Biomedical Ontologies (OBO) community, we recognized the need to package a wide range of low-level OWL API functionality into a library of common higher-level operations and to make those operations available as a command-line tool.
Results: ROBOT (a recursive acronym for “ROBOT is an OBO Tool”) is an open-source library and command-line tool for automating ontology development tasks. The library can be called from any programming language that runs on the Java Virtual Machine (JVM). Most usage is through the command-line tool, which runs on macOS, Linux, and Windows. ROBOT provides ontology processing commands for a variety of tasks, including commands for converting formats, running a reasoner, creating import modules, running reports, and various other tasks. These commands can be combined into larger workflows using a separate task execution system such as GNU Make, and workflows can be automatically executed within continuous integration systems.
Conclusions: ROBOT supports automation of a wide range of ontology development tasks, focusing on OBO conventions. It packages common high-level ontology development functionality into a convenient library and makes it easy to configure, combine, and execute individual tasks in comprehensive, automated workflows. This helps ontology developers to efficiently create, maintain, and release high-quality ontologies so they can spend more time focusing on development tasks. It also helps guarantee released ontologies are free of certain types of logical errors and conform to standard quality control checks, increasing the overall robustness and efficiency of the ontology development lifecycle.
Keywords: ontology development, automation, ontology release, reasoning, workflows, quality control, import management
Background
Ontologies are vital parts of the informatics ecosystem, supporting life science research by enabling analysis of high-throughput datasets, data standardization and integration, search, and discovery. However, there is a lack of tools supporting the complete ontology development lifecycle, especially when compared with the software development lifecycle. This has resulted in many groups developing their own ad hoc ontology development workflows, often with time-consuming and inefficient manual steps. In some cases, groups release ontologies without any kind of systematic workflow or quality control process, which can result in errors or problems with downstream applications or analyses.
Noy et al. (2010) describe a general ontology lifecycle, with a focus on bio-ontologies.[1] First, requirements for the ontology are gathered. Then, the ontology is collaboratively developed in an ontology editor such as Protégé.[2] Once the requirements have been fulfilled, the ontology is published and feedback is solicited from the community. Feedback is integrated back into development, and the ontology is continuously updated and released. At any point after the initial publication, the ontology may be deployed in other applications.
In broad strokes, this ontology development lifecycle reflects much of our experience of ontology development in the Open Biological and Biomedical Ontologies (OBO) community[3], circa 2010. A wide range of Semantic Web-based software exists to support these steps, including many tools for Web Ontology Language (OWL) ontology development. In practice, though, the OBO community has relied predominantly on the free and open-source Protégé OWL editor for manual editing and conversion, and on a small set of other tools supporting OBO conventions.
Other than Protégé, the most prominent suite of tools used by the OBO community has been the Onto-animal suite, developed by the He Group[4], which includes Ontobee[5], Ontofox[6], and Ontorat.[7] These tools are free web services backed by a Virtuoso triplestore loaded with the latest version of all available OBO community ontologies, as well as some other ontologies. Ontobee is an ontology term browser. Ontofox implements the MIREOT term extraction method.[8] Ontorat implements template-based ontology term creation. Together with a few other tools, these support an extensible ontology development strategy[9] covering a range of ontology development tasks, many of which can be combined and automated using a sequence of web-based application programming interface (API) calls.
The core operations of the Onto-animal suite are driven by SPARQL queries against the centralized triplestore. This results in a number of limitations. First, only the specific version of each ontology loaded into that triplestore can be used. This is a particularly severe limitation during ontology development. Second, processing is done on the centralized server, limiting the processing power available to any user. Third, SPARQL has limited utility when working with OWL logical axioms.
These limitations are mitigated by running software locally, loading the desired versions of the desired ontologies, and using OWL API[10] for OWL-native processing. A number of tools used in the OBO community have done precisely this. We have seen a spectrum of development, from tools that are focused on a single project, to tools used by a dozen related projects, to the current push for tools that are shared across the OBO community.
Slimmer[11], created as part of the eNanoMapper ontology project[12], uses OWL API to create ontology subsets (also known as “slims”). A configuration file allows the user to specify which terms to include and which annotations to include on those terms. OntoPilot[13], developed for the Plant Phenotype Ontology, uses OWL API via Jython (a version of Python that runs on the Java Virtual Machine) to provide an integrated ontology development framework, including term imports, term creation, releases, and documentation.
The lack of automation seen circa 2010 led directly to a lack of standardization, with each ontology editor or group adopting a slightly different approach to manual editing in Protégé. This diversity of practices, even within the OBO community, made it a challenge to develop tools to serve multiple ontology projects. OWLTools[14] was designed for use by multiple OBO ontology projects, providing convenience methods on top of the OWL API. OWLTools includes the OBO Ontology Release Tool (OORT)[15], a command-line tool to release OWL- and OBO-format ontologies. OORT provides a series of basic commands to create a release pipeline for an ontology, including module extraction with MIREOT, support for multiple input ontologies, reasoning, and creation of "main" and "simple" release products.
ROBOT (a recursive acronym for “ROBOT is an OBO Tool”) was developed to replace OWLTools and OORT with a more modular and maintainable code base. It builds on previous experience to include a comprehensive set of automation capabilities to support an even wider range of OBO projects. Development began in 2015 and continues today, with more than 1000 commits from a dozen contributors. ROBOT is freely available open-source software. Although we do not track our users, a recent GitHub search shows that at least 26 ontology projects in the OBO community have adopted ROBOT.
Implementation
Overview
ROBOT provides a standardized yet configurable way to support the ontology development lifecycle via a library of common high-level functionality and a command-line interface. ROBOT builds on OWL API and is compatible with all ontology syntaxes that OWL API supports: RDF/XML, OWL/XML, Turtle, OWL Functional Syntax, OWL Manchester Syntax, and OBO format. The source code is written in Java and is available from our GitHub repository[16] under an open source (BSD 3) license. It is also released as a Java library on Maven Central. ROBOT code can be used from any programming language that runs on the Java Virtual Machine (JVM). The command-line tool is packaged as a JAR file that can be run on Unix (including macOS and Linux), Windows, and other platforms supported by the JVM. This JAR file is available for download from the ROBOT GitHub site[16], along with platform-specific scripts for using ROBOT from the command line. Installation instructions and documentation are available from the ROBOT website (http://robot.obolibrary.org).
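As a quick check of a local installation, the tool can be invoked either through one of the platform-specific wrapper scripts or directly via the JAR file. The commands below are illustrative only; file names and paths follow the download instructions and may differ on a given system.

```shell
# Print the installed ROBOT version using the wrapper script
robot --version

# Equivalent invocation calling the JAR file directly
java -jar robot.jar --version
```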
Architecture
We previously described the basic architecture of the tool[17], which we summarize here.
The ROBOT source code consists of two parts: ‘robot-core’ and ‘robot-command.’ ‘robot-core’ is a library supporting common ontology development tasks, which we call “operations.” ‘robot-command’ provides a command-line interface divided into “commands,” each of which wraps a ‘robot-core’ operation.
Most ROBOT operations package low-level functionality provided by OWL API into high-level functionality common to ontology development workflows in the OBO community. For best compatibility, we aim to match the exact version of OWL API used by ROBOT with the exact version used by the latest Protégé release. Some operations use Apache Jena.[18] Each operation works with Java objects that represent OWL ontologies, OWL reasoners, OWL classes, etc., while each command works with command-line option strings and files. The commands also perform various conversion and validation steps. The command-line interface uses the Apache Commons CLI library[19] for parsing commands.
Each operation has a set of unit tests, built with JUnit[20], that are executed each time the final product (the JAR file) is generated. Each command in ROBOT is documented on its own web page (e.g., http://robot.obolibrary.org/reason). The web pages are authored in Markdown format and contain embedded command-line examples that are parsed and executed as part of our integration tests, with the results compared against a “gold standard” set of outputs. ROBOT’s ‘diff’ functionality is used when comparing ontology files; otherwise, standard file comparison is used. This helps ensure the correctness and consistency of both documentation and code. The unit tests and integration tests are executed on every pull request to the codebase via Travis continuous integration (Travis CI)[21], so that contributions to the codebase are verified.
Commands and operations
ROBOT currently provides 15 operations (in the ‘robot-core’ library) and 19 commands (for the command-line interface). Some commands are quite specialized, and most ontology projects will not make use of all of them. Here we provide an overview of the most common and general commands. In each case, the core functionality is supported by operations in the ‘robot-core’ library, which can be used independently of the command-line interface from any programming language that runs on the JVM.
Conversion
A variety of OWL ontology formats are supported, including RDF/XML, Turtle, Manchester, OBO format, and more. To enable further interoperability, ROBOT includes a convert command to change between supported ontology formats. A complete list of supported formats can be found in the convert documentation.
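For illustration, a hypothetical RDF/XML file could be converted to OBO format as follows; the file names are placeholders, and the target format can be given with --format or inferred from the output file extension:

```shell
# Convert an RDF/XML ontology to OBO format
robot convert --input example.owl --format obo --output example.obo
```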
Reasoning
Reasoning is one of the most important operations in ROBOT. The reason command covers two uses: logical validation of an ontology and automatic classification. In both cases, users can choose their preferred reasoner, which is used to perform inference. Large ontologies such as the Gene Ontology typically use ELK[22], which performs reasoning quickly using the OWL EL profile. Smaller ontologies with richer axiomatization, such as the Relations Ontology, typically use a complete OWL DL reasoner such as HermiT.[23]
When the reason command is invoked on an input ontology, ROBOT will initiate a reasoner using the OWL API Reasoner interface. The resulting inferences are checked to ensure the ontology is logically coherent: the ontology must be consistent and have no unsatisfiable classes (i.e., classes that cannot be instantiated without introducing an inconsistency). If the ontology is incoherent, this is reported and execution halts. ROBOT can optionally perform additional checks, such as ensuring that no two classes are inferred to be equivalent post-reasoning.
If the ontology is consistent, ROBOT will perform automatic classification. All direct inferred ‘subClassOf’ axioms are added to the ontology. Generation of other types of axioms can be configured.
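A sketch of a typical invocation is shown below; the file names are placeholders, and the equivalence-check option illustrates the kind of additional check described above (option spellings should be confirmed against the reason documentation for the ROBOT version in use):

```shell
# Validate and classify an ontology with ELK, writing the result to a new file.
# The --equivalent-classes-allowed option controls whether inferred equivalent
# classes are reported as errors.
robot reason --reasoner ELK \
  --equivalent-classes-allowed none \
  --input example.owl \
  --output example_reasoned.owl
```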
The assertion of all inferred axioms is often a fundamental step in the release process for biomedical ontologies. Classes in these ontologies frequently assert only a single named superclass (‘A subClassOf B’, where B is another class in the ontology), plus zero or more anonymous superclasses and/or anonymous equivalent classes (‘A subClassOf/equivalentTo (R some B)’, where R is an object property). These anonymous class expressions allow the reasoner to make inferences, which are then asserted. Therefore, in the release version of an ontology, a class may have more than one named superclass.
The reason command has additional “helper” commands. The relax command asserts entailed subClassOf axioms according to a simple structural rule: an expression ‘A equivalentTo (R some B) and …’ entails ‘A subClassOf R some B’. This can be useful as consumers of bio-ontologies often expect to navigate these expressions, e.g., partonomy in GO and Uberon. The relax command relieves the ontology developer from the need to assert these in addition to the equivalence axioms, and as such it is also often included in release workflows. Finally, the reduce command removes redundant ‘subClassOf’ axioms, and can be used after relax to remove duplicate axioms that were asserted in that step.
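Because ROBOT commands can be chained in a single invocation, with the output ontology of each command passed to the next, a release-style pipeline combining these steps might look like the following (file names are placeholders):

```shell
# Classify with ELK, relax equivalence axioms into subClassOf axioms,
# then remove redundant subClassOf axioms.
robot reason --reasoner ELK --input example-edit.owl \
      relax \
      reduce --output example-release.owl
```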
The materialize command uses the Expression Materializing Reasoner (EMR) extension to assert inferred expressions of the form ‘A subClassOf R some B’.[24] Where the reason command asserts inferred named superclasses, materialize asserts anonymous superclasses. This is not part of the standard release cycle but can be beneficial for creating complete ontology subsets.
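A minimal sketch of the command is given below; the file names are placeholders, and the --reasoner option is assumed to mirror the reason command (see the materialize documentation for the options available in a given version):

```shell
# Assert inferred anonymous superclasses of the form 'A subClassOf R some B'
robot materialize --reasoner ELK \
  --input example.owl \
  --output example_materialized.owl
```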
Working with external ontologies
The OBO Foundry aims to coordinate ontologies in a modular fashion, such that parts of some ontologies can be used as building blocks for other ontologies. For example, the ChEBI chemical entities ontology[25] is used to construct OWL definitions for metabolic processes and activities in the Gene Ontology.[26] There are a variety of different strategies for leveraging external ontologies and managing dependencies between ontologies, depending on the use case.
Extractions
The extract command creates a module based on a set of entities to extract (the “seed”). There are four different extraction methods (as specified by the --method option): MIREOT, TOP, BOT, and STAR.
ROBOT’s MIREOT extraction method is based on the principle of the same name[8] and requires that one or more “bottom” entities are specified. Optionally, one or more “top” entities can also be specified. The command extracts all of the “bottom” level entities and their ancestors up to the “top” level from the input ontology. If no “top” entities are provided, ancestors up to the top-level entity (‘owl:Thing’) are included.
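For example, a MIREOT extraction specifying one “bottom” and one “top” term might look like the following; the IRIs are hypothetical placeholders, and file-based variants of these options (e.g., --lower-terms) also exist:

```shell
# Extract a "bottom" term and its ancestors up to the specified "top" term
robot extract --method MIREOT \
  --input example.owl \
  --lower-term http://example.com/EX_0000111 \
  --upper-term http://example.com/EX_0000001 \
  --output example_mireot.owl
```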
The TOP, BOT, and STAR methods make use of the OWL API Syntactic Locality Module Extraction (SLME) implementation, which is guaranteed to capture all information logically relevant to the seed set.[27] The BOT method (“bottom”) includes all relationships between the input entities and their ancestors. The TOP method includes all relationships between the input entities and their descendants. Finally, the STAR method includes only the relationships between the input entities themselves. The STAR method produces the smallest outputs, while the TOP method typically produces the largest outputs.
In order to support ontology term provenance, the extract command has an --annotate-with-source true option that will annotate each extracted term with the URL of the source ontology that it is extracted from.
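Putting these options together, a BOT extraction driven by a file of seed term IRIs (one per line) and annotated with term provenance might look like this (file names are placeholders):

```shell
# SLME extraction with the BOT method, annotating each term with its source
robot extract --method BOT \
  --input example.owl \
  --term-file seed_terms.txt \
  --annotate-with-source true \
  --output example_bot_module.owl
```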
References
- ↑ Noy, N.; Tudorache, T.; Nyulas, C. et al. (2010). "The ontology life cycle: Integrated tools for editing, publishing, peer review, and evolution of ontologies". AMIA Annual Symposium Proceedings 2010: 552–6. PMC PMC3041389. PMID 21347039. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3041389.
- ↑ Horridge, M.; Tsarkov, D. (2006). "Supporting early adoption of OWL 1.1 with Protégé-OWL and FaCT++". OWL: Experiences and Directions 2006: 1–7. http://webont.org/owled/2006/accepted06.html.
- ↑ Smith, B.; Ashburner, M.; Rosse, C. et al. (2007). "The OBO Foundry: Coordinated evolution of ontologies to support biomedical data integration". Nature Biotechnology 25 (11): 1251–5. doi:10.1038/nbt1346. PMC PMC2814061. PMID 17989687. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2814061.
- ↑ He, Y.; Zheng, J.; Lin, Y. (2015). "Onto-animal tools for reusing ontologies, generating and editing ontology terms, and dereferencing ontology terms". Proceedings of the 2015 International Conference on Biomedical Ontology: 1–2. http://ceur-ws.org/Vol-1515/.
- ↑ Xiang, Z.; Mungall, C.; Ruttenberg, A. et al. (2012). "Ontobee: A Linked Data Server and Browser for Ontology Terms". Proceedings of the Second International Conference on Biomedical Ontology: 279–81. http://ceur-ws.org/Vol-833/.
- ↑ Xiang, Z.; Courtot, M.; Brinkman, R.R. et al. (2010). "OntoFox: Web-based support for ontology reuse". BMC Research Notes 3: 175. doi:10.1186/1756-0500-3-175. PMC PMC2911465. PMID 20569493. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2911465.
- ↑ Xiang, Z.; Zheng, J.; Lin, Y. et al. (2015). "Ontorat: Automatic generation of new ontology terms, annotations, and axioms based on ontology design patterns". Journal of Biomedical Semantics 6: 4. doi:10.1186/2041-1480-6-4. PMC PMC4362828. PMID 25785185. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4362828.
- ↑ 8.0 8.1 Courtot, M.; Gibson, F.; Lister, A. L. et al. (2011). "MIREOT: The minimum information to reference an external ontology term". Applied Ontology 6 (1): 23–33. doi:10.3233/AO-2011-0087.
- ↑ He, Y.; Xiang, Z.; Zheng, J. et al. (2018). "The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability". Journal of Biomedical Semantics 9: 3. doi:10.1186/s13326-017-0169-2.
- ↑ Horridge, M.; Bechhofer, S.; Noppens, O. (2007). "Igniting the OWL 1.1 Touch Paper: The OWL API". OWL: Experiences and Directions 2007: 1–9. http://webont.org/owled/2007/Proceedings.html.
- ↑ "enanomapper / slimmer". GitHub. https://github.com/enanomapper/slimmer/. Retrieved 21 May 2019.
- ↑ Hastings, J.; Jeliazkova, N.; Owen, G. et al. (2015). "eNanoMapper: harnessing ontologies to enable data integration for nanomaterial risk assessment". Journal of Biomedical Semantics 6: 10. doi:10.1186/s13326-015-0005-5. PMC PMC4374589. PMID 25815161. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4374589.
- ↑ Stucky, B.J.; Luc, A. (2017). "OntoPilot: New Software to Simplify and Accelerate Ontology Development and Deployment". Biodiversity Information Science and Standards 1: e20192. doi:10.3897/tdwgproceedings.1.20192.
- ↑ "owlcollab / owltools". GitHub. https://github.com/owlcollab/owltools. Retrieved 21 May 2019.
- ↑ "owlcollab / owltools : Oort Intro". GitHub. https://github.com/owlcollab/owltools/wiki/Oort-Intro. Retrieved 21 May 2019.
- ↑ 16.0 16.1 "ontodev / robot". GitHub. https://github.com/ontodev/robot. Retrieved 09 October 2018.
- ↑ Overton, J.A.; Dietze, H.; Essaid, S. et al. (2015). "ROBOT: A command-line tool for ontology development". Proceedings of the 2015 International Conference on Biomedical Ontology: 1–2. http://ceur-ws.org/Vol-1515/.
- ↑ Carroll, J.J.; Dickinson, I.; Dollin, C. et al. (2004). "Jena: Implementing the semantic web recommendations". WWW Alt. '04: Proceedings of the 13th international World Wide Web conference on Alternate track papers & posters: 74–83. doi:10.1145/1013367.1013381.
- ↑ "Commons CLI". Apache Commons. https://commons.apache.org/proper/commons-cli/. Retrieved 23 May 2019.
- ↑ "JUnit". junit.org. https://junit.org/junit4/. Retrieved 21 May 2019.
- ↑ "Travis CI". Travis CI, GmbH. https://travis-ci.org/. Retrieved 21 May 2019.
- ↑ Kazakov, Y.; Krötzsch, M.; Simančík, F. (2014). "The Incredible ELK". Journal of Automated Reasoning 53: 1–61. doi:10.1007/s10817-013-9296-3.
- ↑ Shearer, R.; Motik, B.; Horrocks, I. (2008). "HermiT: A Highly-Efficient OWL Reasoner". OWL: Experiences and Directions 2008: 1–10. http://ceur-ws.org/Vol-432/.
- ↑ "owlcollab / expression-materializing-reasoner". GitHub. https://github.com/owlcollab/expression-materializing-reasoner. Retrieved 09 October 2018.
- ↑ Hastings, J.; de Matos, P.; Dekker, A. et al. (2013). "The ChEBI reference database and ontology for biologically relevant chemistry: Enhancements for 2013". Nucleic Acids Research 41 (DB1): D456-63. doi:10.1093/nar/gks1146. PMC PMC3531142. PMID 23180789. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3531142.
- ↑ Hill, D.P.; Adams, N.; Bada, M.A. et al. (2013). "Dovetailing biology and chemistry: Integrating the Gene Ontology with the ChEBI chemical ontology". BMC Genomics 14: 513. doi:10.1186/1471-2164-14-513. PMC PMC3733925. PMID 23895341. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3733925.
- ↑ Grau, B.C.; Horrocks, I.; Kazakov, Y. et al. (2008). "Modular Reuse of Ontologies: Theory and Practice". Journal of Artificial Intelligence Research 31: 273–318. doi:10.1613/jair.2375.
Notes
This presentation is faithful to the original, with only a few minor changes to presentation, spelling, and grammar. We also added PMCID and DOI when they were missing from the original reference.