--Review Comments--
This paper is submitted as a 'Tools and Systems Report'. Thus, I would like to start by summarizing my thoughts in light of the Journal's requirements (https://www.semantic-web-journal.net/reviewers).
(1) The authors have stated the requirements of the tool and clearly described its capabilities.
(2) The described tool is accessible on the web (GitHub). Moreover, the data files are available in the given repository, which also contains a README file that makes it easy for users to access the data and install the tool. The use cases used for the evaluation and links to the developed ontologies are also available in the repository.
(3) Quality, importance, and impact of the described tool or system
Quality:
Considering the user-friendliness, functions, and installation, I found the tool to be fine. However, when I tried the tool with Protégé 5.0.0, only the class hierarchy validation function worked; the other functions did not (i.e., errors were shown in the Protégé logs). I then tried the tool with Protégé 5.5.0, and all the functions described in Sections 3.1-3.5 worked. However, I occasionally experienced the Protégé window crashing. This may not be a big issue when creating an ontology for testing purposes, but it will definitely be a hassle when developing real ontologies, and real users may have a bad experience. I noticed that this issue has also been reported by two users in your user study/test.
Importance:
The authors have addressed a timely issue by introducing this tool.
However, in my opinion, the authors have not provided sufficiently convincing evidence of the tool's impact on developing a high-quality ontology. For instance, the authors state that OntoSeer, based on the user study (21 respondents), helps to reduce ontology modeling time. But for a journal publication at this level, I was expecting rigorous testing, somewhat akin to your evaluation of the ODP recommendation. For example, the authors could give developers sample use cases to model an ontology (or ontologies) with and without OntoSeer, then keep track of the modeling time/duration spent by the developers (i.e., two independent groups) and analyze the modeling times. Additionally, evaluating the user-friendliness of the tool is crucial, although it is not presented in this paper.
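To make the suggested between-groups comparison concrete, here is a minimal sketch of how the recorded modeling durations could be analyzed; the group sizes and values are invented purely for illustration and are not taken from the paper's user study.

```python
# Hypothetical sketch: comparing modeling times (in minutes) of two
# independent groups, one modeling with OntoSeer and one without.
# The values below are invented for illustration only.
from scipy import stats

with_ontoseer = [42, 55, 38, 61, 47, 50]
without_ontoseer = [68, 72, 59, 81, 64, 75]

# Welch's t-test (does not assume equal variances between the groups).
t_stat, p_value = stats.ttest_ind(with_ontoseer, without_ontoseer, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.4f}")
```

A non-parametric alternative (e.g., a Mann-Whitney U test) may be more appropriate if the modeling times are not approximately normally distributed.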
(4) The clarity, illustration, and readability of the paper are good. The authors have clearly described the capabilities. However, they did not discuss the limitations of the tool, which is equally important.
*** Comments for the sections ***
--Abstract: Page 1--
Line 26: “Apart from this, there are aspects such as modularity and reusability that should be taken care of”. In the abstract, you mention two ontology quality characteristics: modularity and reusability. I could see how you addressed reusability. However, it is not clear how you handled modularity using OntoSeer. Modularity is a crucial characteristic of an ontology; therefore, it is necessary to clearly explain how you address modularity in the appropriate section.
Line 30: “we developed a tool named OntoSeer, that monitors the ontology development process and provides suggestions in real-time to improve the quality of the ontology under development”. From this, I assumed that OntoSeer automatically tracks and displays modeling pitfalls and suggestions. Later, however, I realized that users have to execute/click the functions whenever they wish to review the specified quality aspects (i.e., class hierarchy validation, ODP recommendation, etc.). In my opinion, the quoted sentence misleads the reader right away; I therefore recommend rewording it.
--Introduction:--
Page 2, Line 2: “Experienced developers face issues while building ontologies, and this problem only magnifies in the case of inexperienced ontology developers…”. What are the issues you are referring to? It is unclear to me what problems are experienced by novice developers. Please give some examples or cases.
Page 2, Line 5 – It is unclear to me how you addressed ontology modularity using ODPs. I suggest you define the modularity you are referring to and then describe how it is addressed through OntoSeer.
Page 2, Line 12 – Here, you only mention reusability, not modularity. Why?
--Related Work:--
Although the authors have covered several relevant works, some significant works are missing, e.g., the XD Analyzer of the XD Tools, which uses ODPs (http://ontologydesignpatterns.org/wiki/Main_Page)
• “XD Analyzer: The aim of this tool is to provide suggestions and feedback to the user with respect to how good practices in ontology design have been followed, according to the XD methodology (for instance missing labels and comments, isolated entities, unused imported ontologies).”
• RepOSE – debugging tool
Optional (Readings)
o See Ontology Summit 2013 – Tools http://ontolog.cim3.net/OntologySummit/2013/tracks.html
o Protégé Debugging tools
Moreover, I suggest adding a table that summarizes and compares the related tools/works.
--Approach--
3.1. Class, Property and Vocabulary Recommendation
This function retrieves a set of related vocabularies from existing repositories for the selected classes/properties. It is great to have such a feature for ontology developers. However, as I observed, in order to add a suggested vocabulary term, developers need to spend some time examining the proposed vocabulary and understanding how to add it. In this situation, it would be better if the tool could provide some modeling suggestions that combine the existing ontology with the suggested vocabulary. I know, however, that this is not straightforward and would require some research and a reasonable amount of time to address.
3.2. Class and Property Name Recommendation
This function works properly, as explained in the paper. Mainly, it recommends suitable naming conventions for classes and properties, which helps improve the readability and maintainability of an ontology.
As I understand it, the current tool merely recommends a suitable naming convention; the user must then remember it in order to apply it later. This could be addressed (i.e., the usability of the tool could be improved) by enabling users to apply the suggested convention to the ontology at the moment OntoSeer recommends it.
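For illustration, here is a minimal sketch of the kind of naming-convention check discussed in this section; the CamelCase/camelCase convention used here is the common OWL style and is not necessarily the exact set of rules OntoSeer applies.

```python
# Hypothetical sketch of a simple naming-convention check:
# classes in UpperCamelCase, properties in lowerCamelCase.
import re

def follows_convention(name: str, kind: str) -> bool:
    """Check a class or property name against a simple CamelCase convention."""
    if kind == "class":
        return re.fullmatch(r"[A-Z][A-Za-z0-9]*", name) is not None
    if kind == "property":
        return re.fullmatch(r"[a-z][A-Za-z0-9]*", name) is not None
    raise ValueError(f"unknown kind: {kind}")

print(follows_convention("ResearchPaper", "class"))   # True
print(follows_convention("has_author", "property"))   # False (underscore)
```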
3.3. Axiom Recommendation
This function recommends some appropriate axioms to be added to the ontology.
Similar to my suggestion for Section 3.2, it would be great if users/developers could add the appropriate axioms at the same time that OntoSeer suggests them.
3.4. ODP Recommendation
This is a significant function for both experienced and inexperienced developers. However, it could be challenging for inexperienced developers to understand how the suggested ODP(s) map onto the existing ontology. In this case, it would be useful if OntoSeer could show an example modeling solution with the suggested ODP, as is done in the XD Analyzer tool.
3.5. Class Hierarchy Validation
This function validates the class hierarchy according to OntoClean. This is also a significant function for both experienced and inexperienced developers.
In the paper, you have explained the OntoClean characteristics (i.e., Rigidity, Identity, Unity), but you have not explained the OntoClean principles/constraints, which are crucial when assessing hierarchies. It would be better to explain them.
Ex:- OntoClean explains: “Given two properties p and q, where q subsumes p, the following constraints must hold:
1. If q is anti-rigid, then p must be anti-rigid
2. If q carries an identity criterion, then p must carry the same criterion
3. If q carries a unity criterion, then p must carry the same criterion
4. If q has anti-unity, then p must also have anti-unity
5. If q is externally dependent, then p must be……"
Moreover, if you could display these OntoClean principles as hints in OntoSeer, it would help developers get a good understanding of the hierarchy validation.
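To illustrate the kind of check these principles imply, here is a minimal, hypothetical sketch; the meta-property encoding (rigidity reduced to a boolean) and the class names are illustrative only and do not reflect OntoSeer's internal data model.

```python
# Hypothetical sketch of an OntoClean subsumption check.
# Rigidity is simplified to a boolean (True = rigid, False = anti-rigid);
# OntoClean also distinguishes non-rigid, which is omitted here.
from dataclasses import dataclass

@dataclass
class MetaProperties:
    rigid: bool            # True = rigid (+R), False = anti-rigid (~R)
    identity: str | None   # identity criterion carried, if any
    unity: str | None      # unity criterion carried, if any

def violates_ontoclean(sub: MetaProperties, sup: MetaProperties) -> list[str]:
    """Return the OntoClean constraints violated when `sup` subsumes `sub`."""
    problems = []
    # Constraint 1: an anti-rigid class cannot subsume a rigid one.
    if not sup.rigid and sub.rigid:
        problems.append("anti-rigid class subsumes a rigid class")
    # Constraint 2: the subclass must carry the superclass's identity criterion.
    if sup.identity and sub.identity != sup.identity:
        problems.append("identity criterion of the superclass is not carried")
    # Constraint 3: the same holds for the unity criterion.
    if sup.unity and sub.unity != sup.unity:
        problems.append("unity criterion of the superclass is not carried")
    return problems

# Classic violation: anti-rigid Student subsuming rigid Person.
person = MetaProperties(rigid=True, identity="SSN", unity="physical")
student = MetaProperties(rigid=False, identity="SSN", unity="physical")
print(violates_ontoclean(sub=person, sup=student))
# ['anti-rigid class subsumes a rigid class']
```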
Furthermore, when using this feature in OntoSeer, I initially struggled to determine which classes should be put to the test: whether I needed to test the hierarchical classes already in the existing ontology, or whether I needed to test the hierarchical classes before adding them to the existing ontology. This is not clearly explained in the paper.
Additionally, there is no feature for reporting/recording the hierarchy validation that was performed. Such a feature would help with post-validation after new classes have been added to the ontology.
*** Suggestions for the tool’s Usability ***
• A feature to keep or record the recommendations/validations would be helpful, but the tool does not have one.
• The tool's usability needs some improvement, e.g., some features (adding naming conventions, adding axioms, etc.) could be automated rather than requiring developers to do them manually.
--User Study--
+ The necessary evidence pertaining to the user study has been presented.
+ An explanation of the results has also been given.
- Page 11, Line 39: “..Fourteen, that is, 66.67% of users believed that OntoSeer saves modeling time while, the remaining seven chose to be neutral……”
I am not satisfied with the test performed for modeling time. Testing the modeling efficiency is crucial for this type of tool (i.e., it is one of the main objectives), and rigorous testing is required at this level. Please kindly read my comments given at the beginning: “For instance, the authors stated that OntoSeer, based on the user study (21 respondents), helps to reduce…………….”
--Conclusion--
- Quite general; it repeats the same things already stated in the content.
- Limitations: no clear explanation is given. In addition to the summary of the contents, it is important to add a rigorous discussion highlighting the limitations.