Today, the development of applications that explain entailments and non-entailments in ontology-based systems is significant for several reasons. As ontologies grow in size and complexity and serve numerous applications across several domains, from artificial intelligence (AI) to the Semantic Web, users need to understand the logic that governs the relationships and inferences within these ontologies.
Although explanations are crucial, there is currently a lack of dedicated tools offering intuitive and user-friendly explanations of both entailments and non-entailments in ontologies. Ontology management tools such as Protégé provide powerful reasoning capabilities, but they often lack integrated explanation mechanisms that are accessible to non-expert users. This lack of support creates barriers for users who need to quickly understand the justification of complex ontological structures and relationships. In addition, Protégé and similar tools are not able to generate explanations for non-entailments, i.e. axioms that a user might expect to hold but that do not logically follow from the ontology.
This work addresses this problem by developing a dedicated application that not only gives basic information about an ontology but also provides explanations for entailments and non-entailments. The application makes ontology reasoning more accessible and transparent: it allows ontology engineers to obtain justifications for why an axiom does or does not hold in a given ontology. It integrates reasoners such as HermiT and Pellet with visualization techniques for a better understanding and handling of ontological information.
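To make the intended workflow concrete, the following minimal sketch shows how such an application might check whether a candidate axiom is entailed by a loaded ontology using the OWL API together with HermiT; the file name, namespace, and class names are placeholders, and the explanation step itself is only indicated in comments.

```java
import java.io.File;

import org.semanticweb.HermiT.ReasonerFactory;
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.IRI;
import org.semanticweb.owlapi.model.OWLAxiom;
import org.semanticweb.owlapi.model.OWLClass;
import org.semanticweb.owlapi.model.OWLDataFactory;
import org.semanticweb.owlapi.model.OWLOntology;
import org.semanticweb.owlapi.model.OWLOntologyManager;
import org.semanticweb.owlapi.reasoner.OWLReasoner;

public class EntailmentCheckSketch {

    public static void main(String[] args) throws Exception {
        // Load an ontology from a local file (path is a placeholder).
        OWLOntologyManager manager = OWLManager.createOWLOntologyManager();
        OWLOntology ontology =
                manager.loadOntologyFromOntologyDocument(new File("example.owl"));
        OWLDataFactory factory = manager.getOWLDataFactory();

        // Build a candidate axiom to test, e.g. SubClassOf(Student, Person);
        // the namespace and class names are illustrative.
        String ns = "http://example.org/university#";
        OWLClass student = factory.getOWLClass(IRI.create(ns + "Student"));
        OWLClass person = factory.getOWLClass(IRI.create(ns + "Person"));
        OWLAxiom candidate = factory.getOWLSubClassOfAxiom(student, person);

        // Create a HermiT reasoner over the ontology and ask whether
        // the candidate axiom is entailed.
        OWLReasoner reasoner = new ReasonerFactory().createReasoner(ontology);
        if (reasoner.isEntailed(candidate)) {
            // Entailed: the application would next compute justifications
            // (minimal axiom sets that imply the candidate) and display them.
            System.out.println("Entailed: " + candidate);
        } else {
            // Not entailed: the application would explain the non-entailment,
            // e.g. by pointing to missing supporting axioms.
            System.out.println("Not entailed: " + candidate);
        }
        reasoner.dispose();
    }
}
```

Since HermiT and Pellet both implement the standard OWL reasoner interface, the same entailment check could presumably be run with either reasoner by swapping the factory, which is what makes a reasoner-agnostic explanation layer feasible.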