Please use this identifier to cite or link to this item: https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/2521
Title: Automatic OWL Based Ontology Creation from Text Sentences
Authors: Chathuranga, K.V.
Issue Date: 26-May-2014
Abstract: Computers can parse text sentences but cannot make decisions or take the necessary actions based on the information contained in the parsed text. This is due to limitations in representing knowledge in a way that computers can understand its meaning. Because of this limitation, the hard part of information processing, namely understanding the information, has to be done by humans. To overcome this limitation, information sources should be converted into a well-known standard form from which computers can parse information meaningfully. Manually converting the massive amount of available information sources into such a standard form would require an enormous amount of knowledgeable human effort, so it is highly beneficial if this process can be automated. This document describes such an automated conversion process. A text sentence is taken through several steps in order to identify its knowledge or semantics, including well-known steps such as tokenizing and dependency identification. A set of predefined rules is then used to identify the main components of a sentence, known as the subject, object and predicate. If further semantics can be found for these components, they are added at a later stage. At the end of a successful conversion, an OWL document is generated. The process was evaluated with sentences in various formats collected from different people and proved to be successful.
URI: http://hdl.handle.net/123456789/2521
Appears in Collections:Master of Computer Science - 2014
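The thesis itself does not name specific tools in this record. Purely as an illustration of the pipeline described in the abstract (tokenization, dependency identification, rule-based subject/predicate/object extraction, OWL generation), the Python sketch below approximates those steps with spaCy and rdflib. The single nsubj/dobj rule, the example namespace URI, and the helper names are assumptions made for illustration, not the implementation evaluated in the thesis.

    # Illustrative sketch only: tokenize -> dependency parse ->
    # rule-based subject/predicate/object extraction -> OWL output.
    # Library choices and the single extraction rule are assumptions.
    import spacy
    from rdflib import Graph, Namespace
    from rdflib.namespace import OWL, RDF, RDFS

    nlp = spacy.load("en_core_web_sm")          # tokenization + dependency parsing
    EX = Namespace("http://example.org/onto#")  # placeholder ontology namespace

    def extract_triple(sentence):
        """Rule: a root verb with a nominal subject (nsubj) and a direct object (dobj)."""
        doc = nlp(sentence)
        for token in doc:
            if token.dep_ == "ROOT" and token.pos_ == "VERB":
                subj = next((c for c in token.children if c.dep_ == "nsubj"), None)
                obj = next((c for c in token.children if c.dep_ == "dobj"), None)
                if subj is not None and obj is not None:
                    return subj.lemma_, token.lemma_, obj.lemma_
        return None

    def triple_to_owl(triple):
        """Map the triple to two OWL classes linked by an object property; serialize as RDF/XML."""
        subj, pred, obj = triple
        g = Graph()
        g.bind("owl", OWL)
        subj_cls, obj_cls, prop = EX[subj.capitalize()], EX[obj.capitalize()], EX[pred]
        g.add((subj_cls, RDF.type, OWL.Class))
        g.add((obj_cls, RDF.type, OWL.Class))
        g.add((prop, RDF.type, OWL.ObjectProperty))
        g.add((prop, RDFS.domain, subj_cls))
        g.add((prop, RDFS.range, obj_cls))
        return g.serialize(format="xml")

    if __name__ == "__main__":
        triple = extract_triple("The dog chases the cat.")
        if triple is not None:
            print(triple_to_owl(triple))

Running the example on "The dog chases the cat." would yield the triple (dog, chase, cat) and an RDF/XML OWL document declaring Dog and Cat as classes linked by a chase object property; the thesis describes richer rules and additional semantics added at a later stage, which this sketch does not attempt to reproduce.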

Files in This Item:
File: 11440081.pdf
Description: Restricted Access
Size: 18.73 MB
Format: Adobe PDF
