
An Approach of Statement Compression Using Classifier Algorithm with Improved Efficiency

Publication Type : Conference Paper

Publisher : International Conference on Electronics and Sustainable Communication Systems

Source : International Conference on Electronics and Sustainable Communication Systems (2021)

Url : https://ieeexplore.ieee.org/document/9532679

Campus : Chennai

School : School of Engineering

Department : Computer Science

Year : 2021

Abstract : Sentence compression is the task of reducing the length of a sentence without changing its original meaning. In natural language processing, sentence compression is an important problem whose goal is to produce a short and grammatically correct version of the original sentence. It is an active research area for several reasons, including the growing penetration of mobile phones and PDAs, the use of web and internet sources on such devices, and the rise of online shopping on smartphones. People today rarely have time to read long passages of text; with a sentence compression model we can generate a short yet detailed paragraph that conveys the same meaning as the original. The present situation demands small but complete, informative sentences that fit the small screen of a smartphone rather than long ones. We have developed three sentence compression models using three different classifiers: GRU, XGBoost, and LSTM. We evaluated the three models with three evaluation methods, namely accuracy, compression rate, and grammar and informativeness, and compared all three models based on their accuracies.
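The abstract does not spell out the model architecture, but extractive sentence compression with classifiers of this kind is commonly framed as a per-token keep/delete decision. The sketch below illustrates that framing with an LSTM tagger (one of the three classifiers named in the abstract); the class name, vocabulary size, dimensions, and toy data are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: sentence compression as per-token keep/delete
# classification with a bidirectional LSTM tagger.
import torch
import torch.nn as nn

class LSTMCompressor(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, 2)  # delete (0) vs. keep (1)

    def forward(self, token_ids):
        x = self.embed(token_ids)       # (batch, seq_len, embed_dim)
        h, _ = self.lstm(x)             # (batch, seq_len, 2 * hidden_dim)
        return self.classifier(h)       # per-token keep/delete logits

# Toy usage: classify each token of one sentence, keep only tokens
# predicted as "keep", and measure the resulting compression rate
# (compressed length divided by original length).
model = LSTMCompressor()
tokens = torch.randint(0, 10000, (1, 12))     # one sentence of 12 token ids
labels = model(tokens).argmax(dim=-1)         # (1, 12) keep/delete decisions
compressed = tokens[labels == 1]              # retained token ids
compression_rate = compressed.numel() / tokens.numel()
```

A GRU variant would swap `nn.LSTM` for `nn.GRU`, while an XGBoost classifier would instead consume hand-crafted per-token features; the keep/delete framing and the compression-rate metric stay the same.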

Cite this Research Publication : Deepak, V. Sharmila, B. Natarajan, Shanker Shalini, Karthikeyan C, "An Approach of Statement Compression Using Classifier Algorithm with Improved Efficiency" in Second International Conference on Electronics and Sustainable Communication Systems (2021), DOI: 10.1109/ICESC51422.2021.9532679
