Publication Type:

Journal Article

Source:

Smart Innovation, Systems and Technologies, Volume 27, p. 437-443 (2014)

URL:

https://www.scopus.com/inward/record.url?eid=2-s2.0-84958523844&partnerID=40&md5=2da4e9597af92bbec7f042a6248601c4

Keywords:

AES, Automated essay scoring, Bag of words, Information science, Latent Semantic Analysis, Pre-processing step, Semantics, Singular value decomposition, Syntactic information, Word Sense Disambiguation

Abstract:

The reliability of automated essay scoring (AES) has been the subject of debate among educators. Most systems treat an essay as a bag of words and evaluate it using LSA, LDA, or similar techniques; many also incorporate syntactic information such as the number of spelling mistakes, the word count, and so on. A challenging problem for reliable scoring is to correctly understand the semantics of the essay being evaluated, so as to differentiate the intended meaning of terms used in the context of a sentence. We incorporate an unsupervised word sense disambiguation (WSD) algorithm, which measures similarity between sentences, as a preprocessing step in our existing AES system. We evaluate the enhanced AES model on the Kaggle AES dataset of 1400 pre-scored text answers that were manually scored by two human raters. Based on kappa scores, both models achieved weighted kappa values comparable to the human raters, and the model with WSD outperformed the model without it.
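The abstract describes a pipeline of unsupervised WSD applied before an LSA-based scorer, with agreement measured by weighted kappa. The paper does not give implementation details, so the following is only a minimal sketch under assumptions: a Lesk-style disambiguator standing in for the unsupervised WSD step, a nearest-neighbour LSA scorer standing in for the AES model, and illustrative function names (disambiguate, lsa_scores) that are not from the paper.

    # Minimal sketch (assumptions, not the authors' code): Lesk-based WSD as a
    # preprocessing step, LSA (TF-IDF + truncated SVD) similarity scoring against
    # pre-scored essays, and quadratic weighted kappa for evaluation.
    # Requires NLTK's wordnet and punkt data to be downloaded.
    from nltk import word_tokenize
    from nltk.wsd import lesk
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics import cohen_kappa_score
    from sklearn.metrics.pairwise import cosine_similarity

    def disambiguate(essay: str) -> str:
        """Replace each word with the name of its Lesk-chosen WordNet synset,
        so different senses of the same surface form become distinct tokens."""
        tokens = word_tokenize(essay)
        resolved = []
        for tok in tokens:
            synset = lesk(tokens, tok)  # unsupervised, dictionary-based WSD
            resolved.append(synset.name() if synset else tok.lower())
        return " ".join(resolved)

    def lsa_scores(train_essays, train_scores, test_essays, dims=100):
        """Score each test essay with the human score of its nearest training
        essay in LSA space (one simple way to turn similarity into a score)."""
        corpus = [disambiguate(e) for e in list(train_essays) + list(test_essays)]
        tfidf = TfidfVectorizer().fit_transform(corpus)
        lsa = TruncatedSVD(n_components=dims).fit_transform(tfidf)
        train_vecs, test_vecs = lsa[: len(train_essays)], lsa[len(train_essays):]
        nearest = cosine_similarity(test_vecs, train_vecs).argmax(axis=1)
        return [train_scores[i] for i in nearest]

    # Agreement with the human raters, as in the paper's evaluation:
    # predicted = lsa_scores(train_essays, train_scores, test_essays)
    # kappa = cohen_kappa_score(human_scores, predicted, weights="quadratic")

Replacing tokens with synset identifiers before building the term-document matrix is one straightforward way a WSD preprocessing step can feed an existing bag-of-words LSA model without changing the downstream scorer.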

Notes:

cited By 0

Cite this Research Publication

P. Nedungadi and H. Raj, "Unsupervised word sense disambiguation for automatic essay scoring", Smart Innovation, Systems and Technologies, vol. 27, pp. 437-443, 2014.

