Publication Type : Journal Article
Publisher : Information Processing & Management
Source : Information Processing & Management, Volume 57, Number 6, p.102362 (2020)
Keywords : Semantic relatedness scoring, Sentence representation learning, Sentiment analysis, Universal dependencies
Campus : Coimbatore
School : School of Engineering
Department : Computer Science
Year : 2020
Abstract : Background: Tree-based Long Short-Term Memory (LSTM) networks have become state-of-the-art for modeling the meaning of language texts, as they can effectively exploit grammatical syntax and thereby the non-linear dependencies among the words of a sentence. However, most of these models cannot recognize the difference in meaning caused by a change in the semantic roles of words or phrases, because they do not acknowledge the type of grammatical relations, also known as typed dependencies, in the sentence structure. Methods: This paper proposes an enhanced LSTM architecture, called relation gated LSTM, which can model the relationship between two inputs of a sequence using a control input. We also introduce a Tree-LSTM model, called Typed Dependency Tree-LSTM, that uses the sentence's dependency parse structure as well as the dependency types to embed sentence meaning into a dense vector. Results: The proposed model outperformed its type-unaware counterpart in two typical Natural Language Processing (NLP) tasks – Semantic Relatedness Scoring and Sentiment Analysis. The results were comparable or competitive with other state-of-the-art models. Qualitative analysis showed that changes in the voice of sentences had little effect on the model's predicted scores, while changes in nominal (noun) words had a more significant impact. The model recognized subtle semantic relationships in sentence pairs. The magnitudes of the learned typed-dependency embeddings were also in agreement with human intuition. Conclusion: The research findings imply the significance of grammatical relations in sentence modeling. The proposed models would serve as a base for future research in this direction.
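To make the architecture described in the abstract concrete, the sketch below shows one way a typed-dependency Tree-LSTM cell could gate each child's hidden state with an embedding of its dependency relation (the "control input") before a child-sum update. All parameter names, shapes, and the gating form are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RelationGatedTreeLSTMCell:
    """Hypothetical sketch: a child-sum Tree-LSTM cell where each child's
    hidden state is modulated by a gate computed from the embedding of
    its dependency relation (e.g. nsubj, dobj)."""

    def __init__(self, input_dim, hidden_dim, num_relations, seed=0):
        rng = np.random.default_rng(seed)
        d, h = input_dim, hidden_dim
        self.h = h
        # dependency-type (relation) embeddings used as control inputs
        self.rel_emb = rng.normal(0, 0.1, (num_relations, h))
        # projection used to turn a relation embedding into a gate
        self.W_r = rng.normal(0, 0.1, (h, h))
        # standard LSTM-style parameters: input, output, update, forget
        self.W = {g: rng.normal(0, 0.1, (d, h)) for g in "iouf"}
        self.U = {g: rng.normal(0, 0.1, (h, h)) for g in "iouf"}
        self.b = {g: np.zeros(h) for g in "iouf"}

    def _gate_child(self, h_child, rel_id):
        # relation embedding -> sigmoid gate, applied elementwise
        return sigmoid(self.rel_emb[rel_id] @ self.W_r) * h_child

    def __call__(self, x, children):
        """x: (input_dim,) word vector; children: list of (h, c, rel_id)."""
        if children:
            h_sum = np.sum([self._gate_child(hc, r) for hc, _, r in children],
                           axis=0)
        else:
            h_sum = np.zeros(self.h)
        i = sigmoid(x @ self.W["i"] + h_sum @ self.U["i"] + self.b["i"])
        o = sigmoid(x @ self.W["o"] + h_sum @ self.U["o"] + self.b["o"])
        u = np.tanh(x @ self.W["u"] + h_sum @ self.U["u"] + self.b["u"])
        c = i * u
        # one forget gate per child, conditioned on its gated hidden state
        for hc, cc, r in children:
            hc_g = self._gate_child(hc, r)
            f = sigmoid(x @ self.W["f"] + hc_g @ self.U["f"] + self.b["f"])
            c = c + f * cc
        h = o * np.tanh(c)
        return h, c

# tiny usage example on a two-word dependency fragment
cell = RelationGatedTreeLSTMCell(input_dim=4, hidden_dim=3, num_relations=5)
leaf_h, leaf_c = cell(np.ones(4), [])                      # leaf: no children
root_h, root_c = cell(np.ones(4), [(leaf_h, leaf_c, 2)])   # child with rel_id 2
print(root_h.shape)  # (3,)
```

Because the gate depends on the relation id, the same child subtree contributes differently depending on its grammatical role, which is the intuition behind distinguishing, say, a subject from an object in the parse tree.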
Cite this Research Publication : Jeena Kleenankandy and Abdul Nazeer K. A., “An enhanced Tree-LSTM architecture for sentence semantic modeling using typed dependencies”, Information Processing & Management, vol. 57, no. 6, p. 102362, 2020.