Qualification: 
Ph.D.

Jeena Kleenankandy is a Faculty Associate in the Department of Computer Science and Engineering at Amrita School of Engineering, Amrita Vishwa Vidyapeetham University – Coimbatore. She completed her research work under Dr. K. A. Abdul Nazeer in the Department of Computer Science and Engineering, NIT Calicut, and is awaiting her final examination. Her Ph.D. thesis is titled "Linguistically motivated deep learning models for measuring semantic textual similarity". Her work mostly involves designing computational models that learn meaning representations of natural language texts. She started her career as a Project Engineer at Wipro Technologies, Kochi, and later moved to teaching, joining Chinmaya Institute of Technology, Kannur, as an Assistant Professor. She holds an M.Tech degree in Software Engineering (2007) from Cochin University of Science and Technology, Kerala, and a B.Tech in Computer Science and Engineering (2005) from College of Engineering Thalassery, Kannur, Kerala.

Her current research interests are Deep Learning and Natural Language Processing, with a focus on their applications across domains.

Seminars/Workshops/FDPs attended/participated (in past 5 years)

  1. First Conference on Deployable AI, Explainable AI edition, conducted virtually by IIT Madras from 16 to 18 June 2021 (4:30–8:30 pm).
  2. Three day Train The Trainer (TTT) Program on Java Programming, conducted online by Infosys Campus Connect from June 16 - June 18, 2021.
  3. Tenth RBCDSAI Workshop on Recent Progress in Data Science & AI (Annual Research Showcase), IIT Madras, conducted online on 29 May 2021.
  4. Webinar on “Student Satisfaction and Progression” conducted by NAAC on 19 May 2021.
  5. Webinar on Outcome-Based Education, conducted by NAAC on 18 May 2021.
  6. COLING 2020 - The 28th International Conference on Computational Linguistics conducted virtually from 8 to 13 December 2020.
  7. National workshop: Beyond Glass Barriers 2020, IEEE Malabar Subsection, NIT Calicut, from 27 to 29 November 2020.
  8. EMNLP 2020 - The 2020 Conference on Empirical Methods in Natural Language Processing conducted virtually from 16 to 20 November 2020.
  9. Five day FDP on ‘Mathematics for Data Science’, Department of Basic Science & Humanities, Muthoot Institute of Technology & Science, conducted virtually from 10 to 14 August 2020.
  10. Five day FDP on “Enriching Research Acumen”, Department of Computer Science and Engineering, Christ College of Engineering, Kerala, conducted online from 25 to 29 July 2020.
  11. Improving Natural Language Understanding through Adversarial Testing conducted by Stanford ONLINE on 24 July 2020.
  12. Webinar on Deep learning for NLP Applications, Department of Computer Science and Engineering, LBS College of Engineering, Kasaragod on 7 July 2020.
  13. RBCDSAI's International Summit on Data Science and AI, IIT Madras from 18 to 20 June 2020.
  14. Data Science Symposium conducted at IIT Palakkad, Kerala on 22 February 2020.
  15. GIAN course on Data and Text Mining in Bioinformatics conducted at NIT Calicut from 10 to 14 September 2018.
  16. Summer School on Machine Learning: Deep Learning at IIIT Hyderabad from 10 to 15 July 2017.
  17. International Workshop on Digitalization and Data Analytics: Challenges and Opportunities, Department of Computer Science & Engineering, Govt. College of Engineering, Kannur from 11 to 13 March 2017.

Reviewer

  • International Conference on Communication, Control and Information Sciences (ICCISc-2021), organized by Govt. Engineering College (GEC) Idukki.

Guest Lectures Delivered

  • Understanding Natural Languages Beyond Words, Online Lecture Series, M.Tech Computational Linguistics Alumni Association & Department of Computer Science and Engineering, Government Engineering College Sreekrishnapuram, Palakkad, Kerala.

Publications

Publication Type: Conference Paper


2020

Jeena Kleenankandy and Abdul Nazeer K. A., “Recognizing semantic relation in sentence pairs using Tree-RNNs and Typed Dependencies”, in 2020 6th IEEE Congress on Information Science and Technology (CiSt), 2020, pp. 372–377. (Best Paper Award)


Recursive neural networks (Tree-RNNs) based on dependency trees are ubiquitous in modeling sentence meanings as they effectively capture semantic relationships between non-neighborhood words. However, recognizing semantically dissimilar sentences with the same words and syntax is still a challenge to Tree-RNNs. This work proposes an improvement to Dependency Tree-RNN (DT-RNN) using the grammatical relationship type identified in the dependency parse. Our experiments on semantic relatedness scoring (SRS) and recognizing textual entailment (RTE) in sentence pairs using SICK (Sentence Involving Compositional Knowledge) dataset show encouraging results. The model achieved a 2% improvement in classification accuracy for the RTE task over the DT-RNN model. The results show that Pearson's and Spearman's correlation measures between the model's predicted similarity scores and human ratings are higher than those of standard DT-RNNs.

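The idea of making composition sensitive to typed dependencies can be illustrated with a toy sketch. This is not the paper's implementation: the matrices, dimensions, and dependency labels below are hypothetical stand-ins, chosen only to show how a relation-specific transform makes subject and object contribute differently to the parent node.

```python
# Toy sketch (not the authors' model): composing a head word's vector with
# its dependents over a dependency tree, where each child's contribution
# passes through a matrix selected by its typed dependency (nsubj, dobj).
import math

DIM = 4

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def vec_add(a, b):
    return [x + y for x, y in zip(a, b)]

def scaled_identity(scale):
    return [[scale if i == j else 0.0 for j in range(DIM)] for i in range(DIM)]

# Hypothetical toy parameters: one matrix per dependency type, plus one
# for the node's own word embedding (learned jointly in a real model).
W_WORD = scaled_identity(1.0)
W_REL = {"nsubj": scaled_identity(0.5), "dobj": scaled_identity(-0.5)}

def compose(word_vec, children):
    """children: list of (dep_type, child_vector) pairs."""
    acc = matvec(W_WORD, word_vec)
    for dep, h_child in children:
        acc = vec_add(acc, matvec(W_REL[dep], h_child))
    return [math.tanh(x) for x in acc]

# "dog chased cat": head 'chased' with nsubj 'dog' and dobj 'cat'.
dog, cat, chased = [0.1] * DIM, [0.3] * DIM, [0.2] * DIM
h = compose(chased, [("nsubj", dog), ("dobj", cat)])
# Swapping subject and object changes the representation because the
# relation matrices differ -- the property a plain DT-RNN lacks.
h_swapped = compose(chased, [("nsubj", cat), ("dobj", dog)])
```

A type-unaware DT-RNN would use the same matrix for every child, so `h` and `h_swapped` would be identical for same-word sentences with swapped roles.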

2020

Jeena Kleenankandy, “Deep Learning Models for Natural Language Processing: An Overview”, in Virtual Conference on Computing and Informatics, 2020, pp. 48-52. ISBN: 978-93-5437-694-8. (Invited paper)

2019

K. Maheshwari, Jeena Kleenankandy, and Abdul Nazeer K. A., “Identifying synonyms in bigram phrases using Compositional DSM and Artificial Neural Network”, in International Conference on Computing, Power and Communication Technologies (GUCON), NCR New Delhi, India, 2019, pp. 827-831.


This paper focuses on identifying whether a bigram is a synonym of a given single word, using compositional distributional semantics and Artificial Neural Networks. Understanding the semantics of natural language text has been a challenging task in natural language processing. In the DSM (distributional semantic model) of meaning representation, words are represented as vectors in a multi-dimensional vector space such that semantically similar words appear closer to each other, whereas dissimilar words appear farther apart. This work tries to represent the meaning of a bigram in this vector space. Many research works in distributional semantics try to represent word pairs of the form Adjective-Noun, Verb-Noun, and Noun-Noun in multi-dimensional space along with the individual elements of the phrases, and these representations are used for identifying their synonyms. Researchers have been successful in representing the semantics of words, but efficient ways to represent that of phrases are still being investigated. Our work aims at a more efficient method for identifying a semantically similar word for a bigram phrase. In this paper, we propose a Neural Network based model which predicts a compositional vector for two-word phrases from its constituent word vectors. The compositional vectors have been evaluated using the classifier to identify synonym word pairs. The PPDB (Paraphrase Database) data set has been used for the experiment. Our model performed better than other non-neural-network based models for identifying bigram-unigram synonym pairs.
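The composition step described above can be sketched minimally. This is an illustration only: the embeddings and weights below are toy values (the paper trains on PPDB with learned parameters), and the threshold is arbitrary. The shape of the computation — concatenate two word vectors, pass them through a one-layer network, and compare the result to a candidate word by cosine similarity — is the general pattern such models follow.

```python
# Illustrative sketch: a tiny feed-forward composition of two word vectors
# into a phrase vector, scored against a candidate synonym by cosine
# similarity. All numbers here are toy values, not trained embeddings.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def compose_bigram(v1, v2, W, b):
    """phrase = tanh(W @ [v1; v2] + b): a one-layer neural composer."""
    concat = v1 + v2
    return [math.tanh(sum(w * x for w, x in zip(row, concat)) + bi)
            for row, bi in zip(W, b)]

# Hypothetical 2-d embeddings: bigram "very big" vs. candidate "huge".
very, big, huge = [0.2, 0.1], [0.9, 0.4], [0.8, 0.5]
# Toy weights that simply average the two word vectors.
W = [[0.5, 0.0, 0.5, 0.0],
     [0.0, 0.5, 0.0, 0.5]]
b = [0.0, 0.0]
phrase = compose_bigram(very, big, W, b)
is_synonym = cosine(phrase, huge) > 0.9
```

In the trained model, `W` and `b` would be learned so that composed phrase vectors land near their unigram synonyms in the embedding space.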

2014

Jeena Kleenankandy, “Implementation of Sandhi-rule based compound word generator for Malayalam”, in 2014 Fourth International Conference on Advances in Computing and Communications, IEEE, 2014, pp. 134-137.


Malayalam is one of the prominent regional languages of the Indian subcontinent, spoken by over 35 million people. Being an agglutinative language, word generation in Malayalam is really challenging. This paper discusses the sandhi rules for generating affixed and compound words from their components. The challenges faced in the design of the system are also discussed.
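The flavor of rule-based sandhi joining can be conveyed with a toy example. The sketch below uses romanized forms and a single simplified rule (glide insertion when a vowel-final word meets a vowel-initial word, a pattern known as āgama sandhi); the actual system operates on Malayalam script with a much fuller rule set, and the glide-choice condition here is a hypothetical simplification.

```python
# Toy illustration of rule-based sandhi joining (romanized; simplified).
# One real pattern: when a vowel-final word meets a vowel-initial word,
# a glide ('y' or 'v') is inserted between them (aagama sandhi).
VOWELS = set("aeiou")

def join_sandhi(w1, w2):
    if w1[-1] in VOWELS and w2[0] in VOWELS:
        # Simplified, illustrative choice rule for the glide.
        glide = "y" if w1[-1] in "aei" else "v"
        return w1 + glide + w2
    # No rule fires: plain concatenation.
    return w1 + w2

print(join_sandhi("mala", "aalam"))  # glide 'y' inserted: "malayaalam"
```

A production generator would encode many such rules (elision, doubling, substitution) and apply them over Unicode Malayalam text rather than romanized strings.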

Publication Type: Journal Article


2020

Jeena Kleenankandy and Abdul Nazeer K. A., “An enhanced Tree-LSTM architecture for sentence semantic modeling using typed dependencies”, Information Processing & Management, vol. 57, p. 102362, 2020.


Background: Tree-based Long Short Term Memory (LSTM) network has become state-of-the-art for modeling the meaning of language texts as they can effectively exploit the grammatical syntax and thereby non-linear dependencies among words of the sentence. However, most of these models cannot recognize the difference in meaning caused by a change in semantic roles of words or phrases because they do not acknowledge the type of grammatical relations, also known as typed dependencies, in sentence structure. Methods: This paper proposes an enhanced LSTM architecture, called relation gated LSTM, which can model the relationship between two inputs of a sequence using a control input. We also introduce a Tree-LSTM model called Typed Dependency Tree-LSTM that uses the sentence dependency parse structure as well as the dependency type to embed sentence meaning into a dense vector. Results: The proposed model outperformed its type-unaware counterpart in two typical Natural Language Processing (NLP) tasks – Semantic Relatedness Scoring and Sentiment Analysis. The results were comparable or competitive with other state-of-the-art models. Qualitative analysis showed that changes in the voice of sentences had little effect on the model’s predicted scores, while changes in nominal (noun) words had a more significant impact. The model recognized subtle semantic relationships in sentence pairs. The magnitudes of learned typed dependency embeddings were also in agreement with human intuitions. Conclusion: The research findings imply the significance of grammatical relations in sentence modeling. The proposed models would serve as a base for future research in this direction.

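The gating idea in the abstract can be sketched in miniature. This is a simplification, not the paper's relation gated LSTM: the gate here is a fixed per-relation vector applied elementwise to each child's hidden state before a child-sum Tree-LSTM step, all parameters are toy values, and the gate equations are collapsed to scalar form for brevity.

```python
# Sketch under assumptions: a child-sum Tree-LSTM step where each child's
# hidden state is first gated elementwise by a vector tied to its typed
# dependency (a heavy simplification of a relation gate; toy parameters).
import math

DIM = 3

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical relation-gate vectors: one per dependency type.
REL_GATE = {"nsubj": [2.0, 2.0, 2.0], "dobj": [-2.0, -2.0, -2.0]}

def gate_child(dep, h_child):
    return [sigmoid(g) * h for g, h in zip(REL_GATE[dep], h_child)]

def tree_lstm_cell(x, children):
    """children: list of (dep_type, h_child, c_child) triples.
    Gates use collapsed scalar weights to keep the sketch short."""
    h_sum = [0.0] * DIM
    for dep, h, _ in children:
        h_sum = [a + b for a, b in zip(h_sum, gate_child(dep, h))]
    i = [sigmoid(xj + hj) for xj, hj in zip(x, h_sum)]     # input gate
    u = [math.tanh(xj + hj) for xj, hj in zip(x, h_sum)]   # candidate
    c = [ij * uj for ij, uj in zip(i, u)]
    for _, _, c_child in children:                          # forget gates
        f = [sigmoid(xj) for xj in x]
        c = [cj + fj * ccj for cj, fj, ccj in zip(c, f, c_child)]
    o = [sigmoid(xj + hj) for xj, hj in zip(x, h_sum)]     # output gate
    return [oj * math.tanh(cj) for oj, cj in zip(o, c)], c

x = [0.5] * DIM
h_c, c_c = [0.3] * DIM, [0.1] * DIM
# Same children, different typed dependencies -> different parent state.
h_typed, _ = tree_lstm_cell(x, [("nsubj", h_c, c_c), ("dobj", h_c, c_c)])
h_untyped, _ = tree_lstm_cell(x, [("nsubj", h_c, c_c), ("nsubj", h_c, c_c)])
```

A type-unaware child-sum Tree-LSTM sums children uniformly, so the two calls above would produce identical parent states; the relation gate is what lets the model separate them.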