International Journal of Science and Research (IJSR)


ISSN: 2319-7064


India | Computer Science | Volume 11 Issue 9, September 2022 | Pages: 1248 - 1251


A Comprehensive Exploration of Sentence Embedding Models in Diverse NLP Applications

Akshata Upadhye

Abstract: The field of natural language processing (NLP) has undergone a transformative phase in recent years, leading to significant developments in sentence embedding techniques. This survey explores the key advancements in contextualized sentence representations, focusing on both transformer-based models, including BERT, DistilBERT, RoBERTa, and XLNet, and LSTM-based models, such as ELMo, InferSent, and SBERT. It examines the historical context, motivations, and applications of these models, and provides a comparative analysis that highlights their performance across various NLP tasks. The survey serves as a comprehensive guide for researchers, practitioners, and enthusiasts, offering insights into the strengths, weaknesses, and considerations associated with each model. With a focus on performance, efficiency, and task-specific adaptability, it presents a detailed analysis of various language models in the context of sentence embeddings, toward a better understanding of the dynamic intersection of language and computation.

Keywords: Sentence embeddings, Natural Language Processing, Transformer models, LSTM-based models, Comparative analysis
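Whatever the backbone, most of the surveyed models derive a fixed-size sentence embedding by pooling token-level vectors, and downstream tasks typically compare embeddings with cosine similarity. A minimal, model-agnostic sketch of that shared step (using NumPy with toy, hypothetical token vectors, not any particular model's API) might look like:

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average the vectors of real (non-padding) tokens.

    token_embeddings: (seq_len, dim) array of per-token vectors.
    attention_mask:   (seq_len,) array of 1s for real tokens, 0s for padding.
    """
    mask = attention_mask[:, None].astype(float)
    return (token_embeddings * mask).sum(axis=0) / mask.sum()

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Standard metric for comparing two sentence embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy example: two "sentences" with hypothetical 2-D token vectors.
sent_a = mean_pool(np.array([[1.0, 0.0], [0.0, 1.0]]), np.array([1, 1]))
sent_b = mean_pool(np.array([[1.0, 0.0], [1.0, 0.0]]), np.array([1, 1]))
print(cosine_similarity(sent_a, sent_b))
```

In practice, a model such as SBERT applies exactly this kind of pooling over its encoder's token outputs; the masking step matters because padded positions would otherwise dilute the average.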


