International Journal of Science and Research (IJSR)


ISSN: 2319-7064





Informative Article | Computer Science | India | Volume 11 Issue 9, September 2022


A Comprehensive Exploration of Sentence Embedding Models in Diverse NLP Applications

Akshata Upadhye


Abstract: The field of natural language processing (NLP) has undergone a transformative phase in recent years, leading to significant developments in sentence embedding techniques. This survey explores key advancements in contextualized sentence representations, covering transformer-based models, including BERT, DistilBERT, RoBERTa, and XLNet, alongside LSTM-based models such as ELMo and InferSent, as well as Sentence-BERT (SBERT). It examines the historical context, motivations, and applications of these models and provides a comparative analysis that highlights their performance across various NLP tasks. The survey serves as a comprehensive guide for researchers, practitioners, and enthusiasts, offering insights into the strengths, weaknesses, and considerations associated with each model. With a focus on performance, efficiency, and task-specific adaptability, it presents a detailed analysis of these language models in the context of sentence embeddings, for a better understanding of the dynamic intersection of language and computation.
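To make the core idea concrete: sentence embedding models of the kind surveyed here reduce a variable-length sequence of token vectors to a single fixed-size vector, which can then be compared across sentences (e.g. by cosine similarity). The sketch below is a minimal, illustrative toy in pure Python, not code from the article; the tiny hand-made 4-dimensional "token vectors" and the mean-pooling strategy are assumptions chosen for clarity (SBERT, for instance, commonly uses mean pooling over its final-layer token embeddings).

```python
import math

def mean_pool(token_vectors, attention_mask):
    """Average the token vectors of one sentence, skipping padding (mask == 0)."""
    dim = len(token_vectors[0])
    pooled = [0.0] * dim
    count = 0
    for vec, keep in zip(token_vectors, attention_mask):
        if keep:
            count += 1
            for i in range(dim):
                pooled[i] += vec[i]
    return [x / count for x in pooled]

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy 4-dimensional "token embeddings" for two short sentences.
sent_a = [[1.0, 0.0, 1.0, 0.0],
          [0.5, 0.5, 1.0, 0.0],
          [0.0, 0.0, 0.0, 0.0]]   # last row is padding
mask_a = [1, 1, 0]
sent_b = [[1.0, 0.1, 0.9, 0.0],
          [0.4, 0.6, 1.0, 0.1],
          [0.2, 0.2, 0.2, 0.2]]
mask_b = [1, 1, 1]

emb_a = mean_pool(sent_a, mask_a)   # fixed-size sentence embedding
emb_b = mean_pool(sent_b, mask_b)
sim = cosine_similarity(emb_a, emb_b)
```

In practice the token vectors would come from one of the models discussed in the survey (BERT, RoBERTa, ELMo, etc.) rather than being hand-written, and the resulting sentence embeddings feed tasks such as semantic similarity, clustering, and retrieval.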


Keywords: Sentence embeddings, Natural Language Processing, Transformer models, LSTM-based models, Comparative analysis


Edition: Volume 11 Issue 9, September 2022


Pages: 1248 - 1251


