Informative Article | Computer Science | India | Volume 11 Issue 9, September 2022
A Comprehensive Exploration of Sentence Embedding Models in Diverse NLP Applications
Akshata Upadhye
Abstract: The field of natural language processing (NLP) has undergone a transformative phase in recent years, leading to significant developments in sentence embedding techniques. This survey explores the key advancements in contextualized sentence representations, focusing on transformer-based models, including BERT, DistilBERT, RoBERTa, XLNet, and SBERT, and LSTM-based models, such as ELMo and InferSent. It examines the historical context, motivations, and applications of these models and provides a comparative analysis that highlights their performance across various NLP tasks. The survey serves as a comprehensive guide for researchers, practitioners, and enthusiasts, offering insights into the strengths, weaknesses, and considerations associated with each model. With a focus on performance, efficiency, and task-specific adaptability, it offers a detailed analysis of these language models in the context of sentence embeddings, for a better understanding of the dynamic intersection of language and computation.
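A common thread among the models surveyed is that a fixed-size sentence vector is derived by pooling a model's per-token vectors (mean pooling is the default strategy in SBERT, for example). The sketch below illustrates mask-aware mean pooling in NumPy; the token embeddings are random placeholders standing in for a real model's output, not output from any of the models above.

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token vectors into one sentence vector, ignoring padding.

    token_embeddings: (seq_len, dim) array of contextual token vectors
    attention_mask:   (seq_len,) array, 1 for real tokens, 0 for padding
    """
    mask = attention_mask[:, None].astype(float)     # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)   # sum over real tokens only
    count = np.maximum(mask.sum(), 1e-9)             # number of real tokens
    return summed / count                            # (dim,) sentence vector

def cosine_similarity(a, b):
    """Standard similarity measure between two sentence vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder "token embeddings": 6 tokens, 8 dimensions, last 2 are padding.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(6, 8))
mask = np.array([1, 1, 1, 1, 0, 0])
sentence_vec = mean_pool(tokens, mask)
```

The resulting vector can be compared across sentences with `cosine_similarity`, which is how embedding models are typically evaluated on semantic similarity tasks.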
Keywords: Sentence embeddings, Natural Language Processing, Transformer models, LSTM-based models, Comparative analysis
Edition: Volume 11 Issue 9, September 2022
Pages: 1248 - 1251
DOI: https://www.doi.org/10.21275/SR24304153117