Effortless and beneficial processing of natural languages using transformers
- Title
- Effortless and beneficial processing of natural languages using transformers
- Creator
- Amrutha K.; Prabu P.
- Description
- Natural Language Processing (NLP) plays a vital role in our day-to-day lives. Deep learning models for NLP make many tasks easier by enabling computers to understand, generate, and respond to human language. Applications of NLP models can be seen in many domains, especially machine translation and psychology. This paper briefly reviews the different transformer models and the advantages of using an encoder-decoder language-translation model. The article focuses on the need for sequence-to-sequence language-translation models such as BERT, RoBERTa, and XLNet, along with their components (an illustrative sketch of the shared attention component follows the citation at the end of this record). © 2022 Taru Publications.
- Source
- Journal of Discrete Mathematical Sciences and Cryptography, Vol. 25, No. 7, pp. 1987-2005.
- Date
- 2022-01-01
- Publisher
- Taylor and Francis Ltd.
- Subject
- 68T05; 68T50; ALBERT; Attention model; BERT; GPT-3; NLP; RoBERTa; Sequential encoder-decoder model
- Coverage
- Amrutha K., Department of Computer Science, CHRIST (Deemed to be University), Bengaluru, Karnataka, India; Prabu P., Department of Computer Science, CHRIST (Deemed to be University), Bengaluru, Karnataka, India
- Rights
- Restricted Access
- Relation
- ISSN: 0972-0529
- Format
- Online
- Language
- English
- Type
- Article
Citation
Amrutha K.; Prabu P., “Effortless and beneficial processing of natural languages using transformers,” CHRIST (Deemed To Be University) Institutional Repository, accessed February 25, 2025, https://archives.christuniversity.in/items/show/15274.
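All of the transformer models this article reviews (BERT, RoBERTa, XLNet, and encoder-decoder translators) share scaled dot-product attention as their core component: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. As a reading aid, here is a minimal, self-contained NumPy sketch of that component; it is illustrative only, not code from the paper, and the toy array shapes and encoder/decoder role assignments are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)  # (seq_q, seq_k) similarity scores
    weights = softmax(scores, axis=-1)              # each query's weights sum to 1
    return weights @ V                              # weighted mix of value vectors

# Toy cross-attention: 3 decoder (target) queries attend over 4 encoder (source) tokens.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))  # decoder queries, model width d_k = 8
K = rng.normal(size=(4, 8))  # encoder keys
V = rng.normal(size=(4, 8))  # encoder values
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8): one output row per query
```

In an encoder-decoder translator, this is exactly the cross-attention step: the decoder's queries are matched against the encoder's keys, and the resulting weights mix the encoder's values into one context vector per target position.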