Published monthly (last day of each month) · ISSN (online): 1934-9955

International Journal of Imaging Science and Engineering

We welcome original papers as well as extended versions of papers previously published in conferences or journals. Volume 11, Issue 127 (August edition) has been published; authors are invited to take a look. Monthly webinar: 15 September 2024, 18:00 GMT+1

Attention Mechanisms in Natural Language Processing: A Comprehensive Survey

Author: Kang Leng Chiew

Abstract:

The rise of attention mechanisms has significantly advanced the state of the art in various natural language processing (NLP) tasks, particularly machine translation, question answering, and text summarization. This survey explores the conceptual evolution, implementation strategies, and comparative performance of attention-based architectures up to 2018. We categorize attention into global, local, self-attention, and hierarchical mechanisms, analyzing their integration into encoder-decoder models, recurrent neural networks (RNNs), and, more recently, non-recurrent frameworks such as the Transformer. Self-attention, which allows a model to weigh relationships between tokens in a sequence regardless of their distance, is shown to offer both computational efficiency and improved handling of long-range dependencies. The Transformer model, introduced in 2017, marks a paradigm shift by eliminating recurrence altogether while delivering superior performance in tasks like translation.
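The self-attention operation summarized in the abstract can be illustrated with a minimal numpy sketch of scaled dot-product attention. The function name, matrix dimensions, and projection weights below are illustrative assumptions, not taken from the surveyed paper.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a token sequence.

    X: (n_tokens, d_model) input embeddings.
    Wq, Wk, Wv: (d_model, d_k) projection matrices (illustrative).
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    # Pairwise affinities between every query token and every key token,
    # so dependencies are weighted regardless of distance in the sequence.
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax: each token's weights over all tokens sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of all value vectors.
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))        # 5 tokens, model dimension 8
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 4): one d_k-dimensional output per token
```

Because every token attends to every other token in a single matrix product, no recurrence is needed, which is the property the Transformer exploits.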

Download PDF
Contents are licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Privacy Policy | Terms of Use