Abstract
Transformers are considered conceptually different from the previous generation of state-of-the-art NLP models: recurrent neural networks (RNNs). In this work, we demonstrate that decoder-only transformers can in fact be conceptualized as unbounded multi-state RNNs, an RNN variant with unlimited hidden state size. We further show that transformers can be converted into bounded multi-state RNNs by fixing the size of their hidden state, effectively compressing their key-value cache. We introduce a novel, training-free compression policy, Token Omission Via Attention (TOVA). Our experiments with four long-range tasks and several LLMs show that TOVA outperforms several baseline compression policies. In particular, our results are nearly on par with the full model, using in some cases only 1/8 of the original cache size, which translates to 4.8X higher throughput. Our results shed light on the connection between transformers and RNNs, and help mitigate one of LLMs' most painful computational bottlenecks: the size of their key-value cache.
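To make the abstract's idea concrete, below is a minimal sketch of a TOVA-style eviction step, assuming the policy drops the cached token that receives the lowest attention weight from the current query once the cache hits a fixed limit. The function name `tova_evict`, the per-head array shapes, and the toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def tova_evict(keys, values, attn_weights, cache_limit):
    """Evict the least-attended cached token once the cache exceeds cache_limit.

    keys, values : (num_cached, head_dim) arrays for one attention head
    attn_weights : (num_cached,) attention weights of the current query
                   over the cached tokens
    """
    # Hypothetical sketch based on the abstract's description of TOVA.
    if keys.shape[0] <= cache_limit:
        return keys, values
    drop = int(np.argmin(attn_weights))  # token with the lowest attention weight
    return np.delete(keys, drop, axis=0), np.delete(values, drop, axis=0)

# Toy usage: a cache of 5 tokens squeezed down to 4.
keys = np.random.randn(5, 8)
values = np.random.randn(5, 8)
attn = np.array([0.05, 0.40, 0.10, 0.30, 0.15])
keys, values = tova_evict(keys, values, attn, cache_limit=4)
```

Applied at every decoding step, a rule like this keeps the key-value cache at a constant size, which is what lets the transformer be viewed as a bounded multi-state RNN.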
Original language | English
---|---
Title of host publication | EMNLP 2024 - 2024 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
Editors | Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Publisher | Association for Computational Linguistics (ACL)
Pages | 18724-18741
Number of pages | 18
ISBN (Electronic) | 9798891761643
State | Published - 2024
Event | 2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024 - Hybrid, Miami, United States. Duration: 12 Nov 2024 → 16 Nov 2024
Publication series

Name | EMNLP 2024 - 2024 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
---|---
Conference

Conference | 2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
---|---
Country/Territory | United States
City | Hybrid, Miami
Period | 12/11/24 → 16/11/24
Bibliographical note

Publisher Copyright: © 2024 Association for Computational Linguistics.