How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers

Michael Hassid, Hao Peng, Daniel Rotem, Jungo Kasai, Ivan Montero, Noah A. Smith, Roy Schwartz

Research output: Contribution to conference › Paper › peer-review

9 Scopus citations
