Fingerprint
Dive into the research topics of 'How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers'. Together they form a unique fingerprint.
Michael Hassid, Hao Peng, Daniel Rotem, Jungo Kasai, Ivan Montero, Noah A. Smith, Roy Schwartz
Research output: Contribution to conference › Paper › peer-review