LLM Group, Institute for Advanced Algorithms Research, Shanghai
Repositories
- Awesome-Attention-Heads Public
Attention heads in the Transformer architecture exhibit a variety of capabilities. This is a carefully compiled list summarizing their diverse functions.