
Welcome to AttentionLego's documentation!
AttentionLego is an open-source project that provides a vanilla-Verilog self-attention building block for large language model accelerators, designed for spatial scalability. Hop in and let's start the journey!
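For orientation, a self-attention accelerator of this kind computes the standard scaled dot-product attention; this is the textbook formulation, not necessarily the exact hardware dataflow, which is detailed in the arXiv manual linked below:

\[
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
\]

where Q, K, and V are the query, key, and value matrices and d_k is the key dimension.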
GitHub Repo: https://github.com/bonanyan/attentionlego
Tarball of Source Code: link
Detailed manual on arXiv: https://arxiv.org/abs/2401.11459
Contact: Dr. Bonan Yan (bonanyan at pku dot edu dot cn)
Citation
@misc{cong2024attentionlego,
title={AttentionLego: An Open-Source Building Block For Spatially-Scalable Large Language Model Accelerator With Processing-In-Memory Technology},
author={Rongqing Cong and Wenyang He and Mingxuan Li and Bangning Luo and Zebin Yang and Yuchao Yang and Ru Huang and Bonan Yan},
year={2024},
eprint={2401.11459},
archivePrefix={arXiv},
primaryClass={cs.AR},
url={https://arxiv.org/abs/2401.11459}
}