🧀 BigCheese.ai
Go-attention: A full attention mechanism and transformer in pure Go

go-attention bills itself as the first pure Go implementation of attention mechanisms and transformer layers, designed for performance and ease of use. It supports dot-product attention, multi-head attention, and full transformer layers, with CPU-friendly operations and minimal memory allocations.

  • Pure Go library
  • Dot-product & multi-head attention
  • Transformer layers
  • Zero dependencies
  • MIT Licensed