Spiking Transformer: Introducing Accurate Addition-Only Spiking Self-Attention for Transformer