Aligner-Encoders: Self-Attention Transformers Can Be Self-Transducers