Subspace Networks: Scaling Decentralized Training with Communication-Efficient Model Parallelism