Joint MoE Scaling Laws: Mixture of Experts Can Be Memory Efficient