Universal Physics Transformers: A Framework For Efficiently Scaling Neural Operators