MPT-30B

A 30-billion-parameter open-source language model from MosaicML: a strong, general-purpose LLM that balances scale, performance, and inference efficiency.
