MPT-30B: Raising the bar for open-source foundation models

Introducing MPT-30B, a new, more powerful member of our Foundation Series of open-source models, trained with an 8k context length on NVIDIA H100 GPUs.

It is another commercially usable open-source LLM.
