MPT-30B: Raising the bar for open-source foundation models
Introducing MPT-30B, a new, more powerful member of our Foundation Series of open-source models, trained with an 8k context length on H100s.
And another commercially viable open-source LLM!
Awesome, going to try this one out!