MosaicML Launches 30B Model — Takes on LLaMA, Falcon and GPT
MosaicML has launched MPT-30B, which founder Naveen Rao claims outperforms both LLaMA and Falcon in certain use cases for enterprise devs.
This model is good, but honestly I think the brand-new 33B Vicuna preview beats it. If you've tried this model and are looking for a good alternative, I highly recommend Vicuna!
The LLM wars have begun ...
A good thing
Begun, the LLM wars have. *nods, the green raisin does*
It's called "competition".