MPT-30B: Raising the bar for open-source foundation models

Introducing MPT-30B, a new, more powerful member of our Foundation Series of open-source models, trained with an 8k context length on NVIDIA H100 Tensor Core GPUs.
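The released checkpoint is available on the Hugging Face Hub as mosaicml/mpt-30b. Below is a minimal sketch of loading it with the transformers library; the dtype, prompt, and generation settings are illustrative assumptions, while trust_remote_code=True is genuinely required because MPT ships its own modeling code with the checkpoint.

```python
# Minimal sketch: load MPT-30B (trained with an 8k context length) from the
# Hugging Face Hub. Assumes hardware with enough memory for a 30B-parameter
# model; dtype and the prompt are illustrative choices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mosaicml/mpt-30b")
model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-30b",
    torch_dtype=torch.bfloat16,  # assumes bf16-capable hardware
    trust_remote_code=True,      # MPT uses custom modeling code from the repo
)

prompt = "MPT-30B is an open-source foundation model that"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```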

Related:

maddes8cht/mosaicml-mpt-30b-chat-gguf · Hugging Face

The History of Open-Source LLMs: Better Base Models (Part Two), by Cameron R. Wolfe, Ph.D.

Train Faster & Cheaper on AWS with MosaicML Composer

[2309.13322] From Text to Source: Results in Detecting Large Language Model-Generated Content

MosaicML Just Released Their MPT-30B Under Apache 2.0 - MarkTechPost

Guide Of All Open Sourced Large Language Models (LLMs), by Luv Bansal

The Code4Lib Journal – Searching for Meaning Rather Than Keywords and Returning Answers Rather Than Links