Longterm Wiki

Mixtral 8x7B

Mistral AI · Open Weight
Mixtral 8x7B was released on December 11, 2023 as an open-weight sparse mixture-of-experts model. It uses 8 expert networks of roughly 7B parameters each (46.7B parameters in total, with 12.9B active per token, since only 2 experts are routed per token). It matched or outperformed Llama 2 70B and GPT-3.5 on most benchmarks while being about 6x faster at inference than Llama 2 70B, and scored 70.6% on MMLU. It was released under the Apache 2.0 license.
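The gap between total (46.7B) and active (12.9B) parameters comes from sparse routing: a gating network selects only 2 of the 8 experts for each token, so most expert weights sit idle on any given forward pass. The sketch below illustrates top-2 expert routing in minimal NumPy; all names and dimensions are illustrative, not Mixtral's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 16, 8, 2  # toy sizes; Mixtral uses 8 experts, top-2 routing

# Each "expert" is reduced to a single weight matrix for illustration.
experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
gate_w = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_forward(x):
    """Route one token vector through its top-2 experts."""
    logits = x @ gate_w                   # one score per expert, shape (n_experts,)
    top = np.argsort(logits)[-top_k:]     # indices of the 2 highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()              # softmax over the selected experts only
    # Only the selected experts run, so ~2/8 of expert parameters are active per token.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)
print(out.shape)
```

Because the unselected experts never execute, per-token compute scales with the 2 chosen experts rather than all 8, which is why the model can match much larger dense models at a fraction of the inference cost.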
Developer
Mistral AI
Released
2023-12-11
Context Window
32K tokens

Modality

text

Capabilities

tool-use

Details

Model Family
Mistral
Generation
mixtral
Release Date
2023-12-11
Parameters
46.7B
Context Window
32K tokens
Open Weight
Yes

Tags

mistral · mixture-of-experts · open-weight