Mixtral 8x7B

Mixtral 8x7B was released on December 11, 2023 as an open-weight sparse mixture-of-experts model. It uses 8 expert networks of roughly 7B parameters each, for 46.7B total parameters, of which only 12.9B are active per token because the router selects 2 of the 8 experts for each token. It matched or outperformed Llama 2 70B and GPT-3.5 on most benchmarks while offering roughly 6x faster inference than Llama 2 70B, and scored 70.6% on MMLU. The weights were released under the Apache 2.0 license.
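
To make the routing concrete, here is a minimal sketch of a top-2 sparse mixture-of-experts layer. It assumes Mixtral's published routing scheme (a linear router picks 2 of 8 experts per token and mixes their outputs with softmax weights over the selected logits); the dimensions and expert architecture are simplified placeholders, not the real model's.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Toy top-2 mixture-of-experts feed-forward layer (illustrative only)."""

    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router: one linear layer scoring every expert for each token.
        self.router = nn.Linear(d_model, n_experts, bias=False)
        # Experts: independent feed-forward networks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        logits = self.router(x)                        # (tokens, n_experts)
        weights, idx = torch.topk(logits, self.top_k)  # choose 2 of 8 experts
        weights = F.softmax(weights, dim=-1)           # normalize over the chosen 2
        out = torch.zeros_like(x)
        # Only the selected experts run for each token; this sparsity is why
        # just 12.9B of Mixtral's 46.7B parameters are active per token.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

layer = SparseMoELayer()
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```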

Developer: Mistral AI
Released: 2023-12-11
Context Window: 32K tokens

Benchmarks

Benchmark    Score
MMLU         70.6%

Mistral Family

Model            Tier    Released    Input $/MTok
Mistral Large 2  large   2024-07-24  $2

Details

Model Family: Mistral
Generation: mixtral
Release Date: 2023-12-11
Context Window: 32K tokens
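
Since the weights are open under Apache 2.0, the model can be run locally. Below is a minimal sketch using the Hugging Face transformers library, assuming the official mistralai/Mixtral-8x7B-Instruct-v0.1 checkpoint, the accelerate package, and enough memory for the full 46.7B parameters (roughly 90 GB in fp16, less with quantization).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" shards the model across available devices.
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

# The instruct variant uses the [INST] ... [/INST] prompt format.
prompt = "[INST] Explain sparse mixture-of-experts in one sentence. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```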

Capabilities

tool-use

Tags

mistral, mixture-of-experts, open-weight