
Mistral

Mistral is a series of open-weight models that set the bar for efficiency and are available for free under the Apache 2.0 license, which allows the models to be used anywhere without restriction. The Mistral 7B Instruct v0.3 model offers a 32k context length, making it an ideal LLM for scenarios that require longer context.
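As a quick illustration of using the base model with its 32k context window, the sketch below loads it through Hugging Face transformers. The repository ID `mistralai/Mistral-7B-Instruct-v0.3`, the prompt, and the generation settings are assumptions for illustration, not part of this page.

```python
# Minimal sketch (assumed repository ID, prompt, and generation settings).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.3"  # assumed Hugging Face repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# A long-context prompt would go here; the 32k window allows much larger inputs.
messages = [{"role": "user", "content": "Summarize the following report: ..."}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```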

| Model | Params | Context Length | GQA | Token Count | Knowledge Cutoff |
|---|---|---|---|---|---|
| Mistral 7B Instruct v0.3 | 7B | 32k | Yes | Undisclosed | ~February 2023 |
| Mistral 7B Instruct v0.2 | 7B | 32k | Yes | Undisclosed | ~February 2023 |

Mistral 7B Instruct v0.3

| Model | Function Calling | MMLU | GPQA | GSM-8K | MATH | MT-bench | Pairwise Win | Pairwise Loss | Pairwise Tie | Win Rate | Loss Rate | Adjusted Win Rate |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Mistral 7B Instruct v0.3 | - | 62.10 | 30.58 | 53.07 | 12.98 | 7.50 | 34 | 54 | 72 | 0.2125 | 0.3375 | 0.4375 |
| Rubra Enhanced Mistral 7B Instruct v0.3 | 73.57% | 59.12 | 29.91 | 43.29 | 11.14 | 7.69 | 54 | 34 | 72 | 0.3375 | 0.2125 | 0.5625 |

The Pairwise columns report the MT-bench pairwise comparison between the two models above.
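The pairwise-rate columns appear to follow directly from the win/loss/tie counts over the 160 head-to-head comparisons, with ties counted as half a win in the adjusted win rate. The helper below is a sketch of that assumed arithmetic, not an official benchmark script.

```python
# Sketch of the assumed arithmetic behind the pairwise columns: rates are counts
# divided by the total number of comparisons, and the adjusted win rate credits
# half of the ties to each model.
def pairwise_rates(wins: int, losses: int, ties: int) -> tuple[float, float, float]:
    total = wins + losses + ties
    win_rate = wins / total
    loss_rate = losses / total
    adjusted_win_rate = (wins + 0.5 * ties) / total
    return win_rate, loss_rate, adjusted_win_rate

# Rubra Enhanced Mistral 7B Instruct v0.3 vs. the base model (counts from the table):
print(pairwise_rates(54, 34, 72))  # (0.3375, 0.2125, 0.5625)
```

The same arithmetic reproduces the rates in the v0.2 table below.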

Mistral 7B Instruct v0.2

| Model | Function Calling | MMLU | GPQA | GSM-8K | MATH | MT-bench | Pairwise Win | Pairwise Loss | Pairwise Tie | Win Rate | Loss Rate | Adjusted Win Rate |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Mistral 7B Instruct v0.2 | - | 59.27 | 27.68 | 43.21 | 10.30 | 7.50 | 34 | 54 | 72 | 0.2125 | 0.3375 | 0.4375 |
| Rubra Enhanced Mistral 7B Instruct v0.2 | 69.28% | 58.90 | 29.91 | 34.12 | 8.36 | 7.36 | 59 | 43 | 58 | 0.36875 | 0.26875 | 0.55 |

The Pairwise columns report the MT-bench pairwise comparison between the two models above.