Mixtral 8x22B: Cheaper, Better, Faster, Stronger
mistral.ai
Mixtral 8x22B is our latest open model. It sets a new standard for performance and efficiency within the AI community. It is a sparse Mixture-of-Experts (SMoE) model that uses only 39B active parameters out of 141B, offering unparalleled cost efficiency for its size.
Mixtral 8x22B comes with the following…
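To make the sparse Mixture-of-Experts idea concrete, here is a minimal PyTorch sketch of top-2 expert routing, the mechanism that lets a model hold many parameters while activating only a fraction of them per token (Mixtral routes each token to 2 of 8 experts per layer). The layer sizes, expert architecture, and all names below are illustrative assumptions, not Mixtral's actual implementation.

```python
# A minimal sketch of sparse Mixture-of-Experts (SMoE) routing with top-2
# gating. All dimensions, names, and the expert design are illustrative
# assumptions, not Mixtral's real code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, dim=512, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router: scores each token against every expert.
        self.gate = nn.Linear(dim, num_experts, bias=False)
        # Experts: simple feed-forward blocks; only top_k run per token.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.SiLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (tokens, dim)
        scores = self.gate(x)                           # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # keep the top-2 experts
        weights = F.softmax(weights, dim=-1)            # normalize over the chosen 2
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            # Find which tokens routed any of their top-k slots to expert e.
            token_ids, slot = (idx == e).nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue
            out[token_ids] += weights[token_ids, slot, None] * expert(x[token_ids])
        return out

moe = SparseMoE()
tokens = torch.randn(4, 512)
print(moe(tokens).shape)  # torch.Size([4, 512]); only 2 of 8 experts ran per token
```

Because each token passes through only 2 of the 8 experts, the per-token compute scales with the active parameters rather than the total parameter count, which is the source of the 39B-active / 141B-total efficiency claim above.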
17 April 2024 · Alex