Mixtral 8x7B Instruct v0.1
A sparse Mixture-of-Experts (MoE) instruction-tuned language model by Mistral AI. Each layer routes every token to 2 of 8 expert feed-forward blocks, so only a fraction of the total parameters (roughly 13B of 47B) is active per token, giving dense-model quality at lower inference cost for chat and task-oriented generation.
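As a minimal usage sketch, the snippet below loads the public instruct checkpoint with the Hugging Face Transformers library and generates one chat reply. The prompt and generation settings are illustrative; running the unquantized model requires substantial GPU memory.

```python
# Minimal chat sketch, assuming the `transformers` and `accelerate` packages
# are installed and the hardware can hold the checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shard across available devices (needs accelerate)
    torch_dtype="auto",  # use the checkpoint's native dtype
)

# Format the conversation with the model's built-in chat template.
messages = [{"role": "user", "content": "Explain mixture-of-experts in one sentence."}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

# Generate a reply and decode only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```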