Mixtral 8x7B Instruct v0.1

A powerful sparse Mixture-of-Experts (MoE) instruction-tuned language model by Mistral AI, combining efficiency and performance for chat and task-oriented generation.
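
A minimal sketch, assuming the Hugging Face transformers library and the repo id mistralai/Mixtral-8x7B-Instruct-v0.1, of how an instruction-tuned Mixtral checkpoint is typically queried for chat; the dtype and generation settings are illustrative assumptions, not details from the listing above.

```python
# Sketch: chat with Mixtral 8x7B Instruct via Hugging Face transformers.
# The repo id, dtype, and device settings below are assumptions; adjust
# them for your hardware (the full model needs substantial GPU memory).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce memory use
    device_map="auto",          # spread the expert layers across available GPUs
)

# Instruction-tuned checkpoints expect the chat template, not raw prompts.
messages = [{"role": "user", "content": "Draft a short status update for a delayed project."}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=200, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```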


OpenAI o1

An advanced language model from OpenAI that reasons through problems step by step before answering, delivering strong results on math, coding, and science benchmarks.
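
A minimal sketch, assuming the OpenAI Python SDK (openai>=1.0) and the model identifier "o1", of a plain chat-completion call; the exact identifier available to an account may differ (for example, o1-mini), so treat the name as an assumption.

```python
# Sketch: query an o1-series model through the OpenAI Python SDK.
# The model name "o1" is an assumption; check your account's model list.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="o1",
    messages=[
        {"role": "user", "content": "Explain why 0.1 + 0.2 != 0.3 in binary floating point."}
    ],
)

print(response.choices[0].message.content)
```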


QwQ-32B

A 32-billion-parameter large language model from the Qwen team, focused on strong reasoning alongside high-quality instruction following and multilingual chat.
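
A minimal sketch, assuming a recent transformers release and the repo id Qwen/QwQ-32B, of running the model locally with the text-generation pipeline; the repo id and settings are assumptions, and a 32B model generally needs multi-GPU or quantized inference.

```python
# Sketch: run QwQ-32B locally with the transformers text-generation pipeline.
# The repo id and settings are assumptions; adjust for your hardware.
import torch
from transformers import pipeline

chat = pipeline(
    "text-generation",
    model="Qwen/QwQ-32B",       # assumed Hugging Face repo id
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [{"role": "user", "content": "How many positive divisors does 360 have?"}]
result = chat(messages, max_new_tokens=512)
print(result[0]["generated_text"][-1]["content"])  # last turn holds the model's reply
```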
