LLaMA 2 7B Chat (Hugging Face)
Meta’s 7-billion-parameter instruction-tuned model, optimized for dialogue and assistant-style chat applications.
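Since this entry points at Hugging Face, a minimal loading-and-generation sketch with the transformers library follows. The Hub repo id "meta-llama/Llama-2-7b-chat-hf" (a gated repo requiring acceptance of Meta’s license), the use of device_map="auto" (which needs the accelerate package), and the prompt text are assumptions layered on top of the catalog entry, not part of it.

```python
# A minimal sketch, assuming the gated Hub repo "meta-llama/Llama-2-7b-chat-hf"
# and an environment with transformers, torch, and accelerate installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Format the conversation with the model's built-in chat template.
messages = [{"role": "user", "content": "Explain what an instruction-tuned model is."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```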
Llama 3 8B (Hugging Face)
Meta’s 8-billion-parameter open-weight language model, the successor to Llama 2, optimized for reasoning and general-purpose tasks.
Mixtral Instruct (Hugging Face)
A sparse Mixture-of-Experts (MoE) instruction-tuned language model from Mistral AI that activates only a subset of its expert parameters per token, pairing efficient inference with strong performance for chat and task-oriented generation.
Qwen 32B Chat (Hugging Face)
A 32-billion-parameter large language model from the Qwen team at Alibaba Cloud, designed for high-quality instruction following and multilingual chat.
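The remaining entries load through the same transformers interface; the sketch below is one way to wrap them. Every Hub repo id in ASSUMED_REPO_IDS is an assumption, since the catalog entries do not pin exact model variants, and passing chat-style message lists to the pipeline requires a recent transformers release.

```python
# A minimal sketch; every repo id below is an assumption, as the catalog
# does not name exact checkpoints or variants.
from transformers import pipeline

ASSUMED_REPO_IDS = {
    "Llama 3 8B": "meta-llama/Meta-Llama-3-8B-Instruct",
    "Mixtral Instruct": "mistralai/Mixtral-8x7B-Instruct-v0.1",
    "Qwen 32B Chat": "Qwen/Qwen1.5-32B-Chat",
}

def chat_once(repo_id: str, prompt: str) -> str:
    """Run one user turn through a chat model and return the assistant reply."""
    pipe = pipeline("text-generation", model=repo_id, device_map="auto")
    messages = [{"role": "user", "content": prompt}]
    # Recent transformers pipelines accept chat-style message lists and
    # return the full conversation; the last message is the assistant reply.
    result = pipe(messages, max_new_tokens=128)
    return result[0]["generated_text"][-1]["content"]

# Example: query the (assumed) Llama 3 8B Instruct checkpoint.
# print(chat_once(ASSUMED_REPO_IDS["Llama 3 8B"], "What is a Mixture of Experts?"))
```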