Meta Llama 3 8B
A next-generation 8-billion-parameter open-weight language model from Meta, optimized for reasoning and general-purpose tasks.
A sparse Mixture-of-Experts (MoE) instruction-tuned language model from Mistral AI, balancing inference efficiency with strong performance on chat and task-oriented generation.
A vision-language model from Google that pairs a Gemma language model with a SigLIP vision encoder for image captioning, visual question answering (VQA), and image-text reasoning tasks.
A text-to-image diffusion model from Stability AI with improved prompt alignment, style diversity, and compositional handling of multi-object scenes.