Insights

GroqCloud now supports Low-Rank Adaptation (LoRA) fine-tunes, available exclusively by request to our Enterprise tier customers. LoRA enables businesses to deploy adaptations of base models tailored to their specific use cases on GroqCloud, offering a more efficient and cost-effective approach to model customization. As part of this release, we are...
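For context on what a LoRA fine-tune is, the sketch below shows the core idea: a frozen base weight is augmented with a small, trainable low-rank update. This is a minimal PyTorch illustration of the technique itself; the class, dimensions, and hyperparameters are our own placeholders and are not part of any GroqCloud API.

```python
# Minimal sketch of how a LoRA adapter modifies a frozen linear layer.
# Names (LoRALinear, rank, alpha) are illustrative, not a GroqCloud interface.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base                      # frozen pretrained weight W
        for p in self.base.parameters():
            p.requires_grad = False
        # Low-rank factors A (d_in x r) and B (r x d_out); only these are trained.
        self.A = nn.Parameter(torch.randn(base.in_features, rank) * 0.01)
        self.B = nn.Parameter(torch.zeros(rank, base.out_features))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Base path plus scaled low-rank update: y = base(x) + scale * (xA)B
        return self.base(x) + (x @ self.A @ self.B) * self.scale

# Example: wrap a 512->512 projection; only the small A and B factors are trainable.
layer = LoRALinear(nn.Linear(512, 512), rank=8)
out = layer(torch.randn(4, 512))
```

Because only the low-rank factors change per customer, many adaptations of the same base model can be served far more cheaply than full fine-tunes.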

You know Groq runs small models. But did you know we also run large models, including Mixture of Experts (MoE) models, uniquely well? Here's why. The Evolution of Advanced Openly-Available LLMs: There's no argument that artificial intelligence (AI) has exploded, in part because of advancements in large language models (LLMs). These models have shown...
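To make "MoE" concrete, here is a toy illustration of what a Mixture of Experts layer does: a router scores each token and dispatches it to only the top-k expert networks, so a very large model activates only a fraction of its weights per token. The sizes, expert count, and names below are ours and do not describe any specific Groq-hosted model.

```python
# Toy top-k MoE layer: a router picks k experts per token; only those experts run.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    def __init__(self, d_model: int = 64, n_experts: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:   # x: (tokens, d_model)
        gate = F.softmax(self.router(x), dim=-1)
        weights, idx = gate.topk(self.k, dim=-1)           # top-k experts per token
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                   # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

moe = TinyMoE()
y = moe(torch.randn(10, 64))   # only 2 of the 8 expert MLPs run for each token
```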

Simplifying the Complexity of AI Agents with Server-Side Tool Use
Large Language Models (LLMs) are powerful but constrained by static training data, lacking the ability to access real-time information or interact dynamically with external environments. We need real-time data, not snapshots from 2023. This limitation has driven the adoption of...
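For a sense of the complexity being simplified, here is a hedged sketch of the client-side tool-calling loop that server-side tool use takes off the application's plate: the model returns a tool call, the app executes it, and the result is sent back for a final answer. It assumes the OpenAI-compatible chat completions interface of the groq Python client; the model name, tool schema, and get_weather() helper are placeholders.

```python
# Client-side tool-calling loop (the pattern server-side tool use aims to remove):
# the model asks for a tool, the application runs it, and the result goes back.
import json
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

def get_weather(city: str) -> str:
    # Stand-in for a real weather API call; purely illustrative.
    return f"Sunny and 24°C in {city}"

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather in Toronto right now?"}]
MODEL = "llama-3.3-70b-versatile"  # placeholder model name

resp = client.chat.completions.create(model=MODEL, messages=messages, tools=tools)
call = resp.choices[0].message.tool_calls[0]            # model requests the tool
result = get_weather(**json.loads(call.function.arguments))
messages += [resp.choices[0].message,                   # echo the assistant turn
             {"role": "tool", "tool_call_id": call.id, "content": result}]
final = client.chat.completions.create(model=MODEL, messages=messages, tools=tools)
print(final.choices[0].message.content)
```

With server-side tool use, that execute-and-resubmit round trip happens inside the platform, so the application sends one request and receives a grounded answer.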