Groq · Free · by AI Labs
v1.0.0 · Added Jan 30, 2025
Tags: groq, inference, fast
Works with: Claude, GPT, Gemini, Copilot

Groq MCP Server

Ultra-fast LLM inference with LPU chips.

Features

  • Chat completions
  • Low latency
  • Llama models
  • Mixtral
  • Streaming
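The chat-completions and streaming features are backed by Groq's OpenAI-compatible HTTP API, which the MCP server wraps as tools. As a minimal sketch of the request shape involved (the endpoint path is Groq's documented OpenAI-compatible route; the model name is illustrative only and should be checked against Groq's current model list):

```python
import json

# Groq's OpenAI-compatible chat-completions endpoint.
GROQ_CHAT_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_chat_request(api_key: str, model: str, prompt: str, stream: bool = False):
    """Assemble the URL, headers, and JSON body for a Groq chat completion.

    Posting this (e.g. with requests.post) returns a single JSON completion,
    or, with stream=True, a stream of server-sent-event delta chunks.
    """
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,  # e.g. "llama-3.1-8b-instant" -- illustrative, not from this listing
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,  # stream=True is what enables low-latency token streaming
    }
    return GROQ_CHAT_URL, headers, json.dumps(payload)

url, headers, body = build_chat_request("gsk_your_key", "llama-3.1-8b-instant", "Hello", stream=True)
```

With stream=True, the response arrives as `data:` lines (SSE), each carrying a partial delta, which is how clients surface tokens with low latency rather than waiting for the full completion.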

Installation

Add the following entry to your MCP client's configuration file:

{
  "mcpServers": {
    "groq": {
      "command": "npx",
      "args": ["-y", "mcp-groq"]
    }
  }
}
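The listing does not say how credentials are supplied, but the server will almost certainly need a Groq API key. Assuming the package reads a GROQ_API_KEY environment variable (an assumption, not confirmed by this listing), the configuration would look like:

```json
{
  "mcpServers": {
    "groq": {
      "command": "npx",
      "args": ["-y", "mcp-groq"],
      "env": {
        "GROQ_API_KEY": "gsk_your_key"
      }
    }
  }
}
```

Check the package's own documentation for the actual variable name before relying on this.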
