# Use Cursor with WaveSpeed LLM: Custom API Endpoint Guide

Point Cursor at WaveSpeed LLM for Claude Opus 4.6, GPT-5.2, Gemini 3, and 290+ other models. Full configuration walkthrough — endpoint, API key, model ID.

## Your Cursor, Your Models
Cursor ships with its own hosted models by default, but every serious Cursor user eventually wants to bring their own keys — to save money, to unlock a specific model, or to route requests through an internal gateway. WaveSpeed LLM solves all three at once: one OpenAI-compatible endpoint, 290+ models, per-token billing, no subscription lock-in.
This guide walks through the exact Cursor settings you need.
## What You’ll Configure
| Field | Value |
|---|---|
| Provider type | OpenAI (OpenAI-compatible) |
| Base URL / Override | https://llm.wavespeed.ai/v1 |
| API Key | Your WaveSpeed API key |
| Model ID | vendor/model — e.g. anthropic/claude-opus-4.6 |
## Step 1: Get Your WaveSpeed API Key
- Sign in at wavespeed.ai.
- Open Dashboard → API Keys.
- Create and copy a new key.
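To confirm the key works before touching Cursor at all, you can hit the endpoint directly. The sketch below builds an authenticated request against the model-list route; `/v1/models` is an assumption based on the endpoint being OpenAI-compatible, and the key shown is a placeholder.

```python
import urllib.request

# Placeholder key for illustration; use your real key from Dashboard → API Keys.
API_KEY = "ws-xxxxxxxx"
BASE_URL = "https://llm.wavespeed.ai/v1"

# An authenticated GET against the OpenAI-compatible model list —
# roughly what Cursor's "Verify" button does with your settings.
req = urllib.request.Request(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)

print(req.full_url)                     # https://llm.wavespeed.ai/v1/models
print(req.get_header("Authorization"))  # Bearer ws-xxxxxxxx

# To actually send it:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)  # 200 if the key is valid
```

If that request returns 200 with a JSON list of models, the key is good and any remaining problems are on the Cursor side.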
## Step 2: Open Cursor Model Settings
- Launch Cursor.
- Open Settings (Cmd/Ctrl + Shift + J).
- Go to Models.
You’ll see a list of default Cursor-hosted models and a section for custom providers.
## Step 3: Add WaveSpeed as a Custom OpenAI Endpoint

1. Scroll to OpenAI API Key (or equivalent custom provider section).
2. Paste your WaveSpeed API key into the OpenAI API Key field.
3. Click Override OpenAI Base URL (or similar toggle).
4. Paste the base URL: `https://llm.wavespeed.ai/v1`
5. Click Verify / Save. Cursor will send a test request.

If the verification succeeds, your key and base URL are set correctly.
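For context on what Cursor sends once this is configured, here is a sketch of a chat-completions request body. The schema (`model`, `messages`, `stream`) is an assumption that follows from the endpoint being OpenAI-compatible; it is not a capture of Cursor's actual traffic.

```python
import json

BASE_URL = "https://llm.wavespeed.ai/v1"

# Minimal OpenAI-style chat-completions payload. Note the vendor-prefixed
# model ID — this is the string Cursor forwards verbatim to the endpoint.
payload = {
    "model": "anthropic/claude-opus-4.6",
    "messages": [{"role": "user", "content": "Explain this function."}],
    "stream": True,  # Cursor streams responses token by token
}

body = json.dumps(payload)
print(f"POST {BASE_URL}/chat/completions")
print(body)
```

The key takeaway: the Base URL override only changes *where* the request goes; the request shape stays standard OpenAI, which is why the rest of Cursor keeps working unchanged.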
## Step 4: Add WaveSpeed Model IDs

Cursor lets you add custom model names under the OpenAI provider. Add the WaveSpeed model IDs you want to use:

- anthropic/claude-opus-4.6
- anthropic/claude-sonnet-4.6
- openai/gpt-5.2-pro
- openai/gpt-5.2-chat
- google/gemini-3-flash-preview
- deepseek/deepseek-v4

Save. These will now appear in Cursor’s model picker in the chat sidebar and inline edit UI.
## Step 5: Use It
Open any project in Cursor, press Cmd/Ctrl + L to open the chat, and pick a WaveSpeed model from the dropdown. Ask it a question about your codebase. You should see a normal streaming response — Cursor is now routing through WaveSpeed.
Inline edits (Cmd/Ctrl + K) and tab completion will also use your selected model where supported.
## Recommended Models for Cursor

| Model ID | Strength |
|---|---|
| anthropic/claude-opus-4.6 | Best long-context reasoning and refactoring |
| anthropic/claude-sonnet-4.6 | Fast Claude for everyday edits |
| openai/gpt-5.2-pro | Strong reasoning alternative |
| openai/gpt-5.2-chat | Balanced speed/quality |
| deepseek/deepseek-v4 | Cheapest strong coder |
Try a few on the same task and pick your favorite.
## Troubleshooting

**“Invalid OpenAI API Key”**
The WaveSpeed API key goes into Cursor’s OpenAI API Key field. The Base URL override tells Cursor where to send it.
**“Model not found”**
Cursor sends whatever model string you selected directly to the endpoint. It must include the vendor prefix — anthropic/claude-opus-4.6, not claude-opus-4.6.
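The vendor-prefix rule is mechanical enough to check before you save a model name. A minimal sketch (the helper name is my own, not part of any Cursor or WaveSpeed API):

```python
def has_vendor_prefix(model_id: str) -> bool:
    """True if the model ID is vendor-prefixed, e.g. 'anthropic/claude-opus-4.6'."""
    vendor, sep, name = model_id.partition("/")
    return bool(vendor) and sep == "/" and bool(name)

print(has_vendor_prefix("anthropic/claude-opus-4.6"))  # True
print(has_vendor_prefix("claude-opus-4.6"))            # False: no vendor prefix
```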
**Verification fails with network error**
Double-check the Base URL is exactly https://llm.wavespeed.ai/v1 — trailing slashes, missing v1, or using api.wavespeed.ai will all fail.
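Those three failure modes can be caught with a quick lint of the URL string before you paste it. This is a hypothetical helper for sanity-checking only; the one correct value remains exactly `https://llm.wavespeed.ai/v1`.

```python
def check_base_url(url: str) -> list[str]:
    """Return a list of problems with a candidate Base URL (empty list = OK)."""
    problems = []
    if url != url.rstrip("/"):
        problems.append("trailing slash")
    if not url.rstrip("/").endswith("/v1"):
        problems.append("missing /v1 suffix")
    if "api.wavespeed.ai" in url:
        problems.append("wrong host: use llm.wavespeed.ai")
    return problems

print(check_base_url("https://llm.wavespeed.ai/v1"))   # []
print(check_base_url("https://llm.wavespeed.ai/v1/"))  # ['trailing slash']
print(check_base_url("https://api.wavespeed.ai/v1"))   # ['wrong host: use llm.wavespeed.ai']
```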
**Some Cursor features don’t work with a custom endpoint**
A few advanced Cursor features (like Cursor’s own indexing-based features) rely on Cursor-hosted models and won’t use your custom provider. Chat, inline edits, and completions all work normally.
## Why This Setup Is Worth It

- Save money on heavy days. Run big refactors on openai/gpt-5.2-pro through WaveSpeed per-token pricing instead of burning through a fixed Cursor subscription quota.
- Access models Cursor doesn’t host. Qwen 3, DeepSeek V4, Llama 4, Grok 4 — anything in the WaveSpeed catalog.
- One bill. Consolidate your Cursor usage, your API scripts, and your chat apps onto a single WaveSpeed account.
## Start Coding Today
Two fields — Base URL and API Key — and you’re running Cursor on 290+ models.
Get your WaveSpeed API key and point Cursor at https://llm.wavespeed.ai/v1 in minutes.