You can configure a model to use [extended thinking](https://docs.anthropic.com/en/docs/about-claude/models/extended-thinking-models) (if it supports it) by changing the mode in your model's configuration to `thinking`, for example:

```json
{
  "name": "claude-sonnet-4-latest",
  "display_name": "claude-sonnet-4-thinking",
  "max_tokens": 200000,
  "mode": {
    "type": "thinking",
    "budget_tokens": 4_096
  }
}
```
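
In a full `settings.json`, a model entry like this would sit inside the provider's `available_models` list, using the same `language_models` nesting shown for the other providers on this page. A minimal sketch combining the snippet above with that structure:

```json
{
  "language_models": {
    "anthropic": {
      "available_models": [
        {
          "name": "claude-sonnet-4-latest",
          "display_name": "claude-sonnet-4-thinking",
          "max_tokens": 200000,
          "mode": {
            "type": "thinking",
            "budget_tokens": 4096
          }
        }
      ]
    }
  }
}
```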

### DeepSeek {#deepseek}

> ✅ Supports tool use

1. Visit the DeepSeek platform and [create an API key](https://platform.deepseek.com/api_keys)
2. Open the settings view (`agent: open configuration`) and go to the DeepSeek section
3. Enter your DeepSeek API key

The DeepSeek API key will be saved in your keychain.

Zed will also use the `DEEPSEEK_API_KEY` environment variable if it's defined.

#### Custom Models {#deepseek-custom-models}

The Zed assistant comes pre-configured to use the latest version of common models (DeepSeek Chat, DeepSeek Reasoner). If you wish to use alternate models or customize the API endpoint, you can do so by adding the following to your Zed `settings.json`:

```json
{
  "language_models": {
    "deepseek": {
      "api_url": "https://api.deepseek.com",
      "available_models": [
        {
          "name": "deepseek-chat",
          "display_name": "DeepSeek Chat",
          "max_tokens": 64000
        },
        {
          "name": "deepseek-reasoner",
          "display_name": "DeepSeek Reasoner",
          "max_tokens": 64000,
          "max_output_tokens": 4096
        }
      ]
    }
  }
}
```

Custom models will be listed in the model dropdown in the Agent Panel. You can also modify the `api_url` to use a custom endpoint if needed.
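
For example, here is a sketch that points DeepSeek requests at a hypothetical proxy (the URL below is a placeholder, not a real endpoint):

```json
{
  "language_models": {
    "deepseek": {
      "api_url": "https://deepseek-proxy.internal.example.com"
    }
  }
}
```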

### GitHub Copilot Chat {#github-copilot-chat}

> ✅ Supports tool use in some cases.
> Visit [the Copilot Chat code](https://github.com/zed-industries/zed/blob/9e0330ba7d848755c9734bf456c716bddf0973f3/crates/language_models/src/provider/copilot_chat.rs#L189-L198) for the supported subset.

You can use GitHub Copilot Chat with the Zed assistant by choosing it via the model dropdown in the Agent Panel.

### Google AI {#google-ai}

> ✅ Supports tool use

You can use Gemini 1.5 Pro/Flash with the Zed assistant by choosing it via the model dropdown in the Agent Panel.

1. Go to the Google AI Studio site and [create an API key](https://aistudio.google.com/app/apikey).
2. Open the settings view (`agent: open configuration`) and go to the Google AI section
3. Enter your Google AI API key and press enter.

The Google AI API key will be saved in your keychain.

Zed will also use the `GOOGLE_AI_API_KEY` environment variable if it's defined.

#### Custom Models {#google-ai-custom-models}

By default, Zed uses the `stable` versions of models, but you can also use specific versions, including [experimental models](https://ai.google.dev/gemini-api/docs/models/experimental-models). You can configure a model to use [thinking mode](https://ai.google.dev/gemini-api/docs/thinking) (if it supports it) by adding a `mode` configuration to your model. This is useful for controlling reasoning token usage and response speed. If not specified, Gemini will automatically choose the thinking budget.

Here is an example of a custom Google AI model you could add to your Zed `settings.json`:

```json
{
  "language_models": {
    "google": {
      "available_models": [
        {
          "name": "gemini-2.5-flash-preview-05-20",
          "display_name": "Gemini 2.5 Flash (Thinking)",
          "max_tokens": 1000000,
          "mode": {
            "type": "thinking",
            "budget_tokens": 24000
          }
        }
      ]
    }
  }
}
```

Custom models will be listed in the model dropdown in the Agent Panel.

### LM Studio {#lmstudio}

> ✅ Supports tool use

1. Download and install the latest version of LM Studio from https://lmstudio.ai/download
2. In the app, press ⌘/Ctrl + Shift + M and download at least one model (e.g., qwen2.5-coder-7b)

   You can also get models via the LM Studio CLI:
