### OpenRouter {#openrouter}

OpenRouter provides access to multiple AI models through a single API. It supports tool use for compatible models.

1. Visit [OpenRouter](https://openrouter.ai) and create an account
2. Generate an API key from your [OpenRouter keys page](https://openrouter.ai/keys)
3. Open the settings view (`agent: open configuration`) and go to the OpenRouter section
4. Enter your OpenRouter API key

The OpenRouter API key will be saved in your keychain.

Zed will also use the `OPENROUTER_API_KEY` environment variable if it's defined.
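
If you also want specific OpenRouter models to appear in Zed's model picker, you can list them under `language_models` in your `settings.json`. The following is a minimal sketch, assuming the provider key is `open_router` and that model entries take the same shape as the other providers shown in this document; the model name and token limit are illustrative:

```json
{
  "language_models": {
    "open_router": {
      // Assumed provider key and fields; adjust to match your Zed version.
      "api_url": "https://openrouter.ai/api/v1",
      "available_models": [
        {
          "name": "anthropic/claude-3.5-sonnet",
          "display_name": "Claude 3.5 Sonnet (OpenRouter)",
          "max_tokens": 200000
        }
      ]
    }
  }
}
```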

### OpenAI API Compatible {#openai-api-compatible}

Zed supports OpenAI-compatible APIs by letting you specify a custom `api_url` and `available_models` for the OpenAI provider.

#### X.ai Grok

Example configuration for using X.ai Grok with Zed:

```json
  "language_models": {
    "openai": {
      "api_url": "https://api.x.ai/v1",
      "available_models": [
        {
          "name": "grok-beta",
          "display_name": "X.ai Grok (Beta)",
          "max_tokens": 131072
        }
      ],
      "version": "1"
    },
  }
```

## Advanced Configuration {#advanced-configuration}

### Custom Provider Endpoints {#custom-provider-endpoint}

You can use a custom API endpoint for different providers, as long as it's compatible with the provider's API structure.
To do so, add the following to your `settings.json`:

```json
{
  "language_models": {
    "some-provider": {
      "api_url": "http://localhost:11434"
    }
  }
}
```

Where `some-provider` can be any of the following values: `anthropic`, `google`, `ollama`, `openai`.
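
For instance, to point the Ollama provider at a non-default host, a sketch might look like this (the hostname is illustrative):

```json
{
  "language_models": {
    "ollama": {
      "api_url": "http://my-ollama-host:11434"
    }
  }
}
```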

### Default Model {#default-model}

Zed's hosted LLM service sets `claude-sonnet-4` as the default model.
However, you can change it either via the model dropdown in the Agent Panel's bottom-right corner or by manually editing the `default_model` object in your settings:

```json
{
  "agent": {
    "version": "2",
    "default_model": {
      "provider": "zed.dev",
      "model": "gpt-4o"
    }
  }
}
```

### Feature-specific Models {#feature-specific-models}

If a feature-specific model is not set, Zed falls back to the default model, which is the one you set in the Agent Panel.

You can configure the following feature-specific models:

- Thread summary model: Used for generating thread summaries
- Inline assistant model: Used for the inline assistant feature
- Commit message model: Used for generating Git commit messages

Example configuration:

```json
{
  "agent": {
    "version": "2",
    "default_model": {
      "provider": "zed.dev",
      "model": "claude-sonnet-4"
    },
    "inline_assistant_model": {
      "provider": "anthropic",
      "model": "claude-3-5-sonnet"
    },
    "commit_message_model": {
      "provider": "openai",
      "model": "gpt-4o-mini"
    },
    "thread_summary_model": {
      "provider": "google",
      "model": "gemini-2.0-flash"
    }
  }
}
```

### Alternative Models for Inline Assists {#alternative-assists}

You can configure additional models that will be used to perform inline assists in parallel.
When you do this, the inline assist UI will surface controls to cycle between the alternatives generated by each model.

The models you specify here are always used in _addition_ to your [default model](#default-model).
For example, the following configuration generates two outputs for every assist: one with Claude Sonnet 4 and one with GPT-4o.

```json
{
  "agent": {
    "default_model": {
      "provider": "zed.dev",
      "model": "claude-sonnet-4"
    },
    "inline_alternatives": [
      {
        "provider": "zed.dev",
        "model": "gpt-4o"
      }
    ],
    "version": "2"
  }
}
```

### Default View {#default-view}

Use the `default_view` setting to change the default view of the Agent Panel.
You can choose between `thread` (the default) and `text_thread`:

```json
{
  "agent": {
    "default_view": "text_thread".
  }
}
```
