The `supports_images` option enables the model's vision capabilities, allowing it to process images included in the conversation context.
If a model is tagged with `vision` in the Ollama catalog, set this option to use it with images in Zed.
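As a sketch, a vision-capable Ollama model might be declared like this in your Zed `settings.json` (the model name and token limit here are illustrative examples, not defaults):

```json
{
  "language_models": {
    "ollama": {
      "available_models": [
        {
          "name": "llama3.2-vision",
          "display_name": "Llama 3.2 Vision",
          "max_tokens": 16384,
          "supports_images": true
        }
      ]
    }
  }
}
```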

### OpenAI {#openai}

> ✅ Supports tool use

1. Visit the OpenAI platform and [create an API key](https://platform.openai.com/account/api-keys)
2. Make sure that your OpenAI account has credits
3. Open the settings view (`agent: open configuration`) and go to the OpenAI section
4. Enter your OpenAI API key

The OpenAI API key will be saved in your keychain.

Zed will also use the `OPENAI_API_KEY` environment variable if it's defined.
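For example, you can export the variable in your shell before launching Zed (the key value below is a placeholder; substitute your real key):

```shell
# Placeholder value for illustration; substitute your real key.
export OPENAI_API_KEY="your-api-key-here"

# Zed inherits the variable when launched from this shell, e.g.:
#   zed .
```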

#### Custom Models {#openai-custom-models}

The Zed Assistant comes pre-configured with the latest versions of common models (GPT-3.5 Turbo, GPT-4, GPT-4 Turbo, GPT-4o, GPT-4o mini).
To use alternate models (such as a preview release or a dated model snapshot), or to control the request parameters, add the following to your Zed `settings.json`:

```json
{
  "language_models": {
    "openai": {
      "available_models": [
        {
          "name": "gpt-4o-2024-08-06",
          "display_name": "GPT 4o Summer 2024",
          "max_tokens": 128000
        },
        {
          "name": "o1-mini",
          "display_name": "o1-mini",
          "max_tokens": 128000,
          "max_completion_tokens": 20000
        }
      ],
      "version": "1"
    }
  }
}
```

Set `max_tokens` to the model's context window size; this can be found in the [OpenAI model documentation](https://platform.openai.com/docs/models).
OpenAI `o1` models should also set `max_completion_tokens` to avoid incurring high reasoning token costs.
Custom models will be listed in the model dropdown in the Agent Panel.
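Before restarting Zed, it can help to sanity-check a custom-model fragment by parsing it and confirming the required fields are present. This is just an illustrative script, not part of Zed:

```python
import json

# The same fragment as above; in practice you would read your settings.json.
fragment = """
{
  "language_models": {
    "openai": {
      "available_models": [
        {"name": "gpt-4o-2024-08-06", "display_name": "GPT 4o Summer 2024", "max_tokens": 128000},
        {"name": "o1-mini", "display_name": "o1-mini", "max_tokens": 128000, "max_completion_tokens": 20000}
      ],
      "version": "1"
    }
  }
}
"""

models = json.loads(fragment)["language_models"]["openai"]["available_models"]
for m in models:
    # Every entry needs a model name and its context window.
    assert "name" in m and "max_tokens" in m
    # o1-family models should also cap completion tokens.
    if m["name"].startswith("o1"):
        assert "max_completion_tokens" in m
print([m["name"] for m in models])
```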

### OpenRouter {#openrouter}

> ✅ Supports tool use

OpenRouter provides access to multiple AI models through a single API. It supports tool use for compatible models.

1. Visit [OpenRouter](https://openrouter.ai) and create an account
2. Generate an API key from your [OpenRouter keys page](https://openrouter.ai/keys)
3. Open the settings view (`agent: open configuration`) and go to the OpenRouter section
4. Enter your OpenRouter API key

The OpenRouter API key will be saved in your keychain.

Zed will also use the `OPENROUTER_API_KEY` environment variable if it's defined.
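If you want a model that is not listed by default, OpenRouter models can be declared in `settings.json` following the same pattern as the OpenAI section. A sketch (the provider key and the model slug below are assumptions; check Zed's settings reference for the exact key):

```json
{
  "language_models": {
    "open_router": {
      "available_models": [
        {
          "name": "mistralai/mistral-large",
          "display_name": "Mistral Large",
          "max_tokens": 128000
        }
      ]
    }
  }
}
```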

### OpenAI API Compatible {#openai-api-compatible}

Zed supports OpenAI-compatible APIs: specify a custom `api_url` and `available_models` for the OpenAI provider.

#### X.ai Grok

Example configuration for using X.ai Grok with Zed:

```json
{
  "language_models": {
    "openai": {
      "api_url": "https://api.x.ai/v1",
      "available_models": [
        {
          "name": "grok-beta",
          "display_name": "X.ai Grok (Beta)",
          "max_tokens": 131072
        }
      ]
    }
  }
}
```
Title: Configuring OpenAI, Custom Models, OpenRouter, and OpenAI-Compatible APIs in Zed
Summary
This section details how to configure OpenAI, custom models, OpenRouter, and OpenAI-compatible APIs within Zed. It explains how to enable vision capabilities for Ollama models by setting `supports_images`, creating an OpenAI API key, and configuring custom OpenAI models with specific parameters like `max_tokens`. The guide covers setting up OpenRouter by creating an account and entering the API key in Zed. Finally, it explains how to use OpenAI-compatible APIs by specifying a custom endpoint and available models, using X.ai Grok as an example.