LobeHub Chat

Common Configuration Parameters

To connect an application to AI Gateway, copy the URL below, replace {APPLICATION_URL} and sk-xxxxxxxxxxxxxx with your own values, then open it in your browser.

https://{APPLICATION_URL}/?settings={
  "keyVaults": {
    "openai": {
      "apiKey": "sk-xxxxxxxxxxxxxx",
      "baseURL": "https://gateway.theturbo.ai"
    }
  },
  "languageModel": {
    "openai": {
      "autoFetchModelLists": true,
      "enabled": true,
      "enabledModels": [
        "gpt-4o",
        "gpt-4o-mini",
        "o1-preview",
        "o1-mini",
        "gpt-4o-2024-08-06",
        "gpt-4-turbo",
        "chatgpt-4o-latest",
        "claude-3-5-sonnet-20240620",
        "claude-3-haiku-20240307",
        "claude-3-opus-20240229",
        "claude-3-sonnet-20240229",
        "gemini-1.5-flash-latest",
        "gemini-1.5-pro-latest"
      ]
    },
    "ollama": {
      "enabled": false
    }
  },
  "check_updates": false
}
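If you prefer to generate the settings URL programmatically, the minimal sketch below builds it in Python and URL-encodes the JSON so it survives as a query parameter. The application URL and API key shown are placeholders, and whether encoding is strictly required may depend on your browser and on how the application parses the settings parameter.

```python
import json
from urllib.parse import quote

# Placeholder values: replace with your deployed application URL and your AI Gateway key.
application_url = "chat.example.com"
api_key = "sk-xxxxxxxxxxxxxx"

settings = {
    "keyVaults": {
        "openai": {
            "apiKey": api_key,
            "baseURL": "https://gateway.theturbo.ai",
        }
    },
    "languageModel": {
        "openai": {"autoFetchModelLists": True, "enabled": True},
        "ollama": {"enabled": False},
    },
}

# Serialize the settings object and percent-encode it for use as a query parameter.
encoded = quote(json.dumps(settings, separators=(",", ":")))
print(f"https://{application_url}/?settings={encoded}")
```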
| Parameter | Description |
| --- | --- |
| {APPLICATION_URL} | The access URL of the AI application you have deployed (for example, a chat UI or desktop web app). |
| key | The API key generated after creating an AI Gateway instance; it is used to authenticate all requests. |
| url | The base URL of your AI Gateway service endpoint, i.e. https://gateway.theturbo.ai. |
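To confirm that the key and url values work before wiring them into an application, you can send a test request directly to the gateway. This sketch assumes the gateway exposes an OpenAI-compatible chat completions endpoint at /v1/chat/completions and that gpt-4o-mini is among your enabled models; adjust the path and model to match your deployment.

```python
import json
import urllib.request

# Placeholder key; use the key generated for your AI Gateway instance.
API_KEY = "sk-xxxxxxxxxxxxxx"
BASE_URL = "https://gateway.theturbo.ai"

# Assumes an OpenAI-compatible chat completions endpoint.
payload = json.dumps({
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "ping"}],
}).encode("utf-8")

req = urllib.request.Request(
    f"{BASE_URL}/v1/chat/completions",
    data=payload,
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
```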


Note

You can view GitHub for more detailed configuration options.

Application Configuration – Language Model (OpenAI)

To manually configure the AI Gateway in the application:

  1. Navigate to Application Settings > Language Models > OpenAI.

  2. In the API Key field, enter your AI Gateway API key.

  3. In the Base URL field, enter https://gateway.theturbo.ai.

  4. Click Fetch Model List to retrieve the available models from AI Gateway.

  5. From the model list, select and enable the models you want to use.

Once these steps are completed, the application will route all model requests through the AI Gateway.
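As a quick sanity check that requests are routed through the gateway, you can reproduce what Fetch Model List does by querying the model listing yourself. The sketch below assumes an OpenAI-compatible /v1/models endpoint; the exact path may differ for your gateway deployment.

```python
import json
import urllib.request

# Placeholder key; substitute your AI Gateway key.
API_KEY = "sk-xxxxxxxxxxxxxx"
BASE_URL = "https://gateway.theturbo.ai"

# Assumes an OpenAI-compatible model listing endpoint, which is what a
# "Fetch Model List" style feature typically queries.
req = urllib.request.Request(
    f"{BASE_URL}/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
with urllib.request.urlopen(req) as resp:
    models = json.load(resp)

for model in models.get("data", []):
    print(model.get("id"))
```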
