AI Gateway Integration

AI Gateway can serve as a centralized backend for a wide range of AI-powered applications.

Instead of connecting applications directly to individual AI providers, AI Gateway acts as a unified access layer that routes requests, enforces usage policies, and provides observability.

This approach allows organizations to manage AI usage consistently across different tools and environments.

Prerequisites

Before you begin, make sure you have:

  • An AI Gateway API key, which is required to authenticate all requests

  • An application that supports configuring a custom API endpoint and API key

Authentication

All requests to AI Gateway require authentication using your API key.

Include the API key in the request header:

Authorization: Bearer YOUR_API_KEY
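For example, in Python the bearer token can be attached to an outgoing request like this (a minimal sketch using the standard library; the gateway URL is a placeholder, and `YOUR_API_KEY` must be replaced with your actual key):

```python
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder: substitute your actual AI Gateway key

# Attach the bearer token in the Authorization header of an outgoing request.
request = urllib.request.Request(
    "https://your-gateway.example.com/v1/models",  # hypothetical gateway URL
    headers={"Authorization": f"Bearer {API_KEY}"},
)

print(request.get_header("Authorization"))  # → Bearer YOUR_API_KEY
```

Requests without a valid `Authorization` header will be rejected by the gateway.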

Unified API Endpoint

AI Gateway exposes a unified API endpoint for all supported models.
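As a sketch, a chat request through the unified endpoint could be constructed as follows. The base URL, model name, and the OpenAI-style request shape are assumptions for illustration; consult your gateway's configuration for the actual values:

```python
import json
import urllib.request

BASE_URL = "https://your-gateway.example.com/v1"  # hypothetical unified endpoint
API_KEY = "YOUR_API_KEY"  # placeholder

# Build an OpenAI-style chat completion request
# (request shape assumed, not specified by this guide).
payload = json.dumps({
    "model": "gpt-4o",  # any model supported by the gateway
    "messages": [{"role": "user", "content": "Hello"}],
}).encode("utf-8")

request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=payload,
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Sending the request requires network access to a running gateway:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response))
```

Because every model is reachable through the same endpoint, switching models is a one-line change to the request body rather than a new integration.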

Supported Application Types

AI Gateway can be integrated with a wide range of third-party applications that support custom API endpoints and API keys, including:

AI Chat Applications

End-user chat interfaces for interacting with large language models.

  • ChatGPT-web-midjourney-proxy

  • NextChat

  • Lobe Chat

  • ChatGPT Friend (uTools Plugin)

  • Chatbox

  • Cherry Studio

  • ChatWise

AI-Powered IDEs

Development environments with built-in AI capabilities for coding assistance.

  • Cursor

AI Coding Assistants

Standalone AI tools designed specifically for code understanding, generation, and refactoring.

  • Claude Code

  • Continue

  • OpenAI Codex

  • OpenCode

AI Command-Line Tools

CLI-based tools that enable developers to interact with AI models from the terminal.

  • Gemini CLI

Internal Tools and Custom Applications

In-house tools or custom-built applications that require AI capabilities via a unified API backend.

Most of these applications require only minimal configuration—such as setting an API base URL and API key—to route all AI requests through AI Gateway.
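In most of the applications listed above, that configuration amounts to two fields (the exact field names vary by application, and the URL below is a placeholder):

```
API Base URL: https://your-gateway.example.com/v1
API Key:      YOUR_API_KEY
```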
