
OpenAI Compatibility

Cloudflare's AI Gateway offers an OpenAI-compatible /chat/completions endpoint, letting you reach multiple AI providers through a single URL. Because every provider is accessed with the same request format, you can switch between models without significant code modifications.

Key benefits

  • Standardization: Provides a unified format compatible with OpenAI's schema, reducing the need for code refactoring.
  • Ease of Development: Switch between different models and providers quickly using a consistent API structure.

Endpoint URL

https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions

Replace {account_id} and {gateway_id} with your Cloudflare account and gateway IDs.
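
For example, here is a minimal JavaScript sketch of building and calling this URL directly with fetch. The account ID, gateway ID, and the PROVIDER_API_KEY environment variable are placeholders you would replace with your own values; the request body mirrors the examples further below.

// Build the compat endpoint URL from your Cloudflare account and gateway IDs.
const accountId = "your_account_id"; // placeholder
const gatewayId = "your_gateway_id"; // placeholder
const url = `https://gateway.ai.cloudflare.com/v1/${accountId}/${gatewayId}/compat/chat/completions`;

// Send an OpenAI-style chat completion request to the gateway.
const res = await fetch(url, {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.PROVIDER_API_KEY}`, // placeholder: your provider's API key
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "openai/gpt-4o-mini",
    messages: [{ role: "user", content: "What is Cloudflare?" }],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);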

Using the Unified Interface

Switch providers by changing the model and apiKey parameters.

Model Parameter Format

Specify the model using {provider}/{model} format. For example:

  • openai/gpt-4o-mini
  • google-ai-studio/gemini-2.0-flash
  • anthropic/claude-3-haiku

Example with OpenAI JavaScript SDK

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "YOUR_PROVIDER_API_KEY", // Provider API key
  baseURL:
    "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions",
});

const response = await client.chat.completions.create({
  model: "google-ai-studio/gemini-2.0-flash",
  messages: [{ role: "user", content: "What is Cloudflare?" }],
});

console.log(response.choices[0].message.content);
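
Because the request shape stays the same regardless of provider, switching providers only requires changing the model string and the API key. The following is a minimal sketch, assuming a hypothetical ANTHROPIC_API_KEY environment variable holds your Anthropic key:

// Same gateway baseURL; only the key and the {provider}/{model} string change.
const anthropicClient = new OpenAI({
  apiKey: process.env.ANTHROPIC_API_KEY, // placeholder for the provider's API key
  baseURL:
    "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions",
});

const reply = await anthropicClient.chat.completions.create({
  model: "anthropic/claude-3-haiku",
  messages: [{ role: "user", content: "What is Cloudflare?" }],
});

console.log(reply.choices[0].message.content);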

curl Example

curl -X POST https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions \
  --header 'Authorization: Bearer {provider_api_key}' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "google-ai-studio/gemini-2.0-flash",
    "messages": [
      {
        "role": "user",
        "content": "What is Cloudflare?"
      }
    ]
  }'

Supported Providers

The OpenAI-compatible endpoint supports models from the following providers: