Provider Support

Supported providers, API formats, and configuration details.

Overview

Your app authenticates to the gateway using a QuilrAI API key. Provider credentials are configured in the dashboard and never exposed to clients.

Capability Matrix

| Provider | Chat | Embeddings | TTS | STT | Models |
| --- | --- | --- | --- | --- | --- |
| OpenAI | ✓ | ✓ | ✓ | ✓ | ✓ |
| Azure OpenAI | ✓ | ✓ | ✓ | ✓ | ✓ |
| Anthropic (Chat Completions) | ✓ | - | - | - | ✓ |
| DeepSeek | ✓ | - | - | - | ✓ |
| Gemini (Chat Completions) | ✓ | - | - | - | ✓ |
| General LLM | ✓ | - | - | - | ✓ |
| Anthropic (Messages) | ✓ | - | - | - | - |
| AWS Bedrock (Anthropic) | ✓ | - | - | - | - |
| Vertex AI | ✓ | - | - | - | - |

Chat Completions

Endpoint: /openai_compatible/v1/chat/completions
Auth: Authorization: Bearer sk-quilr-xxx

| Provider | Auth Mode | Required Fields | Optional Fields |
| --- | --- | --- | --- |
| OpenAI | API Key | api_key | - |
| Azure OpenAI | API Key | api_key, azure_endpoint | azure_api_version |
| Anthropic (OpenAI-compatible) | API Key | api_key | - |
| DeepSeek | API Key | api_key | - |
| Gemini (OpenAI-compatible) | API Key | api_key | - |
| General LLM (vLLM, Ollama, etc.) | API Key | api_key, base_url | - |
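For any provider in the table above, the client only sends the gateway path and a QuilrAI key; the provider credentials stay server-side. A minimal sketch using only the standard library, assuming a gateway host of https://api.quilr.ai (replace with your deployment's host) and an example model name:

```python
import json
import urllib.request

GATEWAY = "https://api.quilr.ai"  # hypothetical host -- use your deployment's URL

def build_chat_request(api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-compatible chat request to the gateway."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        GATEWAY + "/openai_compatible/v1/chat/completions",
        data=body,
        headers={
            # QuilrAI key, not the upstream provider's key
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("sk-quilr-xxx", "gpt-4o-mini",
                         [{"role": "user", "content": "Hello"}])
# urllib.request.urlopen(req) would send it; omitted here.
```

Any OpenAI-compatible client works the same way: point its base URL at the gateway's /openai_compatible/v1 prefix and pass the QuilrAI key as the API key.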

Anthropic Messages

Endpoint: /anthropic_messages/v1/messages
Auth: x-api-key: sk-quilr-xxx

| Provider | Auth Mode | Required Fields | Optional Fields |
| --- | --- | --- | --- |
| Anthropic (Native Messages API) | API Key | api_key | - |
| AWS Bedrock (Anthropic via Bedrock) | AWS Credentials | aws_access_key, aws_secret_key | aws_region, aws_session_token |

AWS Bedrock default region: us-east-1
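The native Messages endpoint uses x-api-key header auth rather than a Bearer token. A minimal sketch, assuming the same hypothetical gateway host as above and an example Claude model name (max_tokens is required by the Messages format):

```python
import json
import urllib.request

GATEWAY = "https://api.quilr.ai"  # hypothetical host -- use your deployment's URL

def build_messages_request(api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Build (but do not send) a native Anthropic Messages request to the gateway."""
    body = json.dumps({"model": model, "max_tokens": 256, "messages": messages}).encode()
    return urllib.request.Request(
        GATEWAY + "/anthropic_messages/v1/messages",
        data=body,
        headers={
            # QuilrAI key; the Anthropic or AWS credentials live in the dashboard
            "x-api-key": api_key,
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_messages_request("sk-quilr-xxx", "claude-3-5-sonnet-20241022",
                             [{"role": "user", "content": "Hello"}])
```

Whether the call is served by Anthropic directly or by Bedrock is decided by the key's provider configuration, not by anything in the request.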

Vertex AI

Endpoint: /vertex_ai/
Auth: Authorization: Bearer sk-quilr-xxx

Vertex AI supports multiple authentication modes. Select the mode when creating the key.

| Auth Mode | Required Fields | Optional Fields | Notes |
| --- | --- | --- | --- |
| API Key | api_key, gcp_project_id | gcp_region | Default region: us-central1 |
| Express | api_key | - | No project ID needed |
| Service Account | service_account_json | gcp_project_id, gcp_region | Project ID derived from JSON if omitted |
| ADC | gcp_project_id | gcp_region | Application Default Credentials from environment |
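Whatever auth mode the key uses, the client side looks the same: a Bearer token against the /vertex_ai/ prefix. This sketch assumes the path under /vertex_ai/ mirrors Vertex's own generateContent route; the host, project, region, and model names are all illustrative placeholders:

```python
import json
import urllib.request

GATEWAY = "https://api.quilr.ai"  # hypothetical host -- use your deployment's URL
# ASSUMPTION: the path under /vertex_ai/ follows Vertex AI's native route shape;
# "my-project", "us-central1", and the model name are placeholders.
PATH = ("/vertex_ai/v1/projects/my-project/locations/us-central1"
        "/publishers/google/models/gemini-1.5-flash:generateContent")

body = json.dumps({"contents": [{"role": "user", "parts": [{"text": "Hello"}]}]}).encode()
req = urllib.request.Request(
    GATEWAY + PATH,
    data=body,
    headers={
        # Gateway key; the GCP credentials (API key, service account, or ADC)
        # are resolved server-side from the key's configuration
        "Authorization": "Bearer sk-quilr-xxx",
        "Content-Type": "application/json",
    },
    method="POST",
)
```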

TTS & STT

Endpoints: /openai_compatible/v1/audio/speech and /openai_compatible/v1/audio/transcriptions

| Provider | TTS | STT | Auth Mode | Required Fields |
| --- | --- | --- | --- | --- |
| OpenAI | ✓ | ✓ | API Key | api_key |
| Azure OpenAI | ✓ | ✓ | API Key | api_key, azure_endpoint |

STT also supports /v1/audio/translations. Azure deployments use the /openai/deployments/{deployment}/ path prefix.
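A TTS call is a plain JSON POST to the speech endpoint. A minimal sketch, assuming the same hypothetical gateway host; the model ("tts-1") and voice ("alloy") are example values from the OpenAI-compatible format:

```python
import json
import urllib.request

GATEWAY = "https://api.quilr.ai"  # hypothetical host -- use your deployment's URL

# OpenAI-compatible text-to-speech body; the response would be audio bytes.
body = json.dumps({
    "model": "tts-1",            # example model name
    "voice": "alloy",            # example voice name
    "input": "Hello from the gateway",
}).encode()
req = urllib.request.Request(
    GATEWAY + "/openai_compatible/v1/audio/speech",
    data=body,
    headers={"Authorization": "Bearer sk-quilr-xxx",
             "Content-Type": "application/json"},
    method="POST",
)
# STT (/audio/transcriptions, /audio/translations) differs: it takes
# multipart/form-data with an audio file rather than a JSON body.
```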

Responses API

Coming Soon

Support for OpenAI's Responses API format is in development.

SDK

API Endpoint: /sdk/v1/check
Auth: Authorization: Bearer sk-quilr-xxx

The SDK provides guardrails-only scanning: no upstream LLM provider is needed. It checks text for PII, PHI, adversarial prompts, and custom intents without forwarding anything to a model.

Python

pip install quilrai

JavaScript

npm install quilrai
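Both client libraries wrap a plain HTTP call to the check endpoint. The request schema is not documented here, so the "text" field in this sketch is a guess at the payload shape, and the host is a placeholder:

```python
import json
import urllib.request

GATEWAY = "https://api.quilr.ai"  # hypothetical host -- use your deployment's URL

# NOTE: the /sdk/v1/check request schema isn't documented on this page;
# the "text" field below is a hypothetical payload shape.
body = json.dumps({"text": "My SSN is 123-45-6789"}).encode()
req = urllib.request.Request(
    GATEWAY + "/sdk/v1/check",
    data=body,
    headers={"Authorization": "Bearer sk-quilr-xxx",
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would return the guardrail verdict;
# no upstream model is ever called.
```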

LiteLLM Proxy Plugin

QuilrAI integrates as a plugin for LiteLLM's proxy gateway. Configure it in your LiteLLM proxy config to add guardrails to all LLM traffic.