Unified Chat API
The Unified Chat API provides a single OpenAI-compatible endpoint for all LLM providers, with support for Demeterics Tools such as Knowledge Engine RAG.
Base URL: https://api.demeterics.com/chat/v1
Overview
Instead of managing separate endpoints for each provider, use the unified /chat/v1 endpoint with provider prefixes in the model name:
# Same endpoint, different providers
curl -X POST https://api.demeterics.com/chat/v1/chat/completions \
-H "Authorization: Bearer dmt_your_api_key" \
-d '{"model": "groq/llama-3.3-70b-versatile", ...}'
curl -X POST https://api.demeterics.com/chat/v1/chat/completions \
-H "Authorization: Bearer dmt_your_api_key" \
-d '{"model": "anthropic/claude-sonnet-4-20250514", ...}'
curl -X POST https://api.demeterics.com/chat/v1/chat/completions \
-H "Authorization: Bearer dmt_your_api_key" \
-d '{"model": "openai/gpt-4o", ...}'
Supported Providers
| Provider | Model Prefix | Example |
|---|---|---|
| Groq | groq/ | groq/llama-3.3-70b-versatile |
| OpenAI | openai/ | openai/gpt-4o |
| Anthropic | anthropic/ | anthropic/claude-sonnet-4-20250514 |
| Google Gemini | google/ or gemini/ | google/gemini-2.5-flash |
| xAI Grok | xai/ | xai/grok-4-1-fast-non-reasoning |
| OpenRouter | openrouter/ | openrouter/deepseek/deepseek-chat |
Endpoints
| Endpoint | Method | Description |
|---|---|---|
| /chat/completions | POST | Chat completions (all providers) |
| /responses | POST | Responses API (all providers) |
| /models | GET | List available models |
| /health | GET | Health check |
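For example, /models can be queried with the OpenAI Python SDK, since it follows the standard GET /models shape. A small sketch; the returned IDs are presumably the provider-prefixed names accepted by /chat/completions, and exact response fields may vary:
from openai import OpenAI
client = OpenAI(
    base_url="https://api.demeterics.com/chat/v1",
    api_key="dmt_your_api_key"
)
# GET /models via the SDK; print the model IDs available to this key
for model in client.models.list():
    print(model.id)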
Basic Usage
Python (OpenAI SDK)
from openai import OpenAI
client = OpenAI(
base_url="https://api.demeterics.com/chat/v1",
api_key="dmt_your_api_key"
)
# Use ANY provider with the same code
response = client.chat.completions.create(
model="anthropic/claude-sonnet-4-20250514",
messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)
Node.js (OpenAI SDK)
import OpenAI from 'openai';
const client = new OpenAI({
baseURL: 'https://api.demeterics.com/chat/v1',
apiKey: 'dmt_your_api_key'
});
const response = await client.chat.completions.create({
model: 'groq/llama-3.3-70b-versatile',
messages: [{ role: 'user', content: 'Hello!' }]
});
cURL
curl -X POST https://api.demeterics.com/chat/v1/chat/completions \
-H "Authorization: Bearer dmt_your_api_key" \
-H "Content-Type: application/json" \
-d '{
"model": "google/gemini-2.5-flash",
"messages": [
{"role": "user", "content": "What is the capital of France?"}
]
}'
Demeterics Tools
The Unified Chat API supports Demeterics Tools — integrated capabilities that extend LLM functionality with retrieval, search, and other features.
Enable Tools with demeterics_tools
Add the demeterics_tools field to your request to enable specific tools:
{
"model": "groq/llama-3.3-70b-versatile",
"messages": [...],
"demeterics_tools": {
"knowledge": true
}
}
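Note that demeterics_tools is not part of the standard OpenAI request schema, so the official SDKs will not accept it as a regular keyword argument. With the Python SDK it goes through extra_body, as in this short sketch (the full examples below follow the same pattern):
from openai import OpenAI
client = OpenAI(
    base_url="https://api.demeterics.com/chat/v1",
    api_key="dmt_your_api_key"
)
# demeterics_tools is a custom field, so the SDK only accepts it via extra_body
response = client.chat.completions.create(
    model="groq/llama-3.3-70b-versatile",
    messages=[{"role": "user", "content": "What is our return policy?"}],
    extra_body={"demeterics_tools": {"knowledge": True}},
)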
Available Tools
| Tool | Status | Description |
|---|---|---|
| knowledge | Available | RAG with your Knowledge Engine project |
| tavily_search | Coming Soon | Web search via Tavily API |
| tavily_extract | Coming Soon | Web content extraction |
| weather | Coming Soon | Weather lookup via OpenWeatherMap |
Knowledge Tools
Feature Access Required: Knowledge Engine requires whitelisted access.
To request access, email sales@demeterics.com with:
- Subject: "Feature Access Request"
- Feature name: "Knowledge Engine"
Prerequisites
- Whitelisted account: Request Knowledge Engine access from sales@demeterics.com
- Knowledge Project: Create and populate a Knowledge Project with documents
- API Key Configuration: Attach the Knowledge Project to your API key in Settings
Attaching a Knowledge Project to an API Key
- Go to Settings → API Keys
- Select the API key you want to configure
- Under Knowledge Project, select your project from the dropdown
- Click Save
Basic Knowledge Request
Once configured, enable knowledge with "knowledge": true:
curl -X POST https://api.demeterics.com/chat/v1/chat/completions \
-H "Authorization: Bearer dmt_your_api_key" \
-H "Content-Type: application/json" \
-d '{
"model": "groq/llama-3.3-70b-versatile",
"messages": [
{"role": "user", "content": "What is our return policy?"}
],
"demeterics_tools": {
"knowledge": true
}
}'
Knowledge Configuration Options
Pass an object instead of true for advanced configuration:
{
"model": "groq/llama-3.3-70b-versatile",
"messages": [...],
"demeterics_tools": {
"knowledge": {
"max_iterations": 5
}
}
}
| Option | Type | Default | Description |
|---|---|---|---|
| max_iterations | int | 3 | Maximum tool-calling iterations |
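As with the boolean form, the configuration object is passed through extra_body when using the OpenAI Python SDK. A short sketch using the max_iterations option from the table above (the prompt is illustrative):
from openai import OpenAI
client = OpenAI(
    base_url="https://api.demeterics.com/chat/v1",
    api_key="dmt_your_api_key"
)
response = client.chat.completions.create(
    model="groq/llama-3.3-70b-versatile",
    messages=[{"role": "user", "content": "Summarize our warranty terms."}],
    extra_body={
        "demeterics_tools": {
            # allow up to 5 tool-calling iterations instead of the default 3
            "knowledge": {"max_iterations": 5}
        }
    },
)
print(response.choices[0].message.content)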
How It Works
When demeterics_tools.knowledge is enabled:
- The LLM receives four knowledge tools:
  - search_knowledge — Vector similarity search
  - find_documents — Discover relevant documents
  - get_summary — Get topic summaries
  - get_content — Get full document content
- The LLM decides which tools to call based on the user's question
- Tool results are injected into the conversation
- The LLM generates a response grounded in your documents
Example tool-calling flow:
User: "What is the return policy for electronics?"
→ LLM calls: search_knowledge(query="return policy electronics")
← Returns: Relevant passages from your documents
→ LLM generates response using retrieved context
← Response: "Based on our policies, electronics can be returned within 30 days..."
Python Example with Knowledge
from openai import OpenAI
client = OpenAI(
base_url="https://api.demeterics.com/chat/v1",
api_key="dmt_your_api_key"
)
response = client.chat.completions.create(
model="groq/llama-3.3-70b-versatile",
messages=[
{"role": "system", "content": "You are a helpful assistant. Use the knowledge tools to answer questions about our products and policies."},
{"role": "user", "content": "What is your shipping policy?"}
],
extra_body={
"demeterics_tools": {
"knowledge": True
}
}
)
print(response.choices[0].message.content)
Node.js Example with Knowledge
import OpenAI from 'openai';
const client = new OpenAI({
baseURL: 'https://api.demeterics.com/chat/v1',
apiKey: 'dmt_your_api_key'
});
const response = await client.chat.completions.create({
model: 'groq/llama-3.3-70b-versatile',
messages: [
{ role: 'system', content: 'You are a helpful assistant. Use knowledge tools to answer questions.' },
{ role: 'user', content: 'What are your business hours?' }
],
// @ts-ignore - custom field
demeterics_tools: {
knowledge: true
}
});
cURL Example with Knowledge
curl -X POST https://api.demeterics.com/chat/v1/chat/completions \
-H "Authorization: Bearer dmt_your_api_key" \
-H "Content-Type: application/json" \
-d '{
"model": "anthropic/claude-sonnet-4-20250514",
"messages": [
{"role": "system", "content": "Answer questions using the knowledge base."},
{"role": "user", "content": "How do I request a refund?"}
],
"demeterics_tools": {
"knowledge": {
"max_iterations": 3
}
}
}'
Response Format
Responses follow the standard OpenAI chat completion format:
{
"id": "chatcmpl-abc123",
"object": "chat.completion",
"created": 1704067200,
"model": "llama-3.3-70b-versatile",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "The capital of France is Paris."
},
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 15,
"completion_tokens": 8,
"total_tokens": 23
}
}
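With the OpenAI SDKs these JSON fields map directly onto attributes of the returned object. For example, in Python:
from openai import OpenAI
client = OpenAI(
    base_url="https://api.demeterics.com/chat/v1",
    api_key="dmt_your_api_key"
)
response = client.chat.completions.create(
    model="google/gemini-2.5-flash",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)
# The SDK exposes the response fields shown above as typed attributes
print(response.choices[0].message.content)    # assistant reply text
print(response.choices[0].finish_reason)      # e.g. "stop"
print(response.usage.total_tokens)            # prompt_tokens + completion_tokens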
Error Responses
Tool Access Denied
{
"error": {
"message": "tool 'knowledge' access denied: no knowledge project attached to this API key. Configure one in the API Keys settings.",
"type": "tool_access_denied",
"code": "tool_access_denied"
}
}
Solution: Attach a Knowledge Project to your API key in Settings.
Feature Not Enabled
{
"error": {
"message": "tool 'knowledge' access denied: Knowledge Engine feature not enabled for this user. Contact support to enable.",
"type": "tool_access_denied",
"code": "tool_access_denied"
}
}
Solution: Request Knowledge Engine access from sales@demeterics.com.
Invalid Tool Configuration
{
"error": {
"message": "Invalid tool configuration: invalid knowledge config: max_iterations must be positive",
"type": "invalid_tools",
"code": "invalid_tools"
}
}
Solution: Check the demeterics_tools configuration format.
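Because the endpoint is OpenAI-compatible, these errors surface as API exceptions in the official SDKs. A minimal Python sketch; the exact exception subclass depends on the HTTP status the API returns, so a broad openai.APIStatusError catch is used here:
import openai
from openai import OpenAI
client = OpenAI(
    base_url="https://api.demeterics.com/chat/v1",
    api_key="dmt_your_api_key"
)
try:
    response = client.chat.completions.create(
        model="groq/llama-3.3-70b-versatile",
        messages=[{"role": "user", "content": "What is our return policy?"}],
        extra_body={"demeterics_tools": {"knowledge": True}},
    )
    print(response.choices[0].message.content)
except openai.APIStatusError as e:
    # e.g. tool_access_denied or invalid_tools, as shown above
    print(f"Request failed ({e.status_code}): {e.message}")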
Best Practices
System Prompts for Knowledge
Include instructions for when to use knowledge tools:
{
"messages": [
{
"role": "system",
"content": "You are a customer support agent for Acme Corp. Use the knowledge tools to answer questions about our products, policies, and procedures. If you cannot find relevant information, say so clearly and offer to connect the customer with a human agent."
},
{"role": "user", "content": "How do I track my order?"}
],
"demeterics_tools": {"knowledge": true}
}
Temperature Settings
For factual retrieval, use a lower temperature:
{
"model": "groq/llama-3.3-70b-versatile",
"temperature": 0.3,
"messages": [...],
"demeterics_tools": {"knowledge": true}
}
Handling No Results
Instruct the LLM what to do when knowledge search returns no results:
{
"messages": [
{
"role": "system",
"content": "If the knowledge search returns no relevant results, respond with: 'I don't have specific information about that in my knowledge base. Please contact support@example.com for assistance.'"
}
]
}
Future Tools
The demeterics_tools field is designed for extensibility. Coming soon:
Tavily Search (Web Search)
{
"demeterics_tools": {
"tavily_search": {
"search_depth": "advanced",
"max_results": 5,
"include_domains": ["example.com"]
}
}
}
Tavily Extract (Web Content)
{
"demeterics_tools": {
"tavily_extract": {
"urls": ["https://example.com/page"]
}
}
}
Weather
{
"demeterics_tools": {
"weather": {
"units": "metric"
}
}
}
Combining Tools
Multiple tools can be enabled simultaneously:
{
"demeterics_tools": {
"knowledge": true,
"tavily_search": {"search_depth": "basic"},
"weather": true
}
}
Related Documentation
- Knowledge Engine — Full Knowledge Engine documentation
- AI Chat Widget — Widget integration with Knowledge
- API Reference — Provider-specific endpoints
- Getting Started — Quick start guide