Just Change Your Base URL
Keep your existing OpenAI/Anthropic credits. Get full observability with zero code changes. Debug latency issues, track costs, and never miss a failed request again.
- Bring Your Own Keys – Use your provider accounts, pay them directly
- 50+ fields per call – Costs by category, latency breakdown, full conversation history
- Zero friction – Change one URL, instant visibility
100 free credits included. BYOK: 10% platform fee, you pay providers directly.
# Before (standard OpenAI)
from openai import OpenAI

client = OpenAI(
    api_key="sk-..."
)

# After (with Demeterics observability)
client = OpenAI(
    base_url="https://api.demeterics.com/openai/v1",
    api_key="dmt_..."  # includes your OpenAI key
)
Works with Python, Node.js, Go, Rust, and any HTTP client
Bring Your Own Keys (BYOK)
Keep your existing OpenAI, Anthropic, or Groq accounts. Route traffic through Demeterics, pay providers directly, and pay only a transparent 10% platform fee for observability.
How BYOK Works
- You pay providers directly
- Full observability in your dashboard
- 10% platform fee for the service
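As a worked example of the billing split (illustrative numbers, not a quote):

```python
def byok_fee(provider_cost_usd: float) -> float:
    """Platform fee under BYOK: 10% of what you paid the provider."""
    return round(provider_cost_usd * 0.10, 2)

# e.g. $10.00 of provider spend -> $1.00 platform fee, $11.00 all-in
```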
Tag Your Prompts. Zero Overhead.
Add metadata, A/B test variants, and audit notes directly in your prompts using /// NAME VALUE syntax.
We strip these before the LLM sees them — so you pay nothing extra.
/// APP customer-support
/// FLOW ticket-response
/// VARIANT gpt4-concise
/// VERSION 2.3.1
You are a helpful customer support agent.
/// This variant uses shorter responses
/// Approved by legal team 2024-12-15
Respond to this ticket: {{{ ticket_text }}}
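The stripping step can be sketched as a simple line filter (a hypothetical reimplementation for illustration, not Demeterics' actual code): lines starting with /// are recorded as NAME VALUE tags when they match that shape, dropped as free-form notes otherwise, and only the remaining prompt reaches the model.

```python
def split_tags(prompt: str) -> tuple[dict[str, str], str]:
    """Separate /// tag lines from the prompt body."""
    tags: dict[str, str] = {}
    body: list[str] = []
    for line in prompt.splitlines():
        stripped = line.strip()
        if stripped.startswith("///"):
            parts = stripped[3:].strip().split(None, 1)
            if len(parts) == 2 and parts[0].isupper():
                tags[parts[0]] = parts[1]
            # free-form notes ("/// Approved by legal ...") are dropped
        else:
            body.append(line)
    return tags, "\n".join(body)
```

Running it on the example above would yield tags like `{"APP": "customer-support", "VARIANT": "gpt4-concise"}` and a body containing only the instructions and ticket text.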
Everything You Need. Nothing You Don't.
From debugging to ROI tracking — features that actually matter.
- Latency p95 – Debug slow calls
- Error Tracking – Rate limits, 500s
- Cost Attribution – By team/workflow
- /// Tagging – Free metadata tags
- Dual-Key Auth – Secure automation
- BYOK – Your keys + 10% fee
- Council API – 18 persona feedback
- Conversion Track – Link AI to ROI
- A/B Testing – Prompt variants
- Data Export – CSV via API
- Budget Guards – Per-key limits
- ZDR Mode – Control what's stored
See It In Action
From black box to glass box: watch how Demeterics turns AI spending into crystal-clear visibility.
Full demo: n8n workflows, cost tracking, multi-agent quality control
Start Free Now · Read the Docs
Integration = Change One URL
No SDK to install. No code rewrite. Just change your base URL and you're done.
# pip install openai
from openai import OpenAI
# Before
# client = OpenAI(api_key="sk-...")
# After (with Demeterics observability)
client = OpenAI(
    base_url="https://api.demeterics.com/openai/v1",
    api_key="dmt_your_key"  # includes your OpenAI key
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{
        "role": "user",
        "content": """/// APP my-app
/// FLOW chat
Hello!"""
    }]
)
// npm install openai
import OpenAI from 'openai';
// Before
// const client = new OpenAI({ apiKey: "sk-..." });
// After (with Demeterics observability)
const client = new OpenAI({
  baseURL: "https://api.demeterics.com/openai/v1",
  apiKey: "dmt_your_key"
});

const response = await client.chat.completions.create({
  model: "gpt-4",
  messages: [{
    role: "user",
    content: `/// APP my-app
/// FLOW chat
Hello!`
  }]
});
# Just change the URL - that's it!
curl https://api.demeterics.com/openai/v1/chat/completions \
  -H "Authorization: Bearer dmt_your_key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [{
      "role": "user",
      "content": "/// APP my-app\n/// FLOW chat\nHello!"
    }]
  }'
# pip install langchain-openai
from langchain_openai import ChatOpenAI
# Before
# llm = ChatOpenAI(model="gpt-4", api_key="sk-...")
# After (with Demeterics observability)
llm = ChatOpenAI(
    model="gpt-4",
    base_url="https://api.demeterics.com/openai/v1",
    api_key="dmt_your_key"
)

# Tags go in your prompt - stripped before the LLM, zero cost
response = llm.invoke("""/// APP langchain-app
/// FLOW rag-pipeline
Hello!""")
Works With All Major LLM Providers
One proxy. One dashboard. All your AI.
OpenAI
Anthropic
Groq
OpenRouter
Integrations
Works With Your Existing Stack
LangChain
Vercel AI
Zapier
Make
n8n
Any HTTP
Start Free. Scale When Ready.
100 free credits to start. BYOK: pay providers directly + 10% platform fee.
Ready to See What's Happening in Your LLM Stack?
Change one URL. Get instant visibility. Debug faster. Spend smarter.
100 free credits. BYOK supported. No credit card required.