Quick Start

Learn how to integrate Demeterics into your workflows with step-by-step guides and API examples.

Demeterics offers two “2-minute” entry points—embed the AI chat widget for front-of-house conversations or route your API traffic through the observability proxy for back-of-house visibility. This guide shows both paths so marketing, product, finance, and engineering teams can adopt the platform using the same trust layer described in docs/PROBLEMS.md.


1. Create Your Keys

  1. Sign in at demeterics.com.
  2. Navigate to Settings → API Keys.
  3. Click Create API Key and label it (e.g., chat_widget_prod or zapier_proxy).
  4. Copy the key (prefix dmt_) and store it securely. You will not be able to reveal it again.

Create separate keys for automation tools and human admins; every key is scoped and can be revoked independently.
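Once stored, load the key from an environment variable rather than hard-coding it. A minimal Python sketch (the `dmt_` prefix check mirrors the prefix noted above; the `DEM_API_KEY` variable name matches the SDK examples later in this guide):

```python
import os

def load_demeterics_key(env_var: str = "DEM_API_KEY") -> str:
    """Read a Demeterics API key from the environment and sanity-check it."""
    key = os.environ.get(env_var, "")
    if not key:
        raise RuntimeError(f"{env_var} is not set; create a key under Settings > API Keys")
    if not key.startswith("dmt_"):
        # Demeterics keys carry the dmt_ prefix; anything else is likely a vendor key.
        raise RuntimeError(f"{env_var} does not look like a Demeterics key (expected 'dmt_' prefix)")
    return key
```

Failing loudly at startup beats a confusing 401 from the proxy later.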


2. Choose Your Starting Line

Demeterics’ manifesto talks about two immediate wins:

  • Citizen Developer / Accidental AI Owner → Ship the AI chat widget and capture conversations.
  • Paranoid CTO / Data Scientist / CFO → Instrument every API call through the observability proxy.

You can deploy both, but start with the one that removes tonight’s headache.


Option A — Observability Proxy (API / Automations)

Demeterics is OpenAI-compatible. Change the base URL and send your Demeterics key (or dual key) in the Authorization header.

CLI Example

curl -X POST https://api.demeterics.com/groq/v1/chat/completions \
  -H "Authorization: Dual dem_live_admin_abc:user_live_agent_xyz" \
  -H "Content-Type: application/json" \
  -H "x-dem-app: marketing" \
  -H "x-dem-flow: onboarding" \
  -d '{
    "model": "your-model-name",
    "messages": [
      {"role": "user", "content": "What are three interesting facts about quantum computing?"}
    ]
  }'

Result: The interaction appears instantly in /interactions with prompt, response, latency, cost, and metadata. It’s also streamed into BigQuery for dashboards, alerts, and compliance exports.

SDK Examples

  • Python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.demeterics.com/groq/v1",
    api_key="dmt_your_api_key_here"
)

response = client.chat.completions.create(
    model="your-model-name",
    messages=[{"role": "user", "content": "Explain machine learning in simple terms"}],
    extra_headers={"x-dem-flow": "support", "x-dem-variant": "v2"}
)

print(response.choices[0].message.content)
  • Node.js
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.demeterics.com/groq/v1',
  apiKey: process.env.DEM_API_KEY,
  defaultHeaders: {
    'x-dem-app': 'ops',
    'x-dem-flow': 'refunds'
  }
});

const response = await client.chat.completions.create({
  model: 'your-model-name',
  messages: [{ role: 'user', content: 'What is the best way to learn programming?' }]
});

console.log(response.choices[0].message.content);
  • Go
package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/openai/openai-go"
    "github.com/openai/openai-go/option"
)

func main() {
    client := openai.NewClient(
        option.WithBaseURL("https://api.demeterics.com/groq/v1"),
        option.WithAPIKey(os.Getenv("DEM_API_KEY")),
    )

    resp, err := client.Chat.Completions.New(context.Background(), openai.ChatCompletionNewParams{
        Model: openai.F("your-model-name"),
        Messages: openai.F([]openai.ChatCompletionMessageParamUnion{
            openai.UserMessage("What is Docker?"),
        }),
        Metadata: openai.F(map[string]any{
            "flow":    "onboarding",
            "variant": "control",
        }),
    })
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(resp.Choices[0].Message.Content)
}

Provider URLs

Provider     Base URL
Groq         https://api.demeterics.com/groq/v1
OpenAI       https://api.demeterics.com/openai/v1
Anthropic    https://api.demeterics.com/anthropic/v1
Gemini       https://api.demeterics.com/gemini/v1

Model availability changes frequently—check your provider's dashboard for current model names, or make the model name configurable in your application.
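One way to keep the endpoint configurable is to resolve it from a small mapping plus an environment override. A hedged sketch (the base URLs come from the table above; the `DEM_BASE_URL` override variable is an illustrative convention, not a documented one):

```python
import os

# Base URLs from the Provider URLs table above.
PROVIDER_BASE_URLS = {
    "groq": "https://api.demeterics.com/groq/v1",
    "openai": "https://api.demeterics.com/openai/v1",
    "anthropic": "https://api.demeterics.com/anthropic/v1",
    "gemini": "https://api.demeterics.com/gemini/v1",
}

def resolve_endpoint(provider: str) -> str:
    """Return the Demeterics base URL for a provider, honoring an env override."""
    override = os.environ.get("DEM_BASE_URL")  # hypothetical override hook
    if override:
        return override
    try:
        return PROVIDER_BASE_URLS[provider]
    except KeyError:
        raise ValueError(
            f"unknown provider {provider!r}; expected one of {sorted(PROVIDER_BASE_URLS)}"
        )
```

Pass the result as `base_url` to whichever SDK client you use, and keep the model name in config alongside it.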


Option B — AI Chat Widget (Website / Customer-Facing)

If you need to give visitors answers now, use the widget path.

  1. In the dashboard, go to AI Chat → Create Agent.
  2. Configure name, brand color, avatar, rate limits, and allowed domains.
  3. Use “Recreate for my site” to auto-generate the default prompt from your About/Products page, then edit the tone and guardrails.
  4. Paste the embed snippet before </body> on any site:
<script src="https://demeterics.com/widget/embed.js"
        data-key="dem_live_widget_xyz;openai_live_xyz"
        async></script>
<dem-agent title="Ask us anything"
        theme="light"
        avatar="friendly"></dem-agent>
  5. Watch live transcripts arrive in the dashboard. Every conversation includes prompt, answer, cost, and metadata for compliance and ROI reporting.

See the AI Chat Widget Guide for prompt strategies, routing rules, and advanced customization.


Template Validation & Quality Gates {#template-validation}

The manifesto highlights how template/struct mismatches cause runtime errors. Before shipping a new handler or template:

  1. Run go test ./internal/web/... or make test locally.
  2. Use internal/web/template_validator.go to compare template fields with Go structs.
  3. Spot-check /interactions to ensure metadata, prompts, and responses render correctly.
  4. For chat widget prompts, store instructions in version control so you can trace changes.

Failing these checks leads to the “paranoid CTO” nightmares described in docs/PROBLEMS.md.
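The real check lives in internal/web/template_validator.go; purely as an illustration of the idea, here is a hedged Python sketch that extracts top-level `{{ .Field }}` references from a Go template and flags any field a struct does not provide (the regex and field set are simplifications, not the validator's actual logic):

```python
import re

# Matches simple top-level field references like {{ .Title }} or {{.Cost}}.
# Real Go templates have richer syntax (pipelines, ranges); this is a simplification.
FIELD_RE = re.compile(r"\{\{-?\s*\.(\w+)")

def missing_template_fields(template_src: str, struct_fields: set[str]) -> set[str]:
    """Return template field references that the Go struct does not define."""
    referenced = set(FIELD_RE.findall(template_src))
    return referenced - struct_fields
```

Running a check like this in CI surfaces template/struct mismatches before they become runtime errors.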


Bring Your Own Key (BYOK)

You can route traffic through Demeterics but pay providers directly:

  1. Navigate to Settings → Provider Keys.
  2. Add your Groq/OpenAI/Anthropic/Gemini keys.
  3. Make calls normally. Demeterics uses your vendor billing but still records every interaction.

Benefits

  • No double billing—use your existing vendor contract.
  • Keep the same audit logs, analytics, and alerts.
  • Switch between credits and BYOK per workflow.

What Happens Next?

  1. View /interactions – Filter by model, user, metadata, or cost.
  2. Invite teammates – Marketing sees chat transcripts; engineering gets API replay; finance monitors credits.
  3. Explore docs – AI Chat Widget Guide, API reference, and SDKs.
  4. Plan your roadmap – Immediate relief now; prompt versioning, budgets, compliance reports, and ROI dashboards next.

Demeterics is the adapter that finally turns AI chaos into measurable, governable output. Pick your starting line and flip on the lights.


Common Questions

What's the difference between my Demeterics API key and a vendor API key?

  • Demeterics API key (dmt_...): Used for all API calls. Demeterics handles vendor authentication, billing, and tracking.
  • Vendor API key (Groq/OpenAI/etc.): Optional. Store it in Settings → Provider Keys for BYOK mode, so calls bill your vendor account directly instead of Demeterics credits.

How are credits charged?

Demeterics mirrors the vendor’s per-token pricing and deducts from your credit balance in real time. Exact rates change frequently—check the Pricing page in the dashboard or run BYOK mode to use your existing vendor billing without touching credits.
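As an illustration of per-token billing, the sketch below computes a charge from prompt and completion token counts; the rates are made-up placeholders, since actual rates live on the Pricing page:

```python
# Illustrative only: real per-million-token rates are on the Pricing page.
RATES_PER_MILLION = {"prompt": 0.50, "completion": 1.50}  # USD, placeholder values

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Mirror per-token pricing: tokens / 1e6 * rate, summed over both directions."""
    return (
        prompt_tokens / 1_000_000 * RATES_PER_MILLION["prompt"]
        + completion_tokens / 1_000_000 * RATES_PER_MILLION["completion"]
    )
```

With these placeholder rates, a call using 1M prompt tokens and no completion tokens would deduct $0.50 from the credit balance.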

Can I use streaming?

Yes! All providers support streaming. Add "stream": true to your request:

curl -X POST https://api.demeterics.com/groq/v1/chat/completions \
  -H "Authorization: Bearer dmt_your_api_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "your-model-name",
    "messages": [{"role": "user", "content": "Tell me a story"}],
    "stream": true
  }'
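When consuming a stream from the Python SDK (pass `stream=True` to `client.chat.completions.create`), each chunk carries a content delta. A small helper to reassemble them, following the chunk shape used by the OpenAI SDK's streaming responses:

```python
def collect_stream(chunks) -> str:
    """Concatenate the content deltas from a chat-completions stream.

    Each chunk is expected to expose chunk.choices[0].delta.content,
    which is None for chunks that carry no text (e.g. role or stop events).
    """
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:
            parts.append(delta)
    return "".join(parts)
```

In practice you would print each delta as it arrives for a typewriter effect instead of buffering the whole reply.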

Where is my data stored?

  • Interactions: BigQuery (your GCP project or Demeterics-managed)
  • API Keys: Google Cloud Datastore (encrypted with KMS)
  • Credit Balance: Stripe (PCI-compliant)

All data is encrypted at rest and in transit. See Security for details.

How do I monitor my usage?

  1. Dashboard: Real-time metrics at demeterics.com/dashboard
  2. Interactions: Detailed logs at demeterics.com/interactions
  3. Exports: Download JSON/CSV via /api/v1/exports
  4. BigQuery: Direct SQL queries for advanced analytics
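For scripted exports, a hedged sketch of building the download request: the /api/v1/exports path comes from the list above, but the host and the `format` query-parameter name are assumptions — check the API reference before relying on them.

```python
def build_export_request(api_key: str, fmt: str = "json") -> tuple:
    """Build the URL and headers for an exports download.

    The /api/v1/exports path is documented; the demeterics.com host and
    the `format` query parameter are assumptions for illustration.
    """
    if fmt not in ("json", "csv"):
        raise ValueError("exports are available as JSON or CSV")
    url = f"https://demeterics.com/api/v1/exports?format={fmt}"
    headers = {"Authorization": f"Bearer {api_key}"}
    return url, headers
```

Feed the result to your HTTP client of choice (e.g. `requests.get(url, headers=headers)`).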

Ready to build? Grab your API key and start tracking LLM interactions today! 🚀