Credits & Pricing

Overview

Demeterics uses a credit-based pricing model where you purchase credits in advance and use them to power AI interactions, chat widgets, and LLM observability features.

Key benefits:

  • Simple, transparent pricing
  • No surprise bills
  • Credits stay active with regular usage
  • Pay only for what you use

Pricing Matrix (Direct vs AI Chat)

Direct (API / proxy)

  • BYOK: Service fee of 10% of LLM cost. Send both your Demeterics API key and your vendor LLM key in the Authorization header. You pay the LLM vendor directly; Demeterics bills the 10% service fee in credits.
  • Managed Key: Service fee of 15% of LLM cost. Use only your Demeterics API key (marked as Managed Key). You pay the LLM cost + 15% (115% of provider cost).
  • BYOW: Disabled today. Planned fixed rate: $0.03 / 1M input tokens and $0.12 / 1M output tokens (~20% markup on OSS 120) when enabled.

AI Chat (widget / agent)

  • BYOK: Service fee of 20% of LLM cost. Configure your vendor LLM key when creating the AI Chat agent; you pay the provider directly and Demeterics bills the 20% service fee in credits.
  • Managed Key: Service fee of 25% of LLM cost (default). Use only your Demeterics API key; you pay the LLM cost + 25% (125% of provider cost).
  • BYOW: Fixed pricing: $0.05 / 1M input tokens and $0.18 / 1M output tokens (~30% markup on OSS 120). Requires a Demeterics API key configured with a webhook (e.g., an n8n flow or OpenAI Agent Builder).

OpenAI Agent Builder: treat it as AI Chat BYOW. Store your OpenAI key in Demeterics, point the AI Chat agent to your Agent Builder webhook or model endpoint, and billing uses the AI Chat BYOW token rates ($0.05 / 1M input, $0.18 / 1M output).
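
For illustration, a Direct BYOK call through the observability proxy might look like the Python sketch below. The base URL, the exact way the two keys are combined in the Authorization header, and the OpenAI-compatible request body are assumptions made for this sketch; the matrix above only specifies that both keys travel in the Authorization header and that the 10% service fee is billed in credits.

```python
import os
import requests

# Hypothetical proxy endpoint -- check your Demeterics dashboard for the real base URL.
PROXY_URL = "https://api.demeterics.example/v1/chat/completions"

demeterics_key = os.environ["DEMETERICS_API_KEY"]  # billed the 10% service fee in credits
vendor_key = os.environ["OPENAI_API_KEY"]          # BYOK: the provider bills you directly

response = requests.post(
    PROXY_URL,
    headers={
        # Direct BYOK: both keys are sent in the Authorization header.
        # The "demeterics:vendor" separator shown here is an assumption.
        "Authorization": f"Bearer {demeterics_key}:{vendor_key}",
        "Content-Type": "application/json",
    },
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Summarize this release note."}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```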

Credit Expiration

Credits are designed to stay active as long as you're using the platform:

  • Active accounts: Credits never expire if you log in or run traffic at least once every 12 months
  • Inactive accounts: Credits expire after 12 consecutive months of complete inactivity (no logins, no API calls, no widget traffic)
  • Automatic refresh: Any login or API usage resets the 12-month expiration timer

What Can You Do With Credits?

Example: 1 Credit

One credit enables surprisingly powerful AI capabilities. Here's a real-world example:

Generate a 1-page summary of the four Gospels of the Bible (about 480 pages) using OpenAI GPT OSS 20B (AI Chat BYOK at the 20% service fee):

  • Input: 83,898 words (~111,864 tokens at ~0.75 words per token)
  • Output: 1,000 tokens (approximately one letter-size page)
  • Cost breakdown:
    • Input tokens: 111,864 × $0.075 per 1M tokens = $0.0084
    • Output tokens: 1,000 × $0.30 per 1M tokens = $0.0003
    • Provider Cost: ~$0.0087 (paid to the LLM vendor)
    • Demeterics Fee (20%): ~$0.00174 (paid in credits)
    • Total cost: ~$0.01044, roughly the value of one credit ($0.01); the Demeterics fee itself deducts less than one credit from your balance

This example demonstrates that roughly one credit's worth of spend can process a massive amount of text and produce a meaningful summary.
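
The same arithmetic, written out as a short Python sketch; the per-token prices are the example figures quoted above, not live provider rates:

```python
# Reproduce the 1-credit example above.
INPUT_PRICE_PER_1M = 0.075   # USD per 1M input tokens (example figure for GPT OSS 20B)
OUTPUT_PRICE_PER_1M = 0.30   # USD per 1M output tokens (example figure)
SERVICE_FEE = 0.20           # AI Chat BYOK: 20% of LLM cost, billed in credits
CREDIT_VALUE_USD = 0.01      # 1 credit = $0.01

words = 83_898
input_tokens = round(words / 0.75)   # ~0.75 words per token -> ~111,864 tokens
output_tokens = 1_000                # approximately one letter-size page

provider_cost = (input_tokens * INPUT_PRICE_PER_1M
                 + output_tokens * OUTPUT_PRICE_PER_1M) / 1_000_000
demeterics_fee = provider_cost * SERVICE_FEE

print(f"provider cost:  ${provider_cost:.5f}")                    # ~$0.00869, paid to the vendor
print(f"demeterics fee: ${demeterics_fee:.5f}")                   # ~$0.00174, i.e. ~0.17 credits
print(f"total cost:     ${provider_cost + demeterics_fee:.5f}")   # ~$0.01043, about one credit's worth
```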

More Examples

  • Chat widget conversation (10 messages): GPT-4o mini, ~5 credits
  • Full PDF document analysis (50 pages): Claude Sonnet, ~3 credits
  • Code generation (500 lines): GPT-4, ~8 credits
  • Evaluation run (100 test cases): multiple models, ~20 credits

Note: Actual credit usage varies based on:

  • Model selected (GPT-4, Claude, Llama, etc.)
  • Input length (prompt + context)
  • Output length (response tokens)
  • Provider pricing (OpenAI, Anthropic, Groq, etc.)
  • Consumption Mode (Direct vs AI Chat; BYOK vs Managed Key vs BYOW)

How Credits Are Used

Credits are consumed when you use Demeterics features. The pricing matrix above applies to both Direct (API/proxy) and AI Chat (widget/agent) usage.

  1. AI Chat Widget

    • Each user message → AI response consumes credits according to AI Chat BYOK, Managed Key, or BYOW pricing.
    • BYOW lets you route to webhooks (e.g., n8n, Make, custom endpoints, or OpenAI Agent Builder) at the fixed token rates.
    • Real-time deduction with dashboard visibility.
  2. LLM Observability Proxy (Direct API)

    • API calls routed through Demeterics proxy.
    • BYOK bills the service fee only (10%); Managed Key bills the LLM cost + 15%.
    • Credits charged based on the selected mode.
  3. Evaluations

    • Each eval run consumes credits per test case.
    • LLM-graded evaluations use additional credits for grading.
    • Billing follows the same mode you select (Direct or AI Chat, BYOK/Managed/BYOW).
  4. Analytics & Exports

    • Dashboard queries: Free (no credits charged)
    • BigQuery exports: Free
    • Webhook deliveries: Free

Credit Accounting

  • 1 credit = $0.01 USD.
  • Direct BYOK: Credits charged = Provider Cost × 10% (provider is paid with your LLM key).
  • AI Chat BYOK: Credits charged = Provider Cost × 20% (provider is paid with your LLM key).
  • Direct Managed Key: Credits charged = Provider Cost × 1.15 (Demeterics pays the provider).
  • AI Chat Managed Key: Credits charged = Provider Cost × 1.25 (Demeterics pays the provider).
  • AI Chat BYOW: Credits charged from fixed token rates: $0.05 per 1M input tokens, $0.18 per 1M output tokens.
  • Direct BYOW: Disabled; planned fixed token rates: $0.03 per 1M input tokens, $0.12 per 1M output tokens.
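
Expressed as code, the rules above reduce to a small helper. This is an illustrative sketch of the arithmetic only (the function name and signature are made up for the example), not Demeterics' actual billing logic:

```python
CREDIT_VALUE_USD = 0.01  # 1 credit = $0.01

def credits_charged(mode: str, key: str, provider_cost_usd: float = 0.0,
                    input_tokens: int = 0, output_tokens: int = 0) -> float:
    """Credits deducted per request, following the accounting rules above."""
    if key == "byok":
        fee_rate = {"direct": 0.10, "ai_chat": 0.20}[mode]
        usd = provider_cost_usd * fee_rate            # service fee only; you pay the provider
    elif key == "managed":
        multiplier = {"direct": 1.15, "ai_chat": 1.25}[mode]
        usd = provider_cost_usd * multiplier          # Demeterics pays the provider
    elif key == "byow":
        if mode == "direct":
            raise NotImplementedError("Direct BYOW is disabled today "
                                      "(planned: $0.03 / $0.12 per 1M tokens)")
        usd = (input_tokens * 0.05 + output_tokens * 0.18) / 1_000_000
    else:
        raise ValueError(f"unknown key type: {key}")
    return usd / CREDIT_VALUE_USD

# Examples using figures from this page:
print(credits_charged("ai_chat", "byok", provider_cost_usd=0.0087))    # ~0.17 credits
print(credits_charged("direct", "managed", provider_cost_usd=0.0087))  # ~1.0 credits
print(credits_charged("ai_chat", "byow",
                      input_tokens=111_864, output_tokens=1_000))      # ~0.58 credits
```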

Purchasing Credits

For Individuals

  1. Log in to your Demeterics dashboard
  2. Navigate to Settings → Credits
  3. Select a credit bundle or enter a custom amount
  4. Pay via Stripe (credit card, Apple Pay, Google Pay)
  5. Credits appear instantly in your account

For Teams & Enterprises

  • Shared credit pool across team members
  • Centralized billing and invoicing
  • Volume discounts for 50,000+ credits
  • Optional postpaid invoicing (NET 30)
  • Contact sales@demeterics.com for custom plans

Credit Usage Tracking

Monitor your credit usage in real-time:

  • Dashboard: Top navigation shows current balance
  • Interactions page: See per-interaction credit costs
  • Analytics: View credit consumption trends over time
  • Alerts: Get notified when balance drops below threshold

Supported Providers

We support all major LLM providers with transparent pricing:

  • OpenAI: GPT-4, GPT-4o, GPT-4 Turbo, GPT-3.5
  • Anthropic: Claude 3 Opus, Sonnet, Haiku
  • Groq: Llama 3, Mixtral, Gemma
  • Google: Gemini Pro, Gemini Ultra
  • Meta: Llama 2, Llama 3 (via various providers)

Pricing updates automatically when providers change their rates.

Refunds & Credits Policy

  • Unused credits: Refundable within 30 days of purchase (minus any used credits)
  • Service issues: Free credits issued for any platform downtime
  • Disputed charges: Full investigation with credit reimbursement if warranted
  • Account closure: Pro-rated refund of unused credits (minimum $10 balance)

FAQs

Q: What happens if I run out of credits?
A: Your widgets and API calls will stop working until you purchase more credits. You'll receive email alerts at 20%, 10%, and 5% balance thresholds.

Q: Can I transfer credits between accounts?
A: No. Credits are tied to the account that purchased them. For team/enterprise accounts, credits are shared across all team members.

Q: Do you offer free credits for open source projects?
A: Yes! Contact us at opensource@demeterics.com with details about your project. We offer 10,000 free credits for qualifying open source projects.

Q: What if provider prices drop?
A: You automatically benefit! Credit costs are calculated in real-time based on current provider pricing.

Q: Can I get invoices for accounting?
A: Yes. All credit purchases include downloadable invoices via Stripe. Enterprise customers can request custom invoicing formats.

Ready to get started? Sign up now and get 100 free credits to explore Demeterics.