Introduction
Rikaii sits between your applications and LLM providers. You call OpenAI-compatible HTTP endpoints; we handle routing, metering, key management (including bring-your-own-key, or BYOK), and billing according to your workspace settings.
Who this documentation is for
These docs target server-side integrations—backend services, scripts, and automation. Do not embed API keys in browsers or untrusted clients.
Base URL
Production API host:
https://api.rikaii.com
The primary inference endpoint is POST /v1/chat/completions.
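As a sketch of what a call to that endpoint looks like, the snippet below builds a request against the production host using Python's standard library. It assumes bearer-token authentication and a JSON body with `model` and `messages` fields (the OpenAI-compatible convention); the model name shown is a placeholder, and the exact required headers are covered in the Authentication section.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.rikaii.com"

def build_chat_request(messages, model, api_key):
    """Construct a POST request to /v1/chat/completions.

    `model` is workspace-specific; "example-model" below is a placeholder.
    """
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def chat_completion(messages, model="example-model"):
    """Send the request and return the parsed JSON response."""
    # Read the key from the environment -- never hard-code it client-side.
    req = build_chat_request(messages, model, os.environ["RIKAII_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Keeping request construction separate from sending makes it easy to log or test requests without hitting the network.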
Next steps
- Quickstart — create an API key and send your first request.
- Authentication — required headers and billing context.
- Chat completions — request and response reference.