Quickstart

Get started with AllToken.

Introduction

AllToken gives you access to 300+ AI models through a single, unified endpoint, with automatic fallbacks and cost-effective routing built in.

Get started in minutes with your preferred SDK or HTTP client.

Base URL: https://api.alltoken.ai/v1

Auth: Bearer API key

Compatibility: OpenAI-compatible API
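Every request authenticates with a standard Bearer header against the base URL above. As a quick sanity check, the header shape looks like this (a minimal sketch; the key value is a placeholder, not a real credential):

```python
# Headers AllToken expects on every request.
API_KEY = "your_alltoken_api_key"  # placeholder

headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

print(headers["Authorization"])  # Bearer your_alltoken_api_key
```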

Get your API key

Before getting started, create an API key:

  1. Go to Settings → API Keys
  2. Click Create new key
  3. Copy and save the key securely — it's shown only once

Keep your API key secret. Do not expose it in client-side code or public repositories.

Install the SDK

Use the OpenAI SDK with AllToken. Install it with your preferred package manager:

npm
$ npm install openai

Then set your environment variable:

Shell
$ export ALLTOKEN_API_KEY="your_alltoken_api_key"

Send your first request

Create a client, pick a model, and send a chat completion:

TypeScript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.ALLTOKEN_API_KEY,
  baseURL: 'https://api.alltoken.ai/v1',
});

const completion = await client.chat.completions.create({
  model: 'deepseek-chat',
  messages: [
    {
      role: 'user',
      content: 'What is the meaning of life?',
    },
  ],
});

console.log(completion.choices[0]?.message?.content);

Python example

Python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ.get("ALLTOKEN_API_KEY"),
    base_url="https://api.alltoken.ai/v1",
)

completion = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "user", "content": "What is the meaning of life?"}
    ],
)

print(completion.choices[0].message.content)

Using the API directly

Call the API directly with cURL or any HTTP client:

cURL
curl https://api.alltoken.ai/v1/chat/completions \
  -H "Authorization: Bearer $ALLTOKEN_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-chat",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'
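The cURL call above maps one-to-one onto any HTTP client. As a sketch, here is the same request assembled with Python's standard library, constructed but deliberately not sent:

```python
import json
import os
import urllib.request

# Same payload as the cURL example above.
payload = {
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    "https://api.alltoken.ai/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {os.environ.get('ALLTOKEN_API_KEY', '')}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would send it; the response body is the
# same chat.completion JSON object the SDKs parse for you.
print(req.get_method(), req.full_url)
```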

Streaming responses

Add stream: true to get responses token-by-token via Server-Sent Events:

TypeScript
const stream = await client.chat.completions.create({
  model: 'deepseek-chat',
  messages: [{ role: 'user', content: 'Tell me a story' }],
  stream: true,
});

for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) process.stdout.write(content);
}

For detailed streaming documentation, see Streaming.
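Under the hood, each Server-Sent Event is a `data:` line carrying one JSON chunk, and the `delta` objects the SDK yields come from parsing those lines. A minimal sketch of that parsing step, using an illustrative sample line in the shape the OpenAI streaming format uses:

```python
import json

# One illustrative SSE line as it might arrive over the wire; the
# stream ends with a literal "data: [DONE]" sentinel.
line = 'data: {"choices": [{"delta": {"content": "Once"}}]}'

content = ""
if line.startswith("data: ") and line != "data: [DONE]":
    chunk = json.loads(line[len("data: "):])
    delta = chunk["choices"][0].get("delta", {})
    content = delta.get("content", "")

print(content)  # Once
```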

Next steps