OpenAI SDK
Use the official OpenAI SDK with AllToken.
TypeScript
Install
$ npm install openai
```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.ALLTOKEN_API_KEY,
  baseURL: 'https://api.alltoken.ai/v1',
});

const completion = await client.chat.completions.create({
  model: 'deepseek-chat',
  messages: [{ role: 'user', content: 'Hello!' }],
});

console.log(completion.choices[0].message.content);
```
Python
Install
$ pip install openai
```python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["ALLTOKEN_API_KEY"],
    base_url="https://api.alltoken.ai/v1",
)

completion = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Hello!"}],
)

print(completion.choices[0].message.content)
```
Streaming with the SDK
Both the TypeScript and Python SDKs support streaming natively:
TypeScript streaming
```typescript
const stream = await client.chat.completions.create({
  model: 'deepseek-chat',
  messages: [{ role: 'user', content: 'Tell me a story' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}
```