Anthropic
Coming soon. Use the Anthropic Messages API through Rhone. Drop-in compatible with the official Anthropic SDK, so you can add session continuity, hosted history, and smart routing with minimal code changes.
- ✓ Drop-in Anthropic SDK compatibility
- ✓ Session-backed Messages API
- ✓ Hosted history and continuity
- ✓ Smart routing across Anthropic models
OpenAI
Coming soon. Use the OpenAI Responses API through Rhone. Drop-in compatible with the official OpenAI SDK, so you can add session continuity, hosted history, and multi-provider routing.
- ✓ Drop-in OpenAI SDK compatibility
- ✓ Session-backed Responses API
- ✓ Hosted history and continuity
- ✓ Cross-provider routing via Rhone
When to use Provider Compat SDKs
Provider Compat SDKs are for teams that already use Anthropic or OpenAI request and response shapes and want gateway continuity, routing, and hosted state without moving fully onto the native VAI surface.
Continuity for the Anthropic, OpenAI Chat, and OpenAI Responses wrappers is carried in the top-level `rhone` request object, not in transport-only session headers. The `vai-anthropic`, `vai-openai-chat`, and `vai-openai-responses` packages add Node-only websocket clients for stateful provider-compatible execution over `rhone.ws.v1`.
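A minimal sketch of what this means on the wire, assuming an Anthropic-shaped request body. The `session_id` value and the exact shape of the `rhone` field here are illustrative, not a documented schema:

```typescript
// Hypothetical request body for a provider-compat call through Rhone.
// Continuity data rides in a top-level `rhone` object inside the JSON
// body itself, rather than in a transport header such as an
// illustrative "X-Session-Id".
const body = {
  model: "claude-sonnet-4-6",
  max_tokens: 512,
  messages: [{ role: "user", content: "Continue where we left off." }],
  // Assumed shape: the gateway reads session identity from the body.
  rhone: { session_id: "sess_123" },
};

console.log(JSON.stringify(body.rhone)); // → {"session_id":"sess_123"}
```

Because the session reference lives in the request body, it survives serialization through any HTTP client or proxy that forwards the JSON payload unchanged.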
If you are starting fresh or want the full VAI surface for runs, client tools, native realtime, live audio, data layer, and evals, use the VAI SDK.
Feature Comparison
| Capability | VAI SDK | Provider Compat SDKs |
|---|---|---|
| Session continuity | Full | Session-backed call continuity |
| Calls and runs | Full | Messages/Responses only |
| Streaming (SSE) | Full | Provider-native |
| Stateful websocket mode | Native VAI websocket | Anthropic, OpenAI Chat, and OpenAI Responses wrappers |
| Client tools | Typed + auto-resume | Provider-native tool loops |
| Live audio | Yes | No |
| Hosted history | Full | Session-backed write path |
| Data layer queries | Full | No |
| Evals and assessments | Full | No |
| Smart routing | Full | Request-compatible routing |
Anthropic Wrapper
Install and use `vai-anthropic` when you want Anthropic-compatible HTTP, SSE, and websocket behavior on top of Rhone:

```typescript
import Anthropic from "vai-anthropic";
import WebSocket from "ws";

const client = new Anthropic({
  apiKey: process.env.VAI_API_KEY,
  baseURL: "https://api.rhone.dev",
  webSocket: WebSocket,
});

const message = await client.messages.create({
  model: "claude-sonnet-4-6",
  max_tokens: 512,
  messages: [{ role: "user", content: "Summarize this repository." }],
});

const connection = await client.messages.connect();
await connection.bindSession({ session_id: message.rhone?.session_id });
```
OpenAI Chat Wrapper
Install and use `vai-openai-chat` when you want OpenAI Chat Completions-compatible HTTP, SSE, and websocket behavior on top of Rhone:

```typescript
import OpenAI from "vai-openai-chat";
import WebSocket from "ws";

const client = new OpenAI({
  apiKey: process.env.VAI_API_KEY,
  baseURL: "https://api.rhone.dev",
  webSocket: WebSocket,
});

const completion = await client.chat.completions.create({
  model: "gpt-5.4-mini",
  messages: [{ role: "user", content: "Summarize this repository." }],
});

const connection = await client.chat.completions.connect();
await connection.bindSession({ session_id: completion.rhone?.session_id });
```
OpenAI Responses Wrapper
Install and use `vai-openai-responses` when you want OpenAI Responses-compatible HTTP, SSE, and websocket behavior on top of Rhone:

```typescript
import OpenAI from "vai-openai-responses";
import WebSocket from "ws";

const client = new OpenAI({
  apiKey: process.env.VAI_API_KEY,
  baseURL: "https://api.rhone.dev",
  webSocket: WebSocket,
});

const response = await client.responses.create({
  model: "gpt-5.4-mini",
  input: "Summarize this repository.",
});

const connection = await client.responses.connect();
await connection.bindSession({ session_id: response.rhone?.session_id });
```
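All three wrappers follow the same pattern: the HTTP response is read for a `rhone` metadata object, and the websocket connection is then bound to its `session_id`. Since each example optional-chains that field, a response may arrive without it, so resumption code should guard before binding. A sketch of that guard, using a mocked response object in place of a real API result (the types and the mocked values here are illustrative):

```typescript
// Mocked response standing in for a real wrapper result. The `rhone`
// field name matches the examples above; everything else is assumed.
interface RhoneMeta { session_id: string }
interface CompatResponse { id: string; rhone?: RhoneMeta }

const response: CompatResponse = {
  id: "msg_abc",
  rhone: { session_id: "sess_123" },
};

// Guard before binding: skip bindSession entirely when no session
// metadata came back, instead of binding to undefined.
const sessionId = response.rhone?.session_id;
if (sessionId) {
  // await connection.bindSession({ session_id: sessionId });
  console.log(`binding ${sessionId}`);
} else {
  console.log("no session metadata; continuing with stateless calls");
}
```

The same guard works unchanged for `message`, `completion`, and `response` results, since each wrapper exposes the metadata under the same top-level `rhone` key.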