Provider Compat SDKs

Use provider-compatible SDKs and wrappers on top of the Rhone gateway without migrating fully to the native VAI SDK.

Anthropic

Coming soon

Use the Anthropic Messages API through Rhone. Drop-in compatible with the official Anthropic SDK — add session continuity, hosted history, and smart routing with minimal code changes.

  • Drop-in Anthropic SDK compatibility
  • Session-backed Messages API
  • Hosted history and continuity
  • Smart routing across Anthropic models

OpenAI

Coming soon

Use the OpenAI Responses API through Rhone. Drop-in compatible with the official OpenAI SDK — add session continuity, hosted history, and multi-provider routing.

  • Drop-in OpenAI SDK compatibility
  • Session-backed Responses API
  • Hosted history and continuity
  • Cross-provider routing via Rhone

When to use Provider Compat SDKs

Provider Compat SDKs are for teams that already use Anthropic or OpenAI request and response shapes and want gateway continuity, routing, and hosted state without moving fully onto the native VAI surface.

Anthropic, OpenAI Chat, and OpenAI Responses continuity use the top-level rhone request object, not transport-only session headers. The vai-anthropic, vai-openai-chat, and vai-openai-responses packages add Node-only websocket clients for stateful provider-compatible execution over rhone.ws.v1.
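
To make the body-level continuity concrete, here is a minimal sketch of an Anthropic Messages-shaped request body carrying a `rhone` object. The `session_id` field name mirrors the wrapper examples below, but the exact schema and the placeholder id are assumptions for illustration, not a confirmed contract:

```typescript
// Sketch only: the precise shape of the `rhone` object is an assumption.
interface RhoneOptions {
  session_id?: string; // continue an existing hosted session
}

const rhone: RhoneOptions = { session_id: "sess_123" }; // placeholder id

// Continuity state travels inside the request body's top-level `rhone`
// object, not in a transport-only session header.
const requestBody = {
  model: "claude-sonnet-4-6",
  max_tokens: 512,
  messages: [{ role: "user" as const, content: "Continue from before." }],
  rhone,
};
```

The same pattern applies to OpenAI Chat and OpenAI Responses request shapes: the provider-compatible fields are untouched, and the gateway-specific state lives under the single top-level `rhone` key.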

If you are starting fresh or want the full VAI surface for runs, client tools, native realtime, live audio, data layer, and evals, use the VAI SDK.

Feature Comparison

Feature                   VAI SDK               Provider Compat SDKs
Session continuity        Full                  Session-backed call continuity
Calls and runs            Full                  Messages/Responses only
Streaming (SSE)           Full                  Provider-native
Stateful websocket mode   Native VAI websocket  Anthropic, OpenAI Chat, and OpenAI Responses wrappers
Client tools              Typed + auto-resume   Provider-native tool loops
Live audio                Yes                   No
Hosted history            Full                  Session-backed write path
Data layer queries        Full                  No
Evals and assessments     Full                  No
Smart routing             Full                  Request-compatible routing

Anthropic Wrapper

Install and use vai-anthropic when you want Anthropic-compatible HTTP, SSE, and websocket behavior on top of Rhone:

typescript
import Anthropic from "vai-anthropic";
import WebSocket from "ws";

const client = new Anthropic({
  apiKey: process.env.VAI_API_KEY,
  baseURL: "https://api.rhone.dev", // route requests through the Rhone gateway
  webSocket: WebSocket, // Node needs an explicit websocket implementation
});

// Standard Messages call over HTTP; Rhone attaches session metadata to the response.
const message = await client.messages.create({
  model: "claude-sonnet-4-6",
  max_tokens: 512,
  messages: [{ role: "user", content: "Summarize this repository." }],
});

// Open a stateful websocket (rhone.ws.v1) and bind it to the session above.
const connection = await client.messages.connect();
await connection.bindSession({ session_id: message.rhone?.session_id });

OpenAI Chat Wrapper

Install and use vai-openai-chat when you want OpenAI Chat Completions-compatible HTTP, SSE, and websocket behavior on top of Rhone:

typescript
import OpenAI from "vai-openai-chat";
import WebSocket from "ws";

const client = new OpenAI({
  apiKey: process.env.VAI_API_KEY,
  baseURL: "https://api.rhone.dev", // route requests through the Rhone gateway
  webSocket: WebSocket, // Node needs an explicit websocket implementation
});

// Standard Chat Completions call over HTTP; Rhone attaches session metadata to the response.
const completion = await client.chat.completions.create({
  model: "gpt-5.4-mini",
  messages: [{ role: "user", content: "Summarize this repository." }],
});

// Open a stateful websocket (rhone.ws.v1) and bind it to the session above.
const connection = await client.chat.completions.connect();
await connection.bindSession({ session_id: completion.rhone?.session_id });

OpenAI Responses Wrapper

Install and use vai-openai-responses when you want OpenAI Responses-compatible HTTP, SSE, and websocket behavior on top of Rhone:

typescript
import OpenAI from "vai-openai-responses";
import WebSocket from "ws";

const client = new OpenAI({
  apiKey: process.env.VAI_API_KEY,
  baseURL: "https://api.rhone.dev", // route requests through the Rhone gateway
  webSocket: WebSocket, // Node needs an explicit websocket implementation
});

// Standard Responses call over HTTP; Rhone attaches session metadata to the response.
const response = await client.responses.create({
  model: "gpt-5.4-mini",
  input: "Summarize this repository.",
});

// Open a stateful websocket (rhone.ws.v1) and bind it to the session above.
const connection = await client.responses.connect();
await connection.bindSession({ session_id: response.rhone?.session_id });