Byoky Docs
Everything you need to integrate Byoky into your app — from quickstart to API reference.
Overview
Byoky lets users store their AI API keys in an encrypted wallet. Your app never sees the keys — it gets a proxied session that routes requests through the wallet.
How it works
Your App → SDK (createFetch) → Content Script → Extension → LLM API
                                                    ↑
                                          Keys stay here. Always.

Two lines changed. Full API compatibility. Streaming, file uploads, and vision all work. Sessions auto-reconnect if the extension restarts.
Installation
Install the SDK
npm install @byoky/sdk
Scaffold a new project
npx create-byoky-app my-app
# Choose a template:
# 1. AI Chat (Next.js)
# 2. Multi-Provider (Vite)
# 3. Backend Relay (Express)
User wallets
Your users need one of these installed:
- Chrome Extension
- Firefox Extension
- iOS App (wallet + Safari extension)
- Android App (pair via QR or relay)
Quickstart
Connect and make your first request in under a minute:
import Anthropic from '@anthropic-ai/sdk';
import { Byoky } from '@byoky/sdk';
const byoky = new Byoky();
const session = await byoky.connect({
providers: [{ id: 'anthropic', required: true }],
modal: true, // shows built-in connect UI with QR code
});
// Use the native Anthropic SDK — just swap in Byoky's fetch
const client = new Anthropic({
apiKey: session.sessionKey,
fetch: session.createFetch('anthropic'),
});
const message = await client.messages.create({
model: 'claude-sonnet-4-20250514',
max_tokens: 1024,
messages: [{ role: 'user', content: 'Hello!' }],
});
That's it. Full API compatibility — streaming, file uploads, and vision all work unchanged.
Byoky Client
Constructor
import { Byoky } from '@byoky/sdk';
const byoky = new Byoky({
timeout: 60000, // connection timeout (ms)
relayUrl: 'wss://relay.byoky.com', // relay server for mobile pairing
});
byoky.connect(options)
Connect to a Byoky wallet. Returns a ByokySession.
const session = await byoky.connect({
// Which providers your app needs
providers: [
{ id: 'anthropic', required: true },
{ id: 'openai', required: false },
],
// Show built-in modal with extension detection + QR code fallback
modal: true,
// Or handle pairing yourself
onPairingReady: (code) => showQR(code),
// Skip extension, go straight to relay (mobile)
useRelay: true,
});
- providers: ProviderRequirement[]. required: true means connection fails if the user doesn't have that provider.
- modal: boolean | ModalOptions
- onPairingReady: (code: string) => void
- useRelay: boolean
byoky.tryReconnect()
Silently reconnect to an existing session. Checks persisted vault sessions, extension live sessions, and stored extension sessions in order. Returns null if nothing is restorable.
const session = await byoky.tryReconnect();
if (session) {
// Restored — ready to make requests
}
byoky.connectViaVault(options)
Connect via a Byoky Vault server. Works in both browser and Node.js environments.
const session = await byoky.connectViaVault({
vaultUrl: 'https://vault.byoky.com',
username: 'user@example.com',
password: 'password',
providers: [{ id: 'anthropic' }],
appOrigin: 'https://myapp.com', // required in Node.js
});
Utilities
import { isExtensionInstalled, getStoreUrl } from '@byoky/sdk';
// Check if the Byoky extension is installed
if (isExtensionInstalled()) { ... }
// Get the store URL for the user's browser
const url = getStoreUrl(); // Chrome Web Store, Firefox Add-ons, etc.
Session API
A ByokySession is returned by connect(), tryReconnect(), or connectViaVault(). It provides everything you need to make API calls through the wallet.
session.createFetch(providerId)
Returns a fetch function that proxies requests through the wallet for the given provider. Use it as a drop-in replacement with any provider SDK.
// Anthropic
const client = new Anthropic({
apiKey: session.sessionKey,
fetch: session.createFetch('anthropic'),
});
// OpenAI
const client = new OpenAI({
apiKey: session.sessionKey,
fetch: session.createFetch('openai'),
});
// Or raw fetch
const fetch = session.createFetch('anthropic');
const res = await fetch('https://api.anthropic.com/v1/messages', {
method: 'POST',
headers: { 'content-type': 'application/json', 'anthropic-version': '2023-06-01' },
body: JSON.stringify({ model: 'claude-sonnet-4-20250514', max_tokens: 1024, messages: [...] }),
});
session.createRelay(wsUrl)
Open a WebSocket relay channel so a backend server can make LLM calls through this session. See Backend Relay.
session.disconnect()
Disconnect the session. The wallet revokes all access.
session.isConnected()
Returns true if the session is still valid.
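Together with tryReconnect(), this supports a guard pattern before making requests. A minimal sketch using only the calls documented here; ensureSession and the structural interfaces are hypothetical helpers, not part of the SDK:

```typescript
// Minimal structural types so the helper depends only on the
// documented method names, not on SDK internals.
interface SessionLike {
  isConnected(): boolean;
}
interface ByokyLike<S extends SessionLike> {
  tryReconnect(): Promise<S | null>;
  connect(opts: unknown): Promise<S>;
}

// Reuse a live session, fall back to a silent reconnect, and only
// then open the interactive connect flow.
async function ensureSession<S extends SessionLike>(
  byoky: ByokyLike<S>,
  current: S | null,
  connectOpts: unknown,
): Promise<S> {
  if (current?.isConnected()) return current; // still valid
  const restored = await byoky.tryReconnect(); // silent restore
  if (restored) return restored;
  return byoky.connect(connectOpts); // interactive fallback
}
```

Call it before each batch of requests, passing the same options object you would give connect().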
session.getUsage()
Get token usage stats for this session.
interface SessionUsage {
requests: number;
inputTokens: number;
outputTokens: number;
byProvider: Record<string, {
requests: number;
inputTokens: number;
outputTokens: number;
}>;
}
const usage = await session.getUsage();
// { requests: 42, inputTokens: 15000, outputTokens: 8000,
// byProvider: { anthropic: { requests: 42, inputTokens: 15000, outputTokens: 8000 } } }
session.onDisconnect(callback)
Register a callback for when the user revokes this session from the wallet.
session.onProvidersUpdated(callback)
Register a callback for when provider availability changes — e.g. the user adds a credential, revokes one, or swaps the provider group bound to your app (cross-provider routing). The callback receives the new session.providers record.
Session properties
session.sessionKey // string — use as apiKey in provider SDKs
session.proxyUrl // string — the proxy endpoint URL
session.providers // Record<ProviderId, ProviderStatus>
interface ProviderStatus {
// true: the wallet has a working credential (or gift) for this provider
// and will hit the provider directly.
// false: your app can still call createFetch(id) — the wallet may route
// it through another provider via cross-provider translation.
available: boolean;
// How the credential authenticates upstream.
authMethod: 'api_key' | 'oauth';
// Present and true when the credential came from a redeemed Token Gift.
// The gifter's wallet proxies every request and enforces the token budget.
gift?: boolean;
}
Check providers[id].available before assuming direct access. A provider marked available: false may still work if the user has set up cross-provider routing. See Cross-Provider Routing.
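That check can be folded into a small helper. This is a sketch: pickProvider is a hypothetical name, and the preference order (direct credential first, then anything the wallet might cross-route) is one reasonable policy, not SDK behavior.

```typescript
// Hypothetical helper: choose which provider id to hand to createFetch(),
// preferring a direct credential over a possibly cross-routed one.
type ProviderStatus = {
  available: boolean;
  authMethod: 'api_key' | 'oauth';
  gift?: boolean;
};

function pickProvider(
  providers: Record<string, ProviderStatus>,
  preferred: string[],
): string | null {
  // First pass: a provider the wallet can hit directly.
  for (const id of preferred) {
    if (providers[id]?.available) return id;
  }
  // Second pass: present but available: false. It may still work via
  // cross-provider routing, so it beats giving up entirely.
  for (const id of preferred) {
    if (id in providers) return id;
  }
  return null;
}
```

Pass session.providers as the first argument and your model picker's preference order as the second.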
Providers
All providers work with createFetch(providerId):
- anthropic: Anthropic (Claude)
- openai: OpenAI (GPT)
- gemini: Google Gemini
- mistral: Mistral
- cohere: Cohere
- xai: xAI (Grok)
- deepseek: DeepSeek
- perplexity: Perplexity
- groq: Groq
- together: Together AI
- fireworks: Fireworks AI
- openrouter: OpenRouter
- azure_openai: Azure OpenAI
Streaming
Every provider's streaming format works unchanged through createFetch. The proxy forwards response chunks over a persistent port — no buffering, no polling, no special flags on your end.
With a provider SDK
The easiest path — the SDK handles SSE parsing for you:
import Anthropic from '@anthropic-ai/sdk';
const client = new Anthropic({
apiKey: session.sessionKey,
fetch: session.createFetch('anthropic'),
});
const stream = client.messages.stream({
model: 'claude-sonnet-4-20250514',
max_tokens: 1024,
messages: [{ role: 'user', content: 'Write a haiku.' }],
});
for await (const event of stream) {
if (event.type === 'content_block_delta'
&& event.delta.type === 'text_delta') {
process.stdout.write(event.delta.text);
}
}
With raw fetch
If you prefer to call the HTTP API directly, parse SSE from the returned response.body:
const fetch = session.createFetch('anthropic');
const res = await fetch('https://api.anthropic.com/v1/messages', {
method: 'POST',
headers: {
'content-type': 'application/json',
'anthropic-version': '2023-06-01',
},
body: JSON.stringify({
model: 'claude-sonnet-4-20250514',
max_tokens: 1024,
stream: true,
messages: [{ role: 'user', content: 'Hello!' }],
}),
});
const reader = res.body!.getReader();
const decoder = new TextDecoder();
let buf = '';
while (true) {
const { done, value } = await reader.read();
if (done) break;
buf += decoder.decode(value, { stream: true });
const lines = buf.split('\n');
buf = lines.pop() || '';
for (const line of lines) {
if (!line.startsWith('data: ')) continue;
const data = line.slice(6);
const event = JSON.parse(data);
// Anthropic has no [DONE] sentinel; it ends with a message_stop event,
// and the reader's done flag terminates the loop above.
if (event.type === 'content_block_delta'
&& event.delta.type === 'text_delta') {
process.stdout.write(event.delta.text);
}
}
}
OpenAI-compatible providers (OpenAI, Groq, DeepSeek, xAI, Mistral, Together, Fireworks, Perplexity, OpenRouter) stream choices[0].delta.content in the same SSE envelope. Gemini uses streamGenerateContent.
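For those OpenAI-compatible streams, the delta extraction can be factored into a reusable parser. This is a sketch assuming OpenAI's public SSE envelope; extractDeltas and streamChat are hypothetical helpers, and byokyFetch stands in for session.createFetch('openai').

```typescript
// Hypothetical helper: pull text deltas out of buffered SSE data,
// assuming the OpenAI-style envelope { choices: [{ delta: { content } }] }.
// Returns the unconsumed tail so partial lines carry over between chunks.
function extractDeltas(buf: string): { deltas: string[]; rest: string } {
  const lines = buf.split('\n');
  const rest = lines.pop() ?? '';
  const deltas: string[] = [];
  for (const line of lines) {
    if (!line.startsWith('data: ')) continue;
    const data = line.slice(6);
    if (data === '[DONE]') continue; // OpenAI-style end-of-stream sentinel
    const delta = JSON.parse(data).choices?.[0]?.delta?.content;
    if (typeof delta === 'string') deltas.push(delta);
  }
  return { deltas, rest };
}

// Usage with the proxied fetch (remember stream: true in the body):
async function streamChat(byokyFetch: typeof fetch, body: unknown): Promise<void> {
  const res = await byokyFetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify(body),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buf = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buf += decoder.decode(value, { stream: true });
    const out = extractDeltas(buf);
    buf = out.rest;
    for (const d of out.deltas) process.stdout.write(d);
  }
}
```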
Tool Use
Tool use (a.k.a. function calling) works unchanged through the proxy. Define tools, let the model call them, execute locally, feed results back — loop until the model stops asking for tools.
Anthropic format
const fetch = session.createFetch('anthropic');
const tools = [{
name: 'get_weather',
description: 'Get current weather for a city',
input_schema: {
type: 'object',
properties: { city: { type: 'string' } },
required: ['city'],
},
}];
const messages: Array<Record<string, unknown>> = [
{ role: 'user', content: "What's the weather in Tokyo?" },
];
for (let round = 0; round < 5; round++) {
const res = await fetch('https://api.anthropic.com/v1/messages', {
method: 'POST',
headers: {
'content-type': 'application/json',
'anthropic-version': '2023-06-01',
},
body: JSON.stringify({
model: 'claude-sonnet-4-20250514',
max_tokens: 1024,
tools,
messages,
}),
});
const data = await res.json();
const toolCalls = data.content.filter((b: any) => b.type === 'tool_use');
if (toolCalls.length === 0) {
console.log(data.content.find((b: any) => b.type === 'text')?.text);
break;
}
const results = toolCalls.map((tc: any) => ({
type: 'tool_result',
tool_use_id: tc.id,
content: JSON.stringify(runTool(tc.name, tc.input)),
}));
messages.push({ role: 'assistant', content: data.content });
messages.push({ role: 'user', content: results });
}
OpenAI-compatible format
Used by OpenAI, Groq, DeepSeek, xAI, Mistral, Together, Fireworks, Perplexity, and OpenRouter. Tools are wrapped in { type: 'function', function: { ... } }, and the model returns choices[0].message.tool_calls:
const fetch = session.createFetch('openai');
const tools = [{
type: 'function',
function: {
name: 'get_weather',
description: 'Get current weather for a city',
parameters: {
type: 'object',
properties: { city: { type: 'string' } },
required: ['city'],
},
},
}];
const messages: Array<Record<string, unknown>> = [
{ role: 'user', content: "What's the weather in Tokyo?" },
];
for (let round = 0; round < 5; round++) {
const res = await fetch('https://api.openai.com/v1/chat/completions', {
method: 'POST',
headers: { 'content-type': 'application/json' },
body: JSON.stringify({ model: 'gpt-4o', tools, messages }),
});
const data = await res.json();
const msg = data.choices[0].message;
if (!msg.tool_calls?.length) { console.log(msg.content); break; }
messages.push(msg);
for (const tc of msg.tool_calls) {
const args = JSON.parse(tc.function.arguments);
messages.push({
role: 'tool',
tool_call_id: tc.id,
content: JSON.stringify(runTool(tc.function.name, args)),
});
}
}
Structured Output
Get typed JSON back from any OpenAI-compatible provider, plus Anthropic. Two modes exist: OpenAI's strict json_schema (enforced by the model), and the looser json_object mode supported by most OpenAI-compatible providers.
OpenAI strict schema
const fetch = session.createFetch('openai');
const res = await fetch('https://api.openai.com/v1/chat/completions', {
method: 'POST',
headers: { 'content-type': 'application/json' },
body: JSON.stringify({
model: 'gpt-4o',
messages: [{ role: 'user', content: 'Extract: "Jane, jane@acme.co, Acme"' }],
response_format: {
type: 'json_schema',
json_schema: {
name: 'contact',
strict: true,
schema: {
type: 'object',
properties: {
name: { type: 'string' },
email: { type: 'string' },
company: { type: 'string' },
},
required: ['name', 'email', 'company'],
additionalProperties: false,
},
},
},
}),
});
const data = await res.json();
const contact = JSON.parse(data.choices[0].message.content);
json_object (Groq, DeepSeek, Mistral, Together, Fireworks, OpenRouter, xAI)
body: JSON.stringify({
model: 'llama-3.3-70b-versatile',
messages: [{ role: 'user', content: 'Return JSON with keys name, email.' }],
response_format: { type: 'json_object' },
});
Anthropic
Claude doesn't have a response_format field. Prompt it to return JSON and parse the text block — or use tool use with a single tool as the forced schema:
const res = await fetch('https://api.anthropic.com/v1/messages', {
method: 'POST',
headers: {
'content-type': 'application/json',
'anthropic-version': '2023-06-01',
},
body: JSON.stringify({
model: 'claude-sonnet-4-20250514',
max_tokens: 1024,
messages: [{
role: 'user',
content: 'Return ONLY JSON: { "name": "...", "email": "..." } for: "Jane, jane@acme.co"',
}],
}),
});
const data = await res.json();
const json = JSON.parse(data.content[0].text.match(/\{[\s\S]*\}/)![0]);
Vision
Image inputs work through the proxy just like text. Anthropic, OpenAI, and Gemini each take a different wire format — the payload pattern below matches what ships in the demo.
Convert a File to base64
async function fileToBase64(file: File): Promise<string> {
const buffer = await file.arrayBuffer();
const bytes = new Uint8Array(buffer);
let binary = '';
for (let i = 0; i < bytes.length; i++) binary += String.fromCharCode(bytes[i]);
return btoa(binary);
}
Anthropic
body: JSON.stringify({
model: 'claude-sonnet-4-20250514',
max_tokens: 1024,
messages: [{
role: 'user',
content: [
{
type: 'image',
source: { type: 'base64', media_type: file.type, data: base64 },
},
{ type: 'text', text: 'What is in this image?' },
],
}],
});
OpenAI
body: JSON.stringify({
model: 'gpt-4o',
messages: [{
role: 'user',
content: [
{
type: 'image_url',
image_url: { url: `data:${file.type};base64,${base64}` },
},
{ type: 'text', text: 'What is in this image?' },
],
}],
});
Gemini
body: JSON.stringify({
contents: [{
role: 'user',
parts: [
{ inline_data: { mime_type: file.type, data: base64 } },
{ text: 'What is in this image?' },
],
}],
});
Error Handling
Errors from upstream providers surface with their original HTTP status and body — so response.status and the usual { error: { message } } body shape work the same as hitting the provider directly.
The proxy layer adds its own error codes on top, signalled with an HTTP status and an error.code field in the JSON body:
- WALLET_NOT_INSTALLED: Extension/app not detected during connect()
- USER_REJECTED: User dismissed the connect modal
- PROVIDER_UNAVAILABLE: No credential and no routing group for this provider
- SESSION_EXPIRED: Session was revoked or timed out — call connect() again
- RATE_LIMITED: Upstream provider rate limit (HTTP 429)
- QUOTA_EXCEEDED: Gift budget or wallet-imposed limit hit (HTTP 429)
- INVALID_KEY: Stored credential rejected by provider
- TOKEN_EXPIRED: OAuth token expired and refresh failed
- PROXY_ERROR: Generic proxy failure — retryable
- RELAY_CONNECTION_FAILED: Backend relay could not reach the browser
- RELAY_DISCONNECTED: Relay peer disconnected mid-request
Handling quota errors
When a user redeems a Token Gift with a limited budget, or the wallet enforces per-session limits, requests fail with HTTP 429 and code: 'QUOTA_EXCEEDED'. Surface this to the user rather than retrying:
const fetch = session.createFetch('anthropic');
const res = await fetch(url, { method: 'POST', headers, body });
if (!res.ok) {
const body = await res.json().catch(() => null);
const code = body?.error?.code;
if (res.status === 429 && code === 'QUOTA_EXCEEDED') {
showQuotaExhaustedUI();
return;
}
if (code === 'SESSION_EXPIRED') {
await byoky.connect({ providers: [...], modal: true });
return;
}
throw new Error(body?.error?.message ?? `HTTP ${res.status}`);
}
Listening for session lifecycle
session.onDisconnect(() => {
// The user revoked access from the wallet, or the session expired.
// Prompt them to reconnect before the next request.
showReconnectBanner();
});
session.onProvidersUpdated((providers) => {
// A credential was added/removed, or the user changed routing.
// Refresh your UI's model picker.
setAvailable(Object.entries(providers)
.filter(([, v]) => v.available)
.map(([id]) => id));
});
Backend Relay
Need LLM calls from your server? The user's browser relays requests through the extension — your backend never sees the API key.
Backend ←WebSocket→ User's Frontend ←Extension→ LLM API
Frontend
import { Byoky } from '@byoky/sdk';
const session = await new Byoky().connect({
providers: [{ id: 'anthropic' }],
modal: true,
});
// Open relay so your backend can make calls through this session
const relay = session.createRelay('wss://your-app.com/ws/relay');
Backend (Node.js)
import { ByokyServer } from '@byoky/sdk/server';
const byoky = new ByokyServer();
wss.on('connection', async (ws) => {
const client = await byoky.handleConnection(ws);
const fetch = client.createFetch('anthropic');
const res = await fetch('https://api.anthropic.com/v1/messages', {
method: 'POST',
headers: {
'content-type': 'application/json',
'anthropic-version': '2023-06-01',
},
body: JSON.stringify({
model: 'claude-sonnet-4-20250514',
max_tokens: 1024,
messages: [{ role: 'user', content: 'Hello!' }],
}),
});
});
Bridge (CLI / Desktop)
CLI tools and desktop apps route API calls through the bridge — a local HTTP proxy that relays requests to the extension via native messaging.
CLI App → HTTP → Bridge (localhost:19280) → Native Messaging → Extension → LLM API
Setup
npm install -g @byoky/bridge
byoky-bridge install # register native messaging host
Usage
Once installed, the bridge starts automatically when the extension needs it. CLI tools (like OpenClaw) make HTTP requests to http://127.0.0.1:19280/{provider}/, which the bridge forwards to the extension.
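Assuming the {provider}/ prefix pattern described above, a request to the bridge looks like a normal provider call with the host swapped. bridgeUrl and bridgeCall are hypothetical helpers, not part of any shipped package.

```typescript
// Hypothetical helpers: the bridge exposes each provider under a
// {provider}/ path prefix and forwards the rest of the URL unchanged.
const BRIDGE = 'http://127.0.0.1:19280';

function bridgeUrl(provider: string, path: string): string {
  return `${BRIDGE}/${provider}/${path.replace(/^\//, '')}`;
}

// e.g. send the usual Anthropic request body, but to the bridge instead
// of api.anthropic.com; the extension supplies the credentials.
async function bridgeCall(provider: string, path: string, body: unknown) {
  const res = await fetch(bridgeUrl(provider, path), {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify(body),
  });
  return res.json();
}
```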
Token Gifts
Share token access without sharing your API key. The sender's wallet proxies all requests — the key never leaves the extension.
Sender's Extension ←WebSocket→ Relay Server ←WebSocket→ Recipient's Extension
Create a gift
- Open the wallet → select a credential → click "Gift"
- Set a token budget and expiry
- Share the generated gift link
Redeem a gift
- Open the wallet → click "Redeem Gift"
- Paste the gift link → accept
Self-host the relay
npm install -g @byoky/relay
byoky-relay # default port 8787
The recipient never receives your API key. Every request is relayed through the sender's running extension, which enforces the token budget and can revoke access at any time.
Token Pool
The Token Pool is a public board where users share free token gifts with the community.
How it works
- Create a gift in your wallet (extension or mobile)
- Check "List on Token Pool"
- Add a display name (or stay anonymous)
- Your gift appears on the token pool for anyone to redeem
What users see
- Online/offline status — green dot if the gifter's wallet is online (gift is usable), red if offline
- Tokens remaining — progress bar showing how much budget is left
- Expiry countdown — time until the gift expires
- Provider — which LLM provider the tokens are for
API endpoints
The marketplace runs at marketplace.byoky.com with these endpoints:
GET /gifts — list active + expired gifts
GET /gifts/:id/redeem — get gift link for redemption
POST /gifts — list a gift publicly (called by wallet)
DELETE /gifts/:id — unlist a gift
PATCH /gifts/:id/usage — update token usage
POST /gifts/:id/heartbeat — online status ping
Cross-Provider Routing
Users can route your app's requests through a different provider than what your code targets. For example, your app calls anthropic but the user routes it through openai — the wallet transparently translates request/response bodies and SSE streams.
Your App (Anthropic SDK) → Wallet (translates) → OpenAI API
↕
Anthropic ↔ OpenAI ↔ Gemini ↔ Cohere
How it works
- User creates groups in their wallet (e.g. "Claude", "GPT")
- Each group is pinned to a specific credential and provider
- Dragging an app between groups reroutes its traffic
- Request bodies, response bodies, and SSE streams are translated on the fly
No code changes required. Your app keeps calling its preferred SDK; the wallet handles the translation. Live sessions reroute automatically.
App Ecosystem
Build apps that users install directly into their Byoky wallet. Your app runs inside a sandboxed iframe (extension) or WebView (mobile) — full isolation from the wallet's keys and storage.
How marketplace apps work
- You build a web app that uses @byoky/sdk
- You host it on your own infrastructure (HTTPS required)
- You submit it to the marketplace for review
- Once approved, users can install it from the App Store inside their wallet
- Your app runs in a sandboxed environment — keys never touch your code
Security model
- Apps run in sandboxed iframes (allow-scripts allow-forms) or native WebViews
- Cross-origin isolation prevents access to wallet storage, DOM, or keys
- All communication happens via the SDK's postMessage bridge
- Installing an app auto-trusts its origin for the declared providers
- Users can disable or uninstall apps at any time
Hosting requirements
Because your app loads inside an iframe in the Byoky extension, your server must allow iframe embedding. Do not set X-Frame-Options: DENY or SAMEORIGIN, and either omit Content-Security-Policy frame-ancestors or set it to something permissive:
Content-Security-Policy: frame-ancestors *
We verify this automatically at submission time and reject apps that would fail to load.
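As a sketch, assuming you serve the app with Node's built-in http module (adapt for your framework or CDN), the embedding headers look like this:

```typescript
import { createServer } from 'node:http';

// The one header that matters for embedding. Note what is absent:
// no X-Frame-Options at all, and no restrictive frame-ancestors.
function embedHeaders(): Record<string, string> {
  return { 'Content-Security-Policy': 'frame-ancestors *' };
}

const server = createServer((_req, res) => {
  for (const [name, value] of Object.entries(embedHeaders())) {
    res.setHeader(name, value);
  }
  res.end('<!doctype html><title>My Byoky app</title>');
});
// server.listen(...) behind your TLS terminator; HTTPS is required.
```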
App Manifest
Every marketplace app needs a byoky.app.json manifest in the project root. Run npx create-byoky-app init to generate one interactively.
{
"name": "TradeBot Pro",
"slug": "tradebot-pro",
"url": "https://tradebot.acme-ai.com",
"icon": "/icon.png",
"description": "AI-powered trading signals using your own API keys",
"category": "trading",
"providers": ["anthropic", "openai"],
"author": {
"name": "Acme AI",
"email": "dev@acme-ai.com",
"website": "https://acme-ai.com"
}
}
Fields
- name (string)
- slug (string)
- url (string)
- icon (string)
- description (string)
- category (string): chat, coding, trading, productivity, research, creative, other.
- providers (string[]): e.g. ["anthropic", "openai"]. Users approve which providers to grant on install.
- author (object): name (required), email (required), website (optional).
Review criteria
- App loads over HTTPS
- App URL allows iframe embedding (no X-Frame-Options: DENY/SAMEORIGIN, no restrictive frame-ancestors)
- Uses @byoky/sdk for all LLM access
- Only requests providers it actually uses
- No obfuscated JavaScript
- Privacy policy exists