Integrations
Connect RANA with your favorite tools and services: Hugging Face, Supabase, Weights & Biases, Sentry, AWS Bedrock, and MCP.
npm install @rana/integrations
Hugging Face
Use any Hugging Face model with RANA
import { HuggingFaceProvider } from '@rana/integrations/huggingface';
const hf = new HuggingFaceProvider({
  apiKey: process.env.HF_API_KEY,
  defaultModel: 'mistralai/Mistral-7B-Instruct-v0.2'
});
// Use as a provider
const result = await hf.chat({
  messages: [{ role: 'user', content: 'Hello!' }]
});
// Use specific models
const embedding = await hf.embed(
  'Your text here',
  { model: 'sentence-transformers/all-MiniLM-L6-v2' }
);
// Use in agents
const agent = new Agent({
  provider: hf,
  model: 'meta-llama/Llama-2-70b-chat-hf'
});
Supabase
Vector storage and RAG with Supabase
import { SupabaseVectorStore } from '@rana/integrations/supabase';
const vectorStore = new SupabaseVectorStore({
  supabaseUrl: process.env.SUPABASE_URL,
  supabaseKey: process.env.SUPABASE_KEY,
  tableName: 'documents',
  embeddingColumn: 'embedding'
});
// Store vectors
await vectorStore.upsert([
  {
    id: 'doc-1',
    content: 'Your document text',
    metadata: { source: 'manual' }
  }
]);
// Similarity search
const results = await vectorStore.search(
  'search query',
  { limit: 5, threshold: 0.7 }
);
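The limit and threshold options correspond to top-k selection and a minimum similarity score. A self-contained sketch of that filtering over in-memory vectors, using cosine similarity (the function names here are illustrative, not the RANA API):

```javascript
// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Top-k search with a minimum-similarity cutoff, mirroring
// the { limit, threshold } options above.
function searchVectors(query, docs, { limit, threshold }) {
  return docs
    .map((d) => ({ id: d.id, score: cosine(query, d.embedding) }))
    .filter((r) => r.score >= threshold)
    .sort((x, y) => y.score - x.score)
    .slice(0, limit);
}

const docs = [
  { id: 'a', embedding: [1, 0] },
  { id: 'b', embedding: [0.9, 0.1] },
  { id: 'c', embedding: [0, 1] },
];
// 'a' and 'b' pass the 0.7 threshold; 'c' (orthogonal to the query) does not.
const hits = searchVectors([1, 0], docs, { limit: 5, threshold: 0.7 });
```

In production this filtering runs inside Postgres via pgvector rather than in application code; the sketch only shows the semantics of the two options.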
// Use with RAG
const rag = new RAGPipeline({
  vectorStore,
  retriever: { topK: 5 }
});
Weights & Biases
Experiment tracking and model monitoring
import { WandBIntegration } from '@rana/integrations/wandb';
const wandb = new WandBIntegration({
  apiKey: process.env.WANDB_API_KEY,
  project: 'my-ai-app',
  entity: 'my-team'
});
// Log experiments
await wandb.logRun({
  name: 'prompt-v2',
  config: {
    model: 'gpt-4',
    temperature: 0.7,
    prompt_version: 'v2'
  }
});
// Log metrics
await wandb.logMetrics({
  accuracy: 0.95,
  latency: 1200,
  cost: 0.05
});
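A cost metric like the one above has to be computed from token usage somewhere in your app. A sketch with entirely hypothetical per-1K-token prices (real prices vary by model and change over time):

```javascript
// Hypothetical per-1K-token prices; look up real, current prices per model.
const PRICES = { 'gpt-4': { input: 0.03, output: 0.06 } };

// Cost of one request, given prompt and completion token counts.
function requestCost(model, inputTokens, outputTokens) {
  const p = PRICES[model];
  return (inputTokens / 1000) * p.input + (outputTokens / 1000) * p.output;
}

const cost = requestCost('gpt-4', 1000, 500); // 0.03 + 0.03 = 0.06
```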
// Log artifacts
await wandb.logArtifact({
  name: 'prompts',
  type: 'dataset',
  data: promptTemplates
});
// Compare runs
const comparison = await wandb.compareRuns(['run-1', 'run-2']);
Sentry
Error tracking and performance monitoring
import { SentryIntegration } from '@rana/integrations/sentry';
const sentry = new SentryIntegration({
  dsn: process.env.SENTRY_DSN,
  environment: 'production',
  tracesSampleRate: 1.0
});
// Automatic error capture
sentry.captureException(error, {
  tags: { model: 'gpt-4', feature: 'chat' },
  extra: { prompt: userMessage }
});
// Transaction tracing
const transaction = sentry.startTransaction({
  name: 'AI Request',
  op: 'ai.chat'
});
const span = transaction.startChild({
  op: 'ai.completion',
  description: 'OpenAI completion'
});
// ... do work ...
span.finish();
transaction.finish();
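Conceptually, transactions and spans are nested timers: a child span finishes before its parent, and each records a duration. A minimal sketch with illustrative names, independent of the Sentry SDK:

```javascript
// A nested-timer sketch of the transaction/span pattern: finish()
// returns the span's name and elapsed wall-clock time.
function startSpan(name) {
  const start = Date.now();
  return {
    name,
    finish() {
      return { name, durationMs: Date.now() - start };
    },
  };
}

const reqSpan = startSpan('AI Request');       // the "transaction"
const opSpan = startSpan('OpenAI completion'); // a child span
// ... do work ...
const op = opSpan.finish();
const req = reqSpan.finish(); // parent always outlasts its children
```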
// Use as middleware
agent.use(sentry.middleware());
AWS Bedrock
Use AWS Bedrock models with RANA
import { BedrockProvider } from '@rana/integrations/aws-bedrock';
const bedrock = new BedrockProvider({
  region: 'us-east-1',
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
  }
});
// Use Claude on Bedrock
const result = await bedrock.chat({
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
  messages: [{ role: 'user', content: 'Hello!' }]
});
// Use Titan embeddings
const embedding = await bedrock.embed(
  'Your text here',
  { model: 'amazon.titan-embed-text-v1' }
);
// Streaming
const stream = bedrock.stream({
  model: 'anthropic.claude-3-haiku-20240307-v1:0',
  messages: [{ role: 'user', content: 'Tell me a story' }]
});
for await (const chunk of stream) {
  process.stdout.write(chunk.content);
}
MCP (Model Context Protocol)
Connect to any MCP-compatible tool server
import { MCPClient, MCPServer } from '@rana/integrations/mcp';
// Connect to MCP server
const client = new MCPClient({
  serverUrl: 'http://localhost:3001',
  capabilities: ['tools', 'resources']
});
// List available tools
const tools = await client.listTools();
// Call a tool
const result = await client.callTool('search', {
  query: 'latest news'
});
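Under the hood, a tool server is essentially a registry mapping tool names to handlers. A self-contained in-memory sketch that mirrors the listTools/callTool shape above (illustrative only; the real MCP wire protocol is JSON-RPC over a transport, not direct function calls):

```javascript
// Minimal in-memory tool registry, loosely mirroring listTools/callTool.
const registry = new Map();

function addTool(name, description, handler) {
  registry.set(name, { name, description, handler });
}

// Advertise tools without exposing their handlers.
function listTools() {
  return [...registry.values()].map(({ name, description }) => ({ name, description }));
}

// Dispatch a call to the named tool's handler.
function callTool(name, args) {
  const tool = registry.get(name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.handler(args);
}

addTool('search', 'Search a tiny corpus', ({ query }) =>
  ['latest news', 'old news'].filter((doc) => doc.includes(query))
);

const tools = listTools();
const searchResult = callTool('search', { query: 'latest' }); // → ['latest news']
```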
// Create your own MCP server
const server = new MCPServer({
  name: 'my-tools',
  version: '1.0.0'
});
server.addTool({
  name: 'calculate',
  description: 'Perform calculations',
  inputSchema: { expression: { type: 'string' } },
  handler: async ({ expression }) => {
    // Demo only: eval executes arbitrary caller input. Never do this
    // with untrusted callers; use a restricted expression evaluator.
    return eval(expression);
  }
});
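For a calculator tool, a restricted arithmetic evaluator is a safer drop-in for eval. A sketch of one that accepts only numbers, + - * /, and parentheses, via a small recursive-descent parser (illustrative, not part of RANA):

```javascript
// Evaluate a plain arithmetic expression without eval().
// Grammar: expr := term (('+'|'-') term)*
//          term := factor (('*'|'/') factor)*
//          factor := number | '(' expr ')' | '-' factor
function calculate(expression) {
  const src = expression.replace(/\s+/g, '');
  let pos = 0;

  function parseExpr() {
    let value = parseTerm();
    while (src[pos] === '+' || src[pos] === '-') {
      const op = src[pos++];
      const rhs = parseTerm();
      value = op === '+' ? value + rhs : value - rhs;
    }
    return value;
  }

  function parseTerm() {
    let value = parseFactor();
    while (src[pos] === '*' || src[pos] === '/') {
      const op = src[pos++];
      const rhs = parseFactor();
      value = op === '*' ? value * rhs : value / rhs;
    }
    return value;
  }

  function parseFactor() {
    if (src[pos] === '(') {
      pos++; // consume '('
      const value = parseExpr();
      if (src[pos++] !== ')') throw new Error('Expected )');
      return value;
    }
    if (src[pos] === '-') { pos++; return -parseFactor(); }
    const match = /^\d+(\.\d+)?/.exec(src.slice(pos));
    if (!match) throw new Error(`Unexpected input at position ${pos}`);
    pos += match[0].length;
    return parseFloat(match[0]);
  }

  const result = parseExpr();
  if (pos !== src.length) throw new Error('Trailing input');
  return result;
}
```

Anything outside the grammar (identifiers, function calls, assignment) throws instead of executing, which is the point.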
server.addResource({
  name: 'config',
  description: 'Application configuration',
  handler: async () => {
    return { theme: 'dark', language: 'en' };
  }
});
await server.start(3001);
All Available Integrations
Hugging Face
Supabase
Weights & Biases
Sentry
AWS Bedrock
Azure OpenAI
Google Vertex
MCP Protocol
LangSmith
Datadog
New Relic
Grafana