Migration Guide
Migrate from LangChain, Vercel AI SDK, or OpenAI SDK to RANA. Side-by-side code comparisons to make migration easy.
Why Migrate to RANA?
| Feature | RANA | Others |
|---|---|---|
| Automatic retries & fallbacks | ✓ | |
| Built-in cost tracking | ✓ | |
| Provider-agnostic API | ✓ | |
| Semantic testing | ✓ | |
| Memory management | ✓ | |
| Prompt versioning | ✓ | |
| RAG with citations | ✓ | |
| OpenTelemetry support | ✓ | |
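To see what "automatic retries & fallbacks" saves you from writing, here is the loop you would otherwise hand-roll around raw SDK calls. This is an illustrative sketch, not RANA source: `callWithFallback` and `Provider` are made-up names for this example only.

```typescript
// Hand-rolled retry-then-fallback: try each provider up to N times,
// then move on to the next one. RANA is described as doing this for you.
type Provider = () => Promise<string>;

async function callWithFallback(
  providers: Provider[],
  retriesPerProvider = 2,
): Promise<string> {
  let lastError: unknown;
  for (const provider of providers) {
    for (let attempt = 1; attempt <= retriesPerProvider; attempt++) {
      try {
        return await provider();
      } catch (err) {
        lastError = err; // retry this provider, then fall through to the next
      }
    }
  }
  throw lastError; // every provider exhausted its retries
}

// Demo: a provider that always fails, then a stable fallback.
const flaky: Provider = async () => { throw new Error('rate limited'); };
const stable: Provider = async () => 'ok';

callWithFallback([flaky, stable]).then((answer) => console.log(answer)); // prints "ok"
```

With an `Agent`, this whole pattern collapses into configuration; the point of the table row is that you stop owning this loop.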
From LangChain to RANA
Chat Models
Before (LangChain)
import { ChatOpenAI } from '@langchain/openai';
const chat = new ChatOpenAI({
modelName: 'gpt-4',
temperature: 0.7,
});
const response = await chat.invoke([
{ role: 'system', content: 'You are a helpful assistant.' },
{ role: 'user', content: 'Hello!' }
]);
console.log(response.content);
After (RANA)
import { Agent } from '@rana/core';
const agent = new Agent({
model: 'gpt-4',
temperature: 0.7,
systemPrompt: 'You are a helpful assistant.'
});
const response = await agent.run('Hello!');
console.log(response);
Chains
Before (LangChain)
import { LLMChain } from 'langchain/chains';
import { PromptTemplate } from '@langchain/core/prompts';
const template = 'Translate to {language}: {text}';
const prompt = new PromptTemplate({
template,
inputVariables: ['language', 'text']
});
const chain = new LLMChain({ llm: chat, prompt });
const result = await chain.call({
language: 'Spanish',
text: 'Hello world'
});
After (RANA)
import { translate } from '@rana/helpers';
const result = await translate('Hello world', {
to: 'Spanish'
});
// Or with PromptManager for complex prompts
import { PromptManager } from '@rana/prompts';
const pm = new PromptManager({ workspace: 'app' });
await pm.register('translate', {
template: 'Translate to {{language}}: {{text}}',
variables: ['language', 'text']
});
const translated = await pm.execute('translate', {
variables: { language: 'Spanish', text: 'Hello world' }
});
RAG
Before (LangChain)
import { OpenAIEmbeddings } from '@langchain/openai';
import { PineconeStore } from '@langchain/pinecone';
import { RetrievalQAChain } from 'langchain/chains';
const embeddings = new OpenAIEmbeddings();
const vectorStore = await PineconeStore.fromExistingIndex(embeddings, {
pineconeIndex: index
});
const chain = RetrievalQAChain.fromLLM(chat, vectorStore.asRetriever());
const response = await chain.call({
query: 'What is the refund policy?'
});
After (RANA)
import { RAGPresets } from '@rana/rag';
const pipeline = RAGPresets.balanced();
// Index documents
await pipeline.index(documents);
// Query with citations
const result = await pipeline.query({
query: 'What is the refund policy?'
});
console.log(result.answer);
console.log(result.citations);
From Vercel AI SDK to RANA
Streaming Chat
Before (Vercel AI SDK)
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
const result = await streamText({
model: openai('gpt-4'),
messages: [
{ role: 'user', content: 'Hello!' }
]
});
for await (const chunk of result.textStream) {
process.stdout.write(chunk);
}
After (RANA)
import { Agent } from '@rana/core';
const agent = new Agent({ model: 'gpt-4' });
for await (const chunk of agent.stream('Hello!')) {
process.stdout.write(chunk.content);
}
// Or with React hook
import { useChat } from '@rana/react';
function Chat() {
const { messages, input, send } = useChat();
// Same API, more features included
}
Tool Calling
Before (Vercel AI SDK)
import { streamText, tool } from 'ai';
import { z } from 'zod';
const result = await streamText({
model: openai('gpt-4'),
tools: {
weather: tool({
description: 'Get the weather',
parameters: z.object({
city: z.string()
}),
execute: async ({ city }) => {
return `Weather in ${city}: 72°F`;
}
})
},
messages: [{ role: 'user', content: 'Weather in NYC?' }]
});
After (RANA)
import { Agent, Tool } from '@rana/core';
const weatherTool = new Tool({
name: 'weather',
description: 'Get the weather',
parameters: {
city: { type: 'string' }
},
handler: async ({ city }) => {
return `Weather in ${city}: 72°F`;
}
});
const agent = new Agent({
model: 'gpt-4',
tools: [weatherTool]
});
const result = await agent.run('Weather in NYC?');
From OpenAI SDK to RANA
Basic Chat
Before (OpenAI SDK)
import OpenAI from 'openai';
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
model: 'gpt-4',
messages: [
{ role: 'system', content: 'You are a helpful assistant.' },
{ role: 'user', content: 'Hello!' }
]
});
console.log(completion.choices[0].message.content);
After (RANA)
import { Agent } from '@rana/core';
const agent = new Agent({
model: 'gpt-4',
systemPrompt: 'You are a helpful assistant.'
});
const result = await agent.run('Hello!');
console.log(result);
// Benefits:
// - Automatic retries & fallbacks
// - Built-in cost tracking
// - Type-safe responses
// - Memory management
Streaming
Before (OpenAI SDK)
const stream = await openai.chat.completions.create({
model: 'gpt-4',
messages: [{ role: 'user', content: 'Hello!' }],
stream: true,
});
for await (const chunk of stream) {
process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
After (RANA)
import { Agent } from '@rana/core';
const agent = new Agent({ model: 'gpt-4' });
for await (const chunk of agent.stream('Hello!')) {
process.stdout.write(chunk.content);
}
// Benefits:
// - Unified streaming API across all providers
// - Automatic error handling
// - Progress events & callbacks
Function Calling
Before (OpenAI SDK)
const completion = await openai.chat.completions.create({
model: 'gpt-4',
messages: [{ role: 'user', content: 'Weather in NYC?' }],
functions: [{
name: 'get_weather',
parameters: {
type: 'object',
properties: {
city: { type: 'string' }
}
}
}],
function_call: 'auto'
});
if (completion.choices[0].message.function_call) {
const args = JSON.parse(completion.choices[0].message.function_call.arguments);
const weather = await getWeather(args.city);
// Need to send another request with the result...
}
After (RANA)
import { Agent, Tool } from '@rana/core';
const agent = new Agent({
model: 'gpt-4',
tools: [
new Tool({
name: 'get_weather',
description: 'Get weather for a city',
parameters: { city: { type: 'string' } },
handler: async ({ city }) => getWeather(city)
})
]
});
// Automatic tool execution loop!
const result = await agent.run('Weather in NYC?');
// Benefits:
// - Automatic tool execution
// - Multi-turn tool usage
// - Type-safe tool definitions
Quick Migration Steps
1. Install RANA packages
   npm install @rana/core @rana/helpers @rana/prompts @rana/rag
2. Configure your API keys
   # .env
   ANTHROPIC_API_KEY=sk-ant-...
   OPENAI_API_KEY=sk-...
3. Update imports and code
   Replace imports and update code following the examples above
4. Run tests
   npm test
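The keys from step 2 are easy to forget in CI, and a missing key surfaces as confusing provider errors mid-test. One option is a tiny preflight check before running the suite; `missingKeys` below is a local helper sketched for this guide, not a RANA export.

```typescript
// Report which required environment variables are unset or empty.
function missingKeys(
  env: Record<string, string | undefined>,
  required: string[],
): string[] {
  return required.filter((key) => !env[key]);
}

// Example: with only OPENAI_API_KEY set, the Anthropic key is reported missing.
const missing = missingKeys(
  { OPENAI_API_KEY: 'sk-...' },
  ['ANTHROPIC_API_KEY', 'OPENAI_API_KEY'],
);
console.log(missing); // prints [ 'ANTHROPIC_API_KEY' ]
```

In a real preflight script you would pass `process.env` instead of a literal object and `process.exit(1)` when the returned array is non-empty, so `npm test` fails fast with a clear message.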
Need help with your migration?
Join our Discord