Your First RANA Project
Let's build a complete AI chat application from scratch. We'll create both the backend API and the React frontend to understand how all the pieces fit together.
What We're Building
A simple but production-ready chat application with:
- Streaming responses for instant feedback
- Conversation history management
- Error handling and loading states
- A clean, responsive UI
Step 1: Create the Agent
First, let's define our agent configuration:
// lib/agent.ts
import { Agent } from '@rana/core';
export const chatAgent = new Agent({
  name: 'ChatAssistant',
  model: 'claude-sonnet-4-20250514',
  systemPrompt: `You are a helpful, friendly assistant.
Be concise but thorough in your responses.
If you don't know something, say so honestly.`,
  temperature: 0.7,
  maxTokens: 1000
});
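The constructor options map directly onto the underlying model call: systemPrompt sets the assistant's behavior, temperature controls how varied the output is (lower values are more focused and deterministic), and maxTokens caps the response length. The same constructor works for any additional agents you need later; as a purely illustrative sketch (this hypothetical agent is not used elsewhere in the lesson):
// lib/agents/brainstormer.ts (hypothetical file, for illustration only)
import { Agent } from '@rana/core';

export const brainstormAgent = new Agent({
  name: 'Brainstormer',
  model: 'claude-sonnet-4-20250514',
  systemPrompt: 'You generate many short, varied ideas. Prefer breadth over depth.',
  // Higher temperature for more varied output; assumes the same 0-1 scale as above.
  temperature: 0.9,
  maxTokens: 500
});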
Step 2: Create the API Route
Next, create the API endpoint that handles chat requests:
// app/api/chat/route.ts
import { chatAgent } from '@/lib/agent';
import { streamResponse } from '@rana/helpers';
export async function POST(request: Request) {
  try {
    const { messages } = await request.json();

    // Validate input
    if (!messages || !Array.isArray(messages)) {
      return Response.json(
        { error: 'Messages array is required' },
        { status: 400 }
      );
    }

    // Stream the response
    const stream = chatAgent.stream(messages);
    return streamResponse(stream);
  } catch (error) {
    console.error('Chat error:', error);
    return Response.json(
      { error: 'Failed to process chat request' },
      { status: 500 }
    );
  }
}
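The Array.isArray check above catches the most common client mistake. If you want stricter validation, you could also check the shape of each message before handing the array to the agent. A minimal sketch, assuming the role/content shape used by the Chat component in Step 3 (the helper name and file are ours, not part of RANA):
// app/api/chat/validate.ts (hypothetical helper)
type ChatMessage = { role: 'user' | 'assistant' | 'system'; content: string };

export function isValidMessage(value: unknown): value is ChatMessage {
  if (typeof value !== 'object' || value === null) return false;
  const { role, content } = value as Record<string, unknown>;
  return (
    (role === 'user' || role === 'assistant' || role === 'system') &&
    typeof content === 'string' &&
    content.length > 0
  );
}
In the route, you would then return the same 400 response when !messages.every(isValidMessage).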
Step 3: Create the Chat Component
Now let's build the React component:
// components/Chat.tsx
'use client';
import { useChat } from '@rana/react';
import { useState } from 'react';
export function Chat() {
  const {
    messages,
    input,
    setInput,
    send,
    isLoading,
    error,
    stop
  } = useChat({
    api: '/api/chat',
    onError: (error) => {
      console.error('Chat error:', error);
    }
  });

  const handleSubmit = (e: React.FormEvent) => {
    e.preventDefault();
    if (input.trim() && !isLoading) {
      send();
    }
  };

  return (
    <div className="flex flex-col h-[600px] max-w-2xl mx-auto">
      {/* Messages */}
      <div className="flex-1 overflow-auto p-4 space-y-4">
        {messages.length === 0 && (
          <div className="text-center text-gray-500 py-8">
            Start a conversation by typing a message below
          </div>
        )}

        {messages.map((message) => (
          <div
            key={message.id}
            className={`flex ${
              message.role === 'user' ? 'justify-end' : 'justify-start'
            }`}
          >
            <div
              className={`max-w-[80%] rounded-lg px-4 py-2 ${
                message.role === 'user'
                  ? 'bg-blue-600 text-white'
                  : 'bg-gray-100 dark:bg-gray-800'
              }`}
            >
              {message.content}
            </div>
          </div>
        ))}

        {isLoading && (
          <div className="flex justify-start">
            <div className="bg-gray-100 dark:bg-gray-800 rounded-lg px-4 py-2">
              <span className="animate-pulse">Thinking...</span>
            </div>
          </div>
        )}

        {error && (
          <div className="text-red-500 text-center">
            Error: {error.message}
          </div>
        )}
      </div>

      {/* Input */}
      <form onSubmit={handleSubmit} className="p-4 border-t">
        <div className="flex gap-2">
          <input
            type="text"
            value={input}
            onChange={(e) => setInput(e.target.value)}
            placeholder="Type a message..."
            className="flex-1 px-4 py-2 rounded-lg border focus:outline-none focus:ring-2"
            disabled={isLoading}
          />
          {isLoading ? (
            <button
              type="button"
              onClick={stop}
              className="px-4 py-2 bg-red-600 text-white rounded-lg"
            >
              Stop
            </button>
          ) : (
            <button
              type="submit"
              disabled={!input.trim()}
              className="px-4 py-2 bg-blue-600 text-white rounded-lg disabled:opacity-50"
            >
              Send
            </button>
          )}
        </div>
      </form>
    </div>
  );
}
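One optional refinement once responses start streaming in is to keep the message list scrolled to the newest content. This is plain React, not part of @rana/react; a small hypothetical hook:
// hooks/useAutoScroll.ts (hypothetical helper; plain React)
import { useEffect, useRef } from 'react';

export function useAutoScroll<T>(dependency: T) {
  const bottomRef = useRef<HTMLDivElement>(null);

  useEffect(() => {
    // Scroll the sentinel element into view whenever the dependency changes.
    bottomRef.current?.scrollIntoView({ behavior: 'smooth' });
  }, [dependency]);

  return bottomRef;
}
In Chat, you would call const bottomRef = useAutoScroll(messages) and render an empty <div ref={bottomRef} /> as the last child of the messages container.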
Step 4: Add to Your Page
// app/page.tsx
import { Chat } from '@/components/Chat';
export default function Home() {
  return (
    <main className="min-h-screen p-8">
      <h1 className="text-3xl font-bold text-center mb-8">
        AI Chat Assistant
      </h1>
      <Chat />
    </main>
  );
}
Step 5: Test Your Application
Start the development server:
npm run dev
Open http://localhost:3000 and try chatting with your AI assistant!
How It Works
1. User types a message - the input is captured by the useChat hook
2. Form submitted - send() is called, which sends a POST request to /api/chat
3. API processes request - the agent receives the messages and streams a response
4. Response streams back - the hook updates the messages state as chunks arrive
5. UI updates in real time - React re-renders to show the streaming response (see the sketch below)
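The sketch below illustrates steps 3-5 with plain fetch: the response body is read incrementally and each decoded chunk is handed to a callback (in the real hook, that callback updates the messages state). This is not the @rana/react implementation, and it assumes the route streams plain text; if streamResponse uses a structured wire format such as server-sent events, the hook takes care of parsing it for you.
// A simplified illustration of consuming the streamed response (not @rana/react's actual code)
async function readChatStream(
  messages: { role: string; content: string }[],
  onChunk: (text: string) => void
) {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages })
  });
  if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Each decoded chunk would be appended to the assistant message in state.
    onChunk(decoder.decode(value, { stream: true }));
  }
}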
Adding Features
Persist Conversation History
const { messages } = useChat({
  api: '/api/chat',
  initialMessages: loadSavedMessages(),
  onFinish: (message) => {
    saveMessages([...messages, message]);
  }
});
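loadSavedMessages and saveMessages are left to you to implement. One possible sketch uses localStorage (an assumption; swap in a database if you need persistence across devices) with the id/role/content message shape from the Chat component:
// lib/chat-storage.ts (hypothetical helpers for the snippet above)
type StoredMessage = { id: string; role: 'user' | 'assistant'; content: string };

const STORAGE_KEY = 'chat-history';

export function loadSavedMessages(): StoredMessage[] {
  if (typeof window === 'undefined') return []; // avoid localStorage during server rendering
  try {
    return JSON.parse(localStorage.getItem(STORAGE_KEY) ?? '[]');
  } catch {
    return [];
  }
}

export function saveMessages(messages: StoredMessage[]) {
  if (typeof window === 'undefined') return;
  localStorage.setItem(STORAGE_KEY, JSON.stringify(messages));
}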
Add a System Message Toggle
const [mode, setMode] = useState<'helpful' | 'creative'>('helpful');
const systemPrompts = {
  helpful: 'You are a helpful, precise assistant.',
  creative: 'You are a creative, imaginative assistant.'
};
// Pass the selected mode to your API and use it to choose the agent's system prompt
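One way to wire this up, assuming your client sends the selected mode alongside messages in the request body (via your own fetch call, or an extra-body option of useChat if it provides one), is to keep one agent per mode on the server and pick between them per request. This reuses only the Agent and streamResponse APIs shown earlier; the validation and error handling from Step 2 are omitted for brevity:
// app/api/chat/route.ts (sketch extending Step 2)
import { Agent } from '@rana/core';
import { streamResponse } from '@rana/helpers';

// One agent per mode; add temperature/maxTokens as in Step 1 if you need them.
const agents = {
  helpful: new Agent({
    name: 'HelpfulAssistant',
    model: 'claude-sonnet-4-20250514',
    systemPrompt: 'You are a helpful, precise assistant.'
  }),
  creative: new Agent({
    name: 'CreativeAssistant',
    model: 'claude-sonnet-4-20250514',
    systemPrompt: 'You are a creative, imaginative assistant.'
  })
};

export async function POST(request: Request) {
  const { messages, mode = 'helpful' } = await request.json();
  const agent = agents[mode as keyof typeof agents] ?? agents.helpful;
  return streamResponse(agent.stream(messages));
}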
What's Next?
Congratulations! You've built your first RANA application. In the next lesson, we'll dive deeper into the LLM client and explore advanced configuration options.