Intermediate

useChat Hook

The useChat hook is the AI SDK's core primitive for building chat interfaces. It automatically manages message state, streaming, loading state, error handling, and communication with your API route.

Basic Chat Interface

TypeScript - Full Chat Component
'use client';

import { useChat } from 'ai/react';

export default function ChatPage() {
  const {
    messages,          // Message[]
    input,             // Current input value
    handleInputChange, // Input onChange handler
    handleSubmit,      // Form onSubmit handler
    isLoading,         // Is AI responding?
    stop,              // Stop generation
    reload,            // Regenerate last response
    error,             // Error object
  } = useChat();

  return (
    <div className="chat-container">
      <div className="messages">
        {messages.map(m => (
          <div key={m.id} className={`message ${m.role}`}>
            <strong>{m.role === 'user' ? 'You' : 'AI'}:</strong>
            <p>{m.content}</p>
          </div>
        ))}
      </div>

      {error && <div className="error">{error.message}</div>}

      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Type a message..."
          disabled={isLoading}
        />
        {isLoading
          ? <button type="button" onClick={stop}>Stop</button>
          : <button type="submit">Send</button>
        }
      </form>

      {messages.length > 0 && (
        <button onClick={() => reload()}>Regenerate</button>
      )}
    </div>
  );
}

useChat Options

TypeScript
const { messages, ... } = useChat({
  api: '/api/chat',                // Custom endpoint (default: /api/chat)
  id: 'my-chat',                   // Unique chat ID
  initialMessages: [],             // Pre-loaded messages
  body: { model: 'gpt-4o' },       // Extra body params
  headers: { 'X-API-Key': key },   // Custom headers
  onFinish: (message) => {         // Called when the response finishes
    console.log('Complete:', message);
  },
  onError: (error) => {            // Error handler
    console.error(error);
  },
});
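
The extra `body` fields are merged into the JSON payload that useChat POSTs to the endpoint, so the route handler can read them next to `messages`. A minimal sketch under that assumption; the echo response is illustrative only, and a real handler would instead pass `model` to the provider (e.g. `openai(model)`) and stream the result:

```typescript
// Sketch: read the extra `model` field sent via useChat's `body` option.
// Request body shape { messages, model } matches the options above.
export async function POST(req: Request) {
  const { messages, model } = await req.json();
  // Illustration only: echo back what was received instead of streaming.
  return Response.json({ received: messages.length, model });
}
```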

System Messages

TypeScript - Server Route
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o'),
    system: 'You are a helpful coding assistant. Respond concisely.',
    messages,
    maxTokens: 1000,
    temperature: 0.7,
  });

  return result.toDataStreamResponse();
}

Streaming UI Patterns

TypeScript - Markdown Rendering
import ReactMarkdown from 'react-markdown';

{messages.map(m => (
  <div key={m.id}>
    {m.role === 'assistant' ? (
      <ReactMarkdown>{m.content}</ReactMarkdown>
    ) : (
      <p>{m.content}</p>
    )}
  </div>
))}
Auto-scroll tip: Attach a ref to the message container and scroll to the bottom whenever messages changes. Because streaming appends tokens to the last message, this keeps the view pinned to the latest output as it arrives.
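
The tip above can be sketched as follows. The ref name and the standalone helper are illustrative, not part of the SDK; the helper keeps the scroll math framework-agnostic, while in React you would trigger it from an effect:

```typescript
// In a React component, a sentinel element plus an effect does the job:
//
//   const bottomRef = useRef<HTMLDivElement>(null);
//   useEffect(() => {
//     bottomRef.current?.scrollIntoView({ behavior: 'smooth' });
//   }, [messages]);
//
// The same "pin to bottom" logic on any scrollable container:
interface Scrollable {
  scrollTop: number;
  scrollHeight: number;
  clientHeight: number;
}

function scrollToBottom(el: Scrollable): void {
  // scrollHeight - clientHeight is the maximum scroll offset,
  // i.e. the position where the newest message is fully visible.
  el.scrollTop = el.scrollHeight - el.clientHeight;
}
```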

What's Next?

Next, let's explore useCompletion for single-turn text generation use cases.