When to Use This

You’re building a React (or Next.js) chat interface and want to:
  • Stream responses character-by-character for better perceived performance
  • Keep the API key server-side (via API route or backend proxy)
  • Handle loading states, errors, and cancellation cleanly
  • Support multiple frameworks — works with Create React App, Vite, Next.js App Router
This cookbook gives you a production-ready streaming chat component with a custom hook, error boundaries, and optional enhancements (markdown rendering, stop button, retry).

Architecture

┌─────────────────────────────────────────────────────────────────┐
│  React Component                                                │
│  useStreamingChat(messages, personaId)                           │
│  → { send, content, isLoading, error }                           │
└────────────────────────────┬────────────────────────────────────┘
                             │ fetch('/api/chat', { body, stream })

┌─────────────────────────────────────────────────────────────────┐
│  API Route (Next.js) or Backend                                  │
│  Proxies to Mavera with streaming                                │
│  Returns ReadableStream of text chunks                           │
└────────────────────────────┬────────────────────────────────────┘


┌─────────────────────────────────────────────────────────────────┐
│  Mavera Responses API                                            │
│  POST /responses  stream=true                                    │
│  Server-Sent Events (SSE)                                       │
└─────────────────────────────────────────────────────────────────┘

Prerequisites

  • React 18+ (for useId and concurrent features) or Next.js 14+
  • A backend or API route that proxies to Mavera with streaming and returns a ReadableStream
  • A Mavera API key stored server-side
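In Next.js, for example, the key can live in .env.local; the variable name matches what the route in Step 1 reads, and the value shown is a placeholder:

```shell
# .env.local (never commit this file; never prefix the key with NEXT_PUBLIC_)
MAVERA_API_KEY=your-mavera-api-key
```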

Step 1: API Route (Next.js App Router)

Your API route receives messages and personaId, calls Mavera with streaming, and forwards the stream to the client.
// app/api/chat/route.ts
import { NextRequest } from "next/server";
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.MAVERA_API_KEY,
  baseURL: "https://app.mavera.io/api/v1",
});

export async function POST(req: NextRequest) {
  const { messages, personaId } = await req.json();

  if (!messages?.length || !personaId) {
    return new Response(JSON.stringify({ error: "messages and personaId required" }), {
      status: 400,
    });
  }

  const stream = client.responses.stream({
    model: "mavera-1",
    input: messages,
    persona_id: personaId,
  });

  const encoder = new TextEncoder();
  const readable = new ReadableStream({
    async start(controller) {
      try {
        for await (const event of stream) {
          // Forward only text deltas; drop other event types
          if (event.type === "response.output_text.delta") {
            controller.enqueue(encoder.encode(event.delta));
          }
        }
        controller.close();
      } catch (err) {
        // Propagate upstream failures instead of silently closing the stream
        controller.error(err);
      }
    },
  });

  return new Response(readable, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
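The route forwards only text deltas and drops every other event type (lifecycle markers, tool calls, and so on). Extracted as a pure function, the filtering logic looks like this (the function name and simplified event type are illustrative, not part of the SDK):

```typescript
// Simplified shape of the stream events this route cares about
// (real events carry more fields; they are ignored here).
type StreamEvent = { type: string; delta?: string };

// Concatenate the text deltas, dropping every other event type,
// mirroring what the route's for-await loop does.
function extractDeltas(events: StreamEvent[]): string {
  return events
    .filter((e) => e.type === "response.output_text.delta")
    .map((e) => e.delta ?? "")
    .join("");
}

// A stream interleaves lifecycle events with text deltas:
const text = extractDeltas([
  { type: "response.created" },
  { type: "response.output_text.delta", delta: "Hel" },
  { type: "response.output_text.delta", delta: "lo" },
  { type: "response.completed" },
]);
// text === "Hello"
```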

Step 2: Custom Hook — useStreamingChat

A reusable hook that manages the streaming request, accumulates content, and exposes loading/error state.
// hooks/useStreamingChat.ts
import { useState, useCallback } from "react";

type Message = { role: "user" | "assistant" | "system"; content: string };

type UseStreamingChatOptions = {
  apiUrl?: string;
  onError?: (err: Error) => void;
};

export function useStreamingChat(options: UseStreamingChatOptions = {}) {
  const { apiUrl = "/api/chat", onError } = options;
  const [content, setContent] = useState("");
  const [isLoading, setIsLoading] = useState(false);
  const [error, setError] = useState<Error | null>(null);

  const send = useCallback(
    async (messages: Message[], personaId: string) => {
      setError(null);
      setContent("");
      setIsLoading(true);

      try {
        const resp = await fetch(apiUrl, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ messages, personaId }),
        });

        if (!resp.ok) {
          const errData = await resp.json().catch(() => ({}));
          throw new Error(errData.error || `HTTP ${resp.status}`);
        }

        if (!resp.body) throw new Error("No response body");

        const reader = resp.body.getReader();
        const decoder = new TextDecoder();
        let fullContent = "";

        while (true) {
          const { done, value } = await reader.read();
          if (done) break;
          const chunk = decoder.decode(value, { stream: true });
          fullContent += chunk;
          setContent(fullContent);
        }

        return fullContent;
      } catch (err) {
        const e = err instanceof Error ? err : new Error(String(err));
        setError(e);
        onError?.(e);
        throw e;
      } finally {
        setIsLoading(false);
      }
    },
    [apiUrl, onError]
  );

  const reset = useCallback(() => {
    setContent("");
    setError(null);
  }, []);

  return { send, content, isLoading, error, reset };
}
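The read loop inside send can be isolated as a small helper to make the accumulation pattern clearer (the name and signature are ours, not part of any API). ReadableStream, TextEncoder, and TextDecoder are global in Node 18+ and all modern browsers:

```typescript
// Drain a byte stream into a string, invoking a callback with each
// partial result (the same pattern the hook uses with setContent).
async function readAll(
  stream: ReadableStream<Uint8Array>,
  onPartial?: (partial: string) => void
): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let full = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // { stream: true } keeps partial multi-byte sequences buffered
    full += decoder.decode(value, { stream: true });
    onPartial?.(full);
  }
  return full;
}
```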

Step 3: Chat UI Component

A complete chat interface using the hook: persona selector, message list, input, streaming display, and error handling.
// components/StreamingChat.tsx
"use client";

import { useEffect, useState, useRef } from "react";
import { useStreamingChat } from "../hooks/useStreamingChat";

type Message = { role: "user" | "assistant"; content: string };
type Persona = { id: string; name: string };

export function StreamingChat() {
  const [personas, setPersonas] = useState<Persona[]>([]);
  const [selectedPersona, setSelectedPersona] = useState("");
  const [messages, setMessages] = useState<Message[]>([]);
  const [input, setInput] = useState("");
  const endRef = useRef<HTMLDivElement>(null);
  const { send, content, isLoading, error } = useStreamingChat();

  useEffect(() => {
    fetch("/api/personas")
      .then((r) => r.json())
      .then((data) => {
        if (Array.isArray(data)) {
          setPersonas(data);
          if (data[0]) setSelectedPersona(data[0].id);
        }
      })
      .catch(console.error);
  }, []);

  useEffect(() => {
    endRef.current?.scrollIntoView({ behavior: "smooth" });
  }, [messages, content]);

  const handleSend = async () => {
    if (!input.trim() || !selectedPersona || isLoading) return;

    const userMsg: Message = { role: "user", content: input.trim() };
    const newMessages = [...messages, userMsg];
    setMessages(newMessages);
    setInput("");

    const apiMessages = newMessages.map((m) => ({ role: m.role, content: m.content }));

    try {
      const fullContent = await send(apiMessages, selectedPersona);
      setMessages((prev) => [...prev, { role: "assistant", content: fullContent }]);
    } catch {
      // Error already set in hook
    }
  };

  return (
    <div className="flex flex-col max-w-2xl mx-auto h-[500px]">
      <select
        value={selectedPersona}
        onChange={(e) => setSelectedPersona(e.target.value)}
        className="border rounded px-3 py-2 mb-4"
      >
        {personas.map((p) => (
          <option key={p.id} value={p.id}>{p.name}</option>
        ))}
      </select>

      <div className="flex-1 overflow-y-auto border rounded p-4 bg-gray-50 space-y-4">
        {messages.map((m, i) => (
          <div
            key={i}
            className={`p-3 rounded ${m.role === "user" ? "bg-blue-100 ml-8" : "bg-white mr-8 border"}`}
          >
            <strong>{m.role === "user" ? "You" : "Assistant"}:</strong>
            <div className="whitespace-pre-wrap mt-1">{m.content}</div>
          </div>
        ))}
        {content && (
          <div className="p-3 rounded bg-white mr-8 border">
            <strong>Assistant:</strong>
            <div className="whitespace-pre-wrap mt-1">
              {content}
              <span className="animate-pulse">▋</span>
            </div>
          </div>
        )}
        {error && (
          <div className="p-3 rounded bg-red-50 text-red-700">
            Error: {error.message}
          </div>
        )}
        <div ref={endRef} />
      </div>

      <div className="flex gap-2 mt-4">
        <input
          type="text"
          value={input}
          onChange={(e) => setInput(e.target.value)}
          onKeyDown={(e) => e.key === "Enter" && !e.shiftKey && handleSend()}
          placeholder="Type a message..."
          className="flex-1 border rounded px-3 py-2"
          disabled={isLoading}
        />
        <button
          onClick={handleSend}
          disabled={isLoading || !input.trim()}
          className="px-4 py-2 bg-orange-500 text-white rounded disabled:opacity-50"
        >
          {isLoading ? "..." : "Send"}
        </button>
      </div>
    </div>
  );
}

Step 4: Vite / Create React App (No API Route)

If you’re not using Next.js, route requests through a separate backend service that proxies to Mavera. Never put your API key in client-side code. Options:
  1. Express/Fastify proxy — Your React app calls localhost:3001/api/chat; your backend proxies to Mavera.
  2. Serverless function — Vercel, Netlify, or AWS Lambda function that receives the request and calls Mavera.
Example backend (Express):
// server.js
const express = require("express");
const OpenAI = require("openai").default;

const app = express();
app.use(express.json());

const client = new OpenAI({
  apiKey: process.env.MAVERA_API_KEY,
  baseURL: "https://app.mavera.io/api/v1",
});

app.post("/api/chat", async (req, res) => {
  const { messages, personaId } = req.body;

  if (!messages?.length || !personaId) {
    return res.status(400).json({ error: "messages and personaId required" });
  }

  const stream = client.responses.stream({
    model: "mavera-1",
    input: messages,
    persona_id: personaId,
  });

  res.setHeader("Content-Type", "text/plain; charset=utf-8");
  try {
    for await (const event of stream) {
      if (event.type === "response.output_text.delta") {
        res.write(event.delta);
      }
    }
  } catch (err) {
    // Don't crash the process if the upstream stream fails mid-response
    console.error(err);
  } finally {
    res.end();
  }
});

app.listen(3001, () => console.log("Proxy on :3001"));
Point your React app at http://localhost:3001/api/chat via the hook’s apiUrl option.
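With the proxy running (and a valid MAVERA_API_KEY in its environment), you can smoke-test it from the terminal; the personaId value here is a placeholder:

```shell
# -N disables curl's output buffering so chunks print as they arrive
curl -N http://localhost:3001/api/chat \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"Hello"}],"personaId":"your-persona-id"}'
```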

Step 5: AbortController — Stop Streaming

Allow users to cancel an in-progress request:
// In useStreamingChat, add AbortController support
const abortRef = useRef<AbortController | null>(null);

const send = useCallback(async (messages, personaId) => {
  abortRef.current?.abort();
  abortRef.current = new AbortController();

  const resp = await fetch(apiUrl, {
    method: "POST",
    signal: abortRef.current.signal,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages, personaId }),
  });
  // ... rest of stream reading
}, [apiUrl]);

const stop = useCallback(() => {
  abortRef.current?.abort();
}, []);

return { send, stop, content, isLoading, error, reset };
In the UI, show a “Stop” button when isLoading and call stop() on click.
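One wrinkle: an aborted fetch rejects with a DOMException named "AbortError", which you probably want to treat as a cancellation rather than surface through the hook’s error state. A small guard (the function name is ours):

```typescript
// True for the rejection produced by AbortController.abort(), so the
// hook's catch block can skip setError for user-initiated stops.
function isAbortError(err: unknown): boolean {
  return err instanceof DOMException && err.name === "AbortError";
}
```

In the hook’s catch block, bail out early (if (isAbortError(err)) return;) before calling setError.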

Step 6: Markdown Rendering

For richer output, render assistant content as Markdown. Use react-markdown:
npm install react-markdown
import ReactMarkdown from "react-markdown";

// In message display:
{m.role === "assistant" ? (
  <ReactMarkdown className="prose">{m.content}</ReactMarkdown>
) : (
  <div className="whitespace-pre-wrap">{m.content}</div>
)}

Common Pitfalls

  • Buffering: ensure your backend doesn’t buffer the response. In Node, avoid res.json(); write each chunk with res.write(), and set Content-Type: text/plain so the client treats it as a stream.
  • CORS: enable CORS on your proxy (Access-Control-Allow-Origin, Access-Control-Allow-Headers), or keep everything same-origin (API route and React app on the same host).
  • Stale state updates: cancel in-flight requests with AbortController (or check a mounted flag) before calling setContent in the stream loop, typically via a useEffect cleanup.
  • Split UTF-8 characters: use decoder.decode(value, { stream: true }) so multi-byte characters aren’t corrupted when a chunk boundary falls inside one.
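The last pitfall is easy to demonstrate: split a multi-byte character across two chunks and compare one-shot decoding with streaming decoding (runs as-is in Node 18+ or any modern browser):

```typescript
// "café" is 5 bytes in UTF-8: c, a, f, then 0xC3 0xA9 for "é".
const bytes = new TextEncoder().encode("café");
const chunk1 = bytes.slice(0, 4); // "caf" plus the first byte of "é"
const chunk2 = bytes.slice(4);    // the second byte of "é"

// One-shot decoding treats each chunk independently and emits
// replacement characters for the split "é".
const naive =
  new TextDecoder().decode(chunk1) + new TextDecoder().decode(chunk2);

// With { stream: true }, the decoder buffers the dangling byte
// until the next chunk completes the character.
const decoder = new TextDecoder();
const correct =
  decoder.decode(chunk1, { stream: true }) + decoder.decode(chunk2);

console.log(naive === "café");   // false
console.log(correct === "café"); // true
```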

See Also

  • Build Persona Chat App — full Next.js tutorial with streaming
  • Responses API — streaming, tools, analysis mode
  • Quickstart: Chat — first streaming request
  • API Reference — Responses API spec