Streaming Chat
The /chat endpoint supports streaming via NDJSON (newline-delimited JSON). Each line in the response is a self-contained JSON object representing a chunk of the AI's response — text tokens, product cards, follow-up prompts, and more.
Why Streaming?
| Regular Chat | Streaming Chat |
|---|---|
| Wait for the complete response | See text appear in real-time |
| 3–10 second perceived delay | Immediate feedback |
| Simple implementation | More engaging UX |
Basic Request
curl -N -X POST https://api.intufind.com/chat \
  -H "Authorization: Bearer if_sk_xxx" \
  -H "Content-Type: application/json" \
  -d '{"message": "Tell me about your products", "threadId": "session-123"}'
The response is a stream of JSON objects, one per line:
{"type":"text_delta","data":"I'd be happy to "}
{"type":"text_delta","data":"help you find "}
{"type":"text_delta","data":"the right product!"}
{"type":"product","data":{"id":"p-1","name":"Wireless Headphones","price":199.99}}
{"type":"prompts","data":["Show me more options","What's on sale?"]}
{"type":"complete","data":{}}
Event Types
| Type | Data | Description |
|---|---|---|
| text_delta | string | Streaming text token |
| product | Product object | Product recommendation |
| post | Post object | Content/article recommendation |
| post_delta | {id, delta} | Streaming post summary update |
| prompts | string[] | Suggested follow-up prompts |
| domain_offer | Offer object | Domain offer (e.g., live agent handoff) |
| domain_offer_success | Result object | Domain offer accepted |
| progress | Progress object | Progress indicator |
| bubble_termination | {} | Start new message bubble |
| complete | {} | Stream finished |
| error | {error: string} | Error message |
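These events can be modeled as a TypeScript discriminated union, which makes downstream `switch` handling type-safe. This is a sketch: the `Product`, `Post`, offer, and progress shapes below are simplified assumptions, not the full schema.

```typescript
// Sketch of the NDJSON events as a discriminated union.
// Object payload shapes are simplified assumptions, not the full schema.
type ChatChunk =
  | { type: 'text_delta'; data: string }
  | { type: 'product'; data: { id: string; name: string; price: number } }
  | { type: 'post'; data: { id: string; title: string } }
  | { type: 'post_delta'; data: { id: string; delta: string } }
  | { type: 'prompts'; data: string[] }
  | { type: 'domain_offer'; data: Record<string, unknown> }
  | { type: 'domain_offer_success'; data: Record<string, unknown> }
  | { type: 'progress'; data: Record<string, unknown> }
  | { type: 'bubble_termination'; data: Record<string, never> }
  | { type: 'complete'; data: Record<string, never> }
  | { type: 'error'; data: { error: string } };

// Parse one NDJSON line; returns null for blank or malformed lines
// rather than throwing mid-stream.
function parseChunk(line: string): ChatChunk | null {
  if (!line.trim()) return null;
  try {
    return JSON.parse(line) as ChatChunk;
  } catch {
    return null;
  }
}
```

A `parseChunk`-style helper keeps one malformed line from aborting the whole stream loop.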
JavaScript Implementation
Reading the Stream
import { createClient, sendChat } from '@intufind/ai-sdk';

const client = createClient({
  baseUrl: 'https://api.intufind.com',
  auth: process.env.INTUFIND_SECRET_KEY!,
});

const response = await sendChat({
  client,
  body: { message: 'Tell me about your products', threadId: 'session-123' },
});

const reader = response.response.body?.getReader();
if (!reader) throw new Error('Response has no readable body');
const decoder = new TextDecoder();
let buffer = '';

while (true) {
  const { done, value } = await reader.read();
  if (done) break;

  buffer += decoder.decode(value, { stream: true });
  const lines = buffer.split('\n');
  buffer = lines.pop() ?? ''; // keep any partial line for the next read

  for (const line of lines) {
    if (!line.trim()) continue;
    const chunk = JSON.parse(line);
    switch (chunk.type) {
      case 'text_delta':
        process.stdout.write(chunk.data);
        break;
      case 'product':
        console.log('\nProduct:', chunk.data.name);
        break;
      case 'complete':
        console.log('\nDone');
        break;
    }
  }
}
React Hook
import { useState, useCallback } from 'react';

interface Message {
  id: string;
  role: 'user' | 'assistant';
  content: string;
  products?: any[];
  isStreaming?: boolean;
}

export function useStreamingChat(apiKey: string, threadId: string) {
  const [messages, setMessages] = useState<Message[]>([]);
  const [isStreaming, setIsStreaming] = useState(false);

  const sendMessage = useCallback(async (text: string) => {
    const userMsg: Message = {
      id: `user-${Date.now()}`,
      role: 'user',
      content: text,
    };
    const assistantId = `assistant-${Date.now()}`;

    setMessages((prev) => [
      ...prev,
      userMsg,
      { id: assistantId, role: 'assistant', content: '', isStreaming: true },
    ]);
    setIsStreaming(true);

    try {
      const res = await fetch('https://api.intufind.com/chat', {
        method: 'POST',
        headers: {
          Authorization: `Bearer ${apiKey}`,
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({ message: text, threadId }),
      });
      if (!res.ok || !res.body) {
        throw new Error(`Chat request failed: ${res.status}`);
      }

      const reader = res.body.getReader();
      const decoder = new TextDecoder();
      let buffer = '';

      while (true) {
        const { done, value } = await reader.read();
        if (done) break;

        buffer += decoder.decode(value, { stream: true });
        const lines = buffer.split('\n');
        buffer = lines.pop() ?? '';

        for (const line of lines) {
          if (!line.trim()) continue;
          const chunk = JSON.parse(line);
          if (chunk.type === 'text_delta') {
            setMessages((prev) =>
              prev.map((m) =>
                m.id === assistantId
                  ? { ...m, content: m.content + chunk.data }
                  : m
              )
            );
          } else if (chunk.type === 'product') {
            setMessages((prev) =>
              prev.map((m) =>
                m.id === assistantId
                  ? { ...m, products: [...(m.products ?? []), chunk.data] }
                  : m
              )
            );
          } else if (chunk.type === 'complete') {
            setMessages((prev) =>
              prev.map((m) =>
                m.id === assistantId ? { ...m, isStreaming: false } : m
              )
            );
          }
        }
      }
    } catch (err) {
      console.error('Streaming error:', err);
    } finally {
      setIsStreaming(false);
    }
  }, [apiKey, threadId]);

  return { messages, isStreaming, sendMessage };
}
React Component
function ChatWindow() {
  const { messages, isStreaming, sendMessage } = useStreamingChat(
    'if_pk_xxx',
    'session-123'
  );
  const [input, setInput] = useState('');

  const handleSubmit = (e: React.FormEvent) => {
    e.preventDefault();
    if (input.trim() && !isStreaming) {
      sendMessage(input);
      setInput('');
    }
  };

  return (
    <div className="chat-container">
      <div className="messages">
        {messages.map((msg) => (
          <div key={msg.id} className={`message ${msg.role}`}>
            {msg.content}
            {msg.isStreaming && <span className="cursor">▋</span>}
          </div>
        ))}
      </div>
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type a message..."
          disabled={isStreaming}
        />
        <button type="submit" disabled={isStreaming}>Send</button>
      </form>
    </div>
  );
}
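To keep the `.messages` container pinned to the bottom as chunks stream in, scroll it after each update (e.g. from a `useEffect` keyed on `messages`). A framework-agnostic sketch; the 40px "near bottom" threshold is an arbitrary choice so that a user who has scrolled up to reread isn't yanked back down:

```typescript
// Minimal shape of a scrollable element, so this works with any
// container (or a plain object in tests).
interface Scrollable {
  scrollHeight: number;
  scrollTop: number;
  clientHeight: number;
}

// Pin the container to the bottom, but only if the user is already
// near the bottom; otherwise leave their scroll position alone.
function scrollMessagesToBottom(el: Scrollable): void {
  const nearBottom =
    el.scrollHeight - el.scrollTop - el.clientHeight < 40;
  if (nearBottom) {
    el.scrollTop = el.scrollHeight;
  }
}
```

In the component above, this would be called with a ref to the `.messages` div whenever `messages` changes.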
Server-Side Proxy
For security, proxy streaming requests through your backend so the secret key stays server-side:
Next.js Route Handler
// app/api/chat/route.ts
export async function POST(request: Request) {
  const { message, threadId } = await request.json();

  const upstream = await fetch('https://api.intufind.com/chat', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.INTUFIND_SECRET_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ message, threadId }),
  });

  return new Response(upstream.body, {
    headers: {
      'Content-Type': 'application/x-ndjson',
      'Cache-Control': 'no-cache',
    },
  });
}
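Outside Next.js, the same pass-through works with Node's built-in `http` module. A sketch assuming Node 18+ (global `fetch`) and `INTUFIND_SECRET_KEY` set in the server environment; the route path and port are illustrative:

```typescript
import { createServer, Server } from 'node:http';
import { Readable } from 'node:stream';

// Sketch: stream the upstream NDJSON body straight to the browser
// so the secret key never leaves the server.
export function createProxyServer(): Server {
  return createServer(async (req, res) => {
    if (req.method !== 'POST' || req.url !== '/api/chat') {
      res.writeHead(404).end();
      return;
    }

    // Collect the client's JSON body and forward it verbatim.
    let body = '';
    for await (const chunk of req) body += chunk;

    const upstream = await fetch('https://api.intufind.com/chat', {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${process.env.INTUFIND_SECRET_KEY}`,
        'Content-Type': 'application/json',
      },
      body,
    });

    res.writeHead(upstream.status, {
      'Content-Type': 'application/x-ndjson',
      'Cache-Control': 'no-cache',
    });
    if (upstream.body) {
      // Bridge the web ReadableStream onto the Node response so
      // chunks reach the browser as they arrive.
      Readable.fromWeb(upstream.body as any).pipe(res);
    } else {
      res.end();
    }
  });
}

// createProxyServer().listen(3001);
```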
Styling the Streaming Cursor
.cursor {
display: inline-block;
animation: blink 1s infinite;
margin-left: 2px;
}
@keyframes blink {
0%, 50% { opacity: 1; }
51%, 100% { opacity: 0; }
}
Best Practices
- Show a typing indicator while waiting for the first chunk
- Implement a stop button for long responses
- Auto-scroll as content appears
- Handle reconnection for network interruptions
- Cache thread context to resume after errors
- Rate-limit sends on the client to prevent abuse
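The stop button mentioned above can be wired up with `AbortController`. A sketch; the endpoint and payload mirror the earlier examples, and the function names here are illustrative:

```typescript
// Sketch: a cancellable request via AbortController.
function makeStoppable() {
  const controller = new AbortController();
  return {
    signal: controller.signal,
    stop: () => controller.abort(), // call from the Stop button's onClick
  };
}

// Pass the signal to fetch; stop() rejects the request (and any pending
// reader.read()) with an AbortError, which the caller should catch and
// treat as a normal cancellation rather than a failure.
function sendWithStop(message: string, threadId: string) {
  const { signal, stop } = makeStoppable();
  const promise = fetch('https://api.intufind.com/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message, threadId }),
    signal,
  });
  return { promise, stop };
}
```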
Next Steps
- AI Chat — Chat customization and thread management
- API Reference — Full /chat endpoint schema