# Text Streaming
Generate text with real-time streaming output.
## Basic Usage
```typescript
import { NovelAI } from 'novelai-sdk-unofficial';

const client = new NovelAI({ apiKey: 'your-api-key' });

const stream = client.text.generateStream({
  input: 'Once upon a time',
  maxLength: 100,
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}
console.log(); // New line at end
```

## How It Works
The generateStream method returns an async generator that yields text chunks as they're generated. Each chunk is a string containing one or more tokens.
```typescript
const stream = client.text.generateStream({
  input: 'Hello',
  maxLength: 50,
});

// Chunks arrive as they're generated
for await (const chunk of stream) {
  console.log('Received:', JSON.stringify(chunk));
}
```

## With All Parameters
Streaming supports all standard generation parameters:
```typescript
const stream = client.text.generateStream({
  input: 'The story begins',
  model: 'llama-3-erato-v1',
  temperature: 1.1,
  maxLength: 100,
  topP: 0.95,
  repetitionPenalty: 1.1,
  stopSequences: ['\n\n'],
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}
```

**Note:** Using `stopSequences`/`banSequences` triggers token-counting tokenization before streaming starts.
## Collecting Full Output
```typescript
async function generateText(input: string): Promise<string> {
  const stream = client.text.generateStream({ input, maxLength: 100 });
  let fullText = '';
  for await (const chunk of stream) {
    fullText += chunk;
  }
  return fullText;
}

const result = await generateText('Once upon a time');
console.log(result);
```

## Progress Callback
```typescript
async function generateWithCallback(
  input: string,
  onChunk: (chunk: string, accumulated: string) => void,
): Promise<string> {
  const stream = client.text.generateStream({ input, maxLength: 100 });
  let accumulated = '';
  for await (const chunk of stream) {
    accumulated += chunk;
    onChunk(chunk, accumulated);
  }
  return accumulated;
}

// Usage
await generateWithCallback('Hello', (chunk, total) => {
  console.log(`New: "${chunk}" | Total length: ${total.length}`);
});
```

## Error Handling
typescript
import { NetworkError, AuthenticationError } from 'novelai-sdk-unofficial';
try {
const stream = client.text.generateStream({
input: 'Hello',
maxLength: 100,
});
for await (const chunk of stream) {
process.stdout.write(chunk);
}
} catch (error) {
if (error instanceof NetworkError) {
console.error('Connection lost during streaming');
} else if (error instanceof AuthenticationError) {
console.error('Invalid API key');
} else {
console.error('Generation failed:', error);
}
}Use Cases
### Chat Interface
```typescript
async function streamResponse(userMessage: string) {
  const context = `User: ${userMessage}\nAssistant:`;
  const stream = client.text.generateStream({
    input: context,
    stopSequences: ['\nUser:', '\n\n'],
    maxLength: 150,
  });

  process.stdout.write('Assistant: ');
  for await (const chunk of stream) {
    process.stdout.write(chunk);
  }
  console.log();
}
```

### Typewriter Effect
```typescript
async function typewriterEffect(input: string, delayMs: number = 50) {
  const stream = client.text.generateStream({ input, maxLength: 100 });
  for await (const chunk of stream) {
    // Print character by character with a delay
    for (const char of chunk) {
      process.stdout.write(char);
      await new Promise(resolve => setTimeout(resolve, delayMs));
    }
  }
}
```

### Web Server (Express)
typescript
import express from 'express';
const app = express();
app.get('/generate', async (req, res) => {
const input = req.query.input as string;
res.setHeader('Content-Type', 'text/event-stream');
res.setHeader('Cache-Control', 'no-cache');
res.setHeader('Connection', 'keep-alive');
const stream = client.text.generateStream({ input, maxLength: 100 });
for await (const chunk of stream) {
res.write(`data: ${JSON.stringify({ text: chunk })}\n\n`);
}
res.write('data: [DONE]\n\n');
res.end();
});Cancellation
Use AbortController to explicitly cancel streaming generation:
```typescript
const controller = new AbortController();

const stream = client.text.generateStream(
  { input: 'Once upon a time', maxLength: 100 },
  controller.signal,
);

// Cancel after 5 seconds
setTimeout(() => controller.abort(), 5000);

try {
  for await (const chunk of stream) {
    process.stdout.write(chunk);
  }
} catch (error) {
  // `error` is `unknown` in TypeScript, so narrow it before reading `.name`
  if (error instanceof Error && error.name === 'AbortError') {
    console.log('\nGeneration cancelled');
  } else {
    throw error;
  }
}
```

## Tips
- **Streaming is ideal for UX** - Users see output immediately rather than waiting for the full response.
- **Handle disconnections** - Network issues can interrupt streams.
- **Memory efficient** - Streaming doesn't buffer the entire response.
- **Same parameters** - All generation parameters work with streaming.
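
The disconnection tip can be sketched as a small retry wrapper: if the stream fails partway through, restart the request with the text collected so far appended to the prompt. This is a minimal sketch, not part of the SDK - `makeStream` is a hypothetical stand-in for a call like `(text) => client.text.generateStream({ input: text, maxLength: 100 })`, and whether continuing from a partial prompt suits your use case is up to you.

```typescript
// Retry a streaming generation, resuming from the text collected so far.
// `makeStream` abstracts the stream factory so any generateStream-like
// async iterable works here.
async function streamWithRetry(
  input: string,
  makeStream: (input: string) => AsyncIterable<string>,
  maxRetries: number = 2,
): Promise<string> {
  let collected = '';
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      // Continue from the original prompt plus whatever already arrived
      for await (const chunk of makeStream(input + collected)) {
        collected += chunk;
      }
      return collected; // stream completed without interruption
    } catch (error) {
      if (attempt === maxRetries) throw error;
      // Otherwise loop and retry with the accumulated text
    }
  }
  return collected;
}
```

Note that a retried request consumes additional quota, so cap `maxRetries` conservatively.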