# Streaming Generation

Monitor image generation progress in real time with streaming. Two streaming formats are supported:
- SSE (Server-Sent Events): Text-based format, default option, good compatibility
- msgpack: Binary format, more efficient, suitable for high-performance scenarios
## Basic Usage

```typescript
import { NovelAI } from 'novelai-sdk-unofficial';
import { writeFileSync } from 'fs';

const client = new NovelAI({ apiKey: 'your-api-key' });

const stream = client.image.generateStream({
  prompt: '1girl, detailed background',
  model: 'nai-diffusion-4-5-full',
  steps: 28,
});

for await (const chunk of stream) {
  if (chunk.event_type === 'intermediate') {
    console.log(`Step ${chunk.step_ix}/${chunk.steps}: sigma=${chunk.sigma}`);
    // Optionally save the intermediate image
    const preview = Buffer.from(chunk.image, 'base64');
    writeFileSync(`preview_${chunk.step_ix}.png`, preview);
  } else if (chunk.event_type === 'final') {
    console.log('Generation complete!');
    const finalImage = Buffer.from(chunk.image, 'base64');
    writeFileSync('final.png', finalImage);
  }
}
```

## Streaming Format Selection
### SSE Format (Default)

```typescript
const stream = client.image.generateStream({
  prompt: '1girl',
  stream: 'sse', // Default; can be omitted
});
```

### msgpack Format
msgpack is a binary format with higher transmission efficiency:
```typescript
const stream = client.image.generateStream({
  prompt: '1girl',
  stream: 'msgpack', // Use the msgpack format
});

for await (const chunk of stream) {
  // Usage is identical to SSE
  if (chunk.event_type === 'final') {
    const image = Buffer.from(chunk.image, 'base64');
  }
}
```

Recommendations:
- Use SSE (default) for general scenarios, easier to debug
- Use msgpack for high-performance/low-latency scenarios
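Because the format is selected with a plain string option, switching per deployment takes only a small helper. A sketch; the `NAI_STREAM_FORMAT` variable name is our own convention, not part of the SDK:

```typescript
// Hypothetical helper: choose the stream format from an environment
// variable, falling back to SSE, the default recommended above.
function pickStreamFormat(env: Record<string, string | undefined>): 'sse' | 'msgpack' {
  return env.NAI_STREAM_FORMAT === 'msgpack' ? 'msgpack' : 'sse';
}
```

The result can then be passed as the `stream` option, e.g. `client.image.generateStream({ prompt, stream: pickStreamFormat(process.env) })`.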
## Chunk Types

### Intermediate Chunk

Received during generation:

```typescript
{
  event_type: 'intermediate',
  step_ix: number, // Current step (0-indexed)
  steps: number,   // Total steps
  sigma: number,   // Current sigma value
  image: string,   // Base64-encoded preview image
}
```

### Final Chunk
Received when generation completes:
```typescript
{
  event_type: 'final',
  image: string, // Base64-encoded final image
}
```

## Progress Display
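The percentage shown below is plain arithmetic on the intermediate chunk's fields: `step_ix` is 0-indexed, so add 1 before dividing by `steps`. Pulled out as a standalone helper (a sketch; the helper name is ours, not an SDK export):

```typescript
// Format a progress line from an intermediate chunk's step fields.
// step_ix is 0-indexed, so add 1 for display.
function formatProgress(step_ix: number, steps: number): string {
  const percent = (((step_ix + 1) / steps) * 100).toFixed(1);
  return `Progress: ${percent}%`;
}
```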
```typescript
const stream = client.image.generateStream({
  prompt: '1girl',
  steps: 28,
});

for await (const chunk of stream) {
  if (chunk.event_type === 'intermediate') {
    const progress = ((chunk.step_ix + 1) / chunk.steps * 100).toFixed(1);
    console.log(`Progress: ${progress}%`);
  }
}
```

## With All Parameters
Streaming supports all standard generation parameters:
```typescript
const stream = client.image.generateStream({
  prompt: '1girl, cat ears',
  model: 'nai-diffusion-4-5-full',
  size: 'portrait',
  steps: 28,
  scale: 5.0,
  sampler: 'k_euler_ancestral',
  seed: 12345,
  // Advanced features also work
  characterReferences: [...],
  controlnet: {...},
  i2i: {...},
});
```

## Error Handling
```typescript
// Assumes the SDK exports NetworkError alongside NovelAI
import { NetworkError } from 'novelai-sdk-unofficial';

try {
  const stream = client.image.generateStream({
    prompt: '1girl',
  });
  for await (const chunk of stream) {
    // Process chunks...
  }
} catch (error) {
  if (error instanceof NetworkError) {
    console.error('Connection lost during streaming');
  } else {
    console.error('Generation failed:', error.message);
  }
}
```

## Use Cases
### Progress Bar UI

```typescript
async function generateWithProgress(prompt: string) {
  const stream = client.image.generateStream({ prompt, steps: 28 });
  for await (const chunk of stream) {
    if (chunk.event_type === 'intermediate') {
      updateProgressBar(chunk.step_ix + 1, chunk.steps);
    } else if (chunk.event_type === 'final') {
      return Buffer.from(chunk.image, 'base64');
    }
  }
}
```

### Live Preview
```typescript
async function generateWithPreview(prompt: string, onPreview: (img: Buffer) => void) {
  const stream = client.image.generateStream({ prompt });
  for await (const chunk of stream) {
    if (chunk.event_type === 'intermediate') {
      const preview = Buffer.from(chunk.image, 'base64');
      onPreview(preview);
    } else if (chunk.event_type === 'final') {
      return Buffer.from(chunk.image, 'base64');
    }
  }
}

// Usage
await generateWithPreview('1girl', (preview) => {
  // Update the UI with the preview image
  displayImage(preview);
});
```

## Cancellation
Use AbortController to explicitly cancel streaming generation:
```typescript
const controller = new AbortController();

const stream = client.image.generateStream(
  { prompt: '1girl' },
  controller.signal, // Pass the signal
);

// Cancel after 5 seconds
setTimeout(() => controller.abort(), 5000);

try {
  for await (const chunk of stream) {
    if (chunk.event_type === 'intermediate') {
      console.log(`Step ${chunk.step_ix}/${chunk.steps}`);
    } else if (chunk.event_type === 'final') {
      console.log('Generation complete');
    }
  }
} catch (error) {
  if (error.name === 'AbortError') {
    console.log('Generation cancelled');
  } else {
    throw error;
  }
}
```

Cancellation will:

- Immediately stop receiving data
- Release the network connection
- Throw an `AbortError` exception
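The `error.name === 'AbortError'` check can be factored into a small predicate (a sketch; the function is our own, not an SDK export). It inspects the `name` property rather than using `instanceof`, because abort errors are often `DOMException` instances, which are not guaranteed to be `Error` subclasses:

```typescript
// True for the exception thrown when an AbortController cancels the stream.
// Checks the `name` property because abort errors may be DOMExceptions
// rather than Error subclasses.
function isAbortError(err: unknown): boolean {
  return typeof err === 'object' && err !== null &&
    (err as { name?: unknown }).name === 'AbortError';
}
```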
## Tips
1. **Intermediate images are lower quality** - They're meant for preview, not final use.
2. **More steps = more chunks** - Higher step counts produce more intermediate updates.
3. **Network considerations** - Streaming requires a stable connection throughout generation.
4. **Memory usage** - If saving all intermediates, be mindful of memory/disk usage.
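Tip 4 can be addressed by holding only the most recent preview instead of accumulating every intermediate image. A sketch over a generic async iterable of chunks (the chunk shape follows the tables in Chunk Types above; `collectLatest` is our own helper, and the SDK stream can be passed to it directly):

```typescript
type StreamChunk =
  | { event_type: 'intermediate'; step_ix: number; steps: number; sigma: number; image: string }
  | { event_type: 'final'; image: string };

// Consume a chunk stream while holding at most one preview in memory.
async function collectLatest(stream: AsyncIterable<StreamChunk>) {
  let latestPreview: Buffer | null = null;
  let finalImage: Buffer | null = null;
  for await (const chunk of stream) {
    if (chunk.event_type === 'intermediate') {
      latestPreview = Buffer.from(chunk.image, 'base64'); // replaces the previous preview
    } else {
      finalImage = Buffer.from(chunk.image, 'base64');
    }
  }
  return { latestPreview, finalImage };
}
```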