Batch Processing
Process multiple verifications efficiently with the Check Batch API.
The Batch API allows you to verify multiple pieces of content in a single request. This is ideal for bulk content moderation, historical data verification, and automated pipelines.
Batch processing is available on Pro and Enterprise plans.
How It Works
- Upload (optional) - Upload a CSV or JSON file via /v1/batch/upload
- Create - Create a batch job via /v1/batch with items or a file reference
- Process - Items are verified asynchronously in parallel
- Retrieve - Poll for status or use webhooks, then fetch results when complete
Batch Lifecycle
| Status | Description |
|---|---|
| pending | Batch created, awaiting processing |
| validating | Parsing and validating input items |
| processing | Items actively being verified |
| completed | All items processed successfully |
| failed | Batch-level failure (validation errors) |
| cancelled | User cancelled the batch |
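Of the statuses above, completed, failed, and cancelled are terminal: once a batch reaches one of them, no further processing occurs. A minimal helper for a polling loop (a sketch, not part of the SDK):

```javascript
// Terminal statuses from the lifecycle table: once a batch reaches
// one of these, no further items will be processed.
const TERMINAL_STATUSES = new Set(['completed', 'failed', 'cancelled']);

// Returns true when a batch has reached a final state.
function isTerminal(status) {
  return TERMINAL_STATUSES.has(status);
}
```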
Limits
| Limit | Free | Pro | Enterprise |
|---|---|---|---|
| Items per batch | N/A | 1,000 | 10,000 |
| File upload size | N/A | 10 MB | 100 MB |
| Content per item | N/A | 50,000 chars | 50,000 chars |
| File reference expiry | N/A | 1 hour | 24 hours |
| Results retention | N/A | 30 days | 90 days |
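Checking these limits client-side before submitting avoids a batch-level failed status during validation. A sketch against the Pro-plan numbers above (validateBatchItems is a hypothetical helper, not an SDK method):

```javascript
// Pro-plan limits from the table above.
const PRO_LIMITS = { maxItems: 1000, maxCharsPerItem: 50000 };

// Returns a list of human-readable problems; an empty list means
// the items fit within the given plan limits.
function validateBatchItems(items, limits = PRO_LIMITS) {
  const errors = [];
  if (items.length > limits.maxItems) {
    errors.push(`Too many items: ${items.length} > ${limits.maxItems}`);
  }
  items.forEach((item, i) => {
    if (item.content.length > limits.maxCharsPerItem) {
      errors.push(`Item ${i} exceeds ${limits.maxCharsPerItem} characters`);
    }
  });
  return errors;
}
```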
Available Endpoints
| Method | Endpoint | Description |
|---|---|---|
| POST | /v1/batch | Create a batch job |
| GET | /v1/batch | List batch jobs |
| GET | /v1/batch/:id | Get batch status |
| GET | /v1/batch/:id/results | Get batch results |
| POST | /v1/batch/:id/cancel | Cancel a batch |
| POST | /v1/batch/upload | Upload batch file |
Quick Example
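The response below corresponds to a create request along these lines; the payload fields mirror the response (name, methods, three items giving totalItems: 3), and the item contents are placeholder claims:

```javascript
// Request payload mirroring the response fields below
// (name, methods, and three items => totalItems: 3).
const payload = {
  name: 'Article verification batch',
  items: [
    { content: 'Claim from article one.' },
    { content: 'Claim from article two.' },
    { content: 'Claim from article three.' }
  ],
  methods: { reasoning: 1.0, tool: 0.5 }
};

// Submit via the SDK used throughout this page:
// const batch = await client.createBatch(payload);
```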
Response
{
"id": "batch_abc123def456",
"status": "processing",
"name": "Article verification batch",
"totalItems": 3,
"processedItems": 0,
"successCount": 0,
"failureCount": 0,
"methods": { "reasoning": 1.0, "tool": 0.5 },
"createdAt": "2024-01-15T10:30:00Z",
"message": "Batch created successfully. Items are being processed."
}
Uploading Files
For large batches, upload a file first:
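A sketch using uploadBatchFile (also shown under Best Practices below); the single-column CSV layout is an assumption — see File Formats for the exact specification:

```javascript
// Build a single-column CSV (header plus one quoted row per item);
// the exact accepted CSV layout is an assumption - see File Formats.
function toCsv(items) {
  return ['content', ...items.map(i => JSON.stringify(i.content))].join('\n');
}

// Upload the file, then reference it when creating the batch
// instead of passing inline items.
async function createBatchFromFile(client, items) {
  const file = await client.uploadBatchFile(toCsv(items), 'batch.csv');
  return client.createBatch({
    name: 'Large batch',
    fileId: file.id,
    methods: { reasoning: 1.0 }
  });
}
```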
Real-time Progress Updates
Subscribe to batch progress via webhooks or polling:
Using Webhooks (Recommended)
// Create batch with webhook
const batch = await client.createBatch({
name: 'Large batch',
items: largeItemList,
methods: { reasoning: 1.0 },
webhookUrl: 'https://your-app.com/webhooks/check'
});
// Webhook events:
// - batch.progress (every 10% or 100 items)
// - batch.completed
// - batch.failed
Using Polling
// Create batch
const batch = await client.createBatch({
name: 'Large batch',
items: largeItemList,
methods: { reasoning: 1.0 }
});
// Poll for progress
let status = await client.getBatch(batch.id);
while (status.status === 'pending' || status.status === 'processing') {
console.log(`Progress: ${status.processedItems}/${status.totalItems}`);
await new Promise(r => setTimeout(r, 5000)); // Wait 5 seconds
status = await client.getBatch(batch.id);
}
console.log(`Completed: ${status.successCount} succeeded, ${status.failureCount} failed`);
Using Supabase Realtime
For real-time updates in your UI:
import { createClient } from '@supabase/supabase-js';
const supabase = createClient(SUPABASE_URL, SUPABASE_KEY);
// Subscribe to batch updates
const channel = supabase.channel(`batch:${batchId}`);
channel.on('broadcast', { event: 'progress' }, ({ payload }) => {
console.log(`Progress: ${payload.processedItems}/${payload.totalItems}`);
console.log(`Current item: ${payload.currentItem}`);
});
channel.on('broadcast', { event: 'item_completed' }, ({ payload }) => {
console.log(`Item ${payload.itemId}: ${payload.verdict}`);
});
channel.on('broadcast', { event: 'completed' }, ({ payload }) => {
console.log(`Batch completed: ${payload.successCount} succeeded`);
channel.unsubscribe();
});
channel.subscribe();
Processing Results
// Get all results
const results = await client.getBatchResults(batch.id);
// Results are paginated
console.log(results.data); // Array of verification results
console.log(results.pagination.total); // Total items
console.log(results.pagination.hasMore); // More pages available
// Filter by verdict
const falseResults = await client.getBatchResults(batch.id, {
verdict: 'false'
});
// Export results
const csvExport = await client.exportBatchResults(batch.id, 'csv');
Best Practices
1. Use File Upload for Large Batches
// For 100+ items, upload a file instead of inline items
if (items.length > 100) {
const csv = items.map(i => i.content).join('\n');
const file = await client.uploadBatchFile(csv, 'batch.csv');
await client.createBatch({ fileId: file.id, ... });
}
2. Set Webhooks for Long-Running Batches
// Don't poll - use webhooks
await client.createBatch({
items: manyItems,
webhookUrl: process.env.WEBHOOK_URL
});
3. Handle Partial Failures
const batch = await client.getBatch(batchId);
if (batch.failureCount > 0) {
const failures = await client.getBatchResults(batchId, {
status: 'failed'
});
for (const failure of failures.data) {
console.error(`Failed: ${failure.content.slice(0, 50)}...`);
console.error(`Reason: ${failure.error}`);
}
}
4. Use Idempotency Keys
// Prevent duplicate batches on retry
await client.createBatch({
items: items,
methods: { reasoning: 1.0 }
}, {
idempotencyKey: `batch-${jobId}-${date}`
});
Next Steps
- Create a Batch - Full API reference
- File Formats - CSV and JSON specifications
- Get Results - Retrieve batch results
- Webhooks - Real-time notifications