Index multiple objects

Batch importing multiple objects

This endpoint lets you efficiently index multiple objects into Tallyfy Answers in a single API call. It’s much faster than making individual object creation requests, especially for large datasets or initial data loading.

Large batch imports run in the background. You can monitor progress and cancel operations if needed using the Tasks endpoints.

API endpoint

POST /collections/{collection_name}/batch

Request parameters

| Parameter | Type | Required | Description |
|---|---|---|---|
| collection_name | path | Yes | Name of the collection to import objects into |

Request body format: JSON array

The request body must be a JSON array containing multiple document objects. Each object should include the required fields for your collection schema.

Here’s an example JSON array format for two objects:

[
  {
    "uid": "article-123",
    "title": "Getting Started Guide",
    "content": "Complete guide to getting started...",
    "url": "https://example.com/guide",
    "source": "Documentation",
    "snippet": "Quick start guide for new users"
  },
  {
    "uid": "article-456",
    "title": "Advanced Features",
    "content": "Advanced functionality overview...",
    "url": "https://example.com/advanced",
    "source": "Documentation",
    "snippet": "Deep dive into advanced capabilities"
  }
]

Content-Type header

Set the Content-Type header to:

  • application/json

Response

A successful request returns a 200 OK status code and a summary of the import operation:

{
  "success": true,
  "count": 3,
  "errors": []
}

If some objects fail to import while others succeed, you’ll receive a 207 Multi-Status response:

{
  "success": true,
  "count": 2,
  "errors": [
    {
      "line": 2,
      "error": "Invalid object format at line 2",
      "object": "{malformed json}"
    }
  ]
}

Error scenarios

| Status | Description |
|---|---|
| 400 | Invalid JSON format or request body |
| 404 | Collection not found |
| 413 | Request body too large |

Example request using curl

curl -X POST "https://answers.tallyfy.com/collections/docs/batch" \
-H "Authorization: Bearer your_api_key" \
-H "Content-Type: application/json" \
  -d '[
    {
      "uid": "guide-123",
      "title": "Getting Started",
      "content": "Complete guide to getting started with our platform...",
      "url": "https://example.com/guide",
      "source": "Documentation",
      "snippet": "Quick start guide for new users"
    }
  ]'
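The same request can be scripted. A standard-library Python sketch, assuming your API key is available in an `ANSWERS_API_KEY` environment variable (the helper name and variable are illustrative, not part of the API):

```python
import json
import os
import urllib.request

def build_batch_request(collection, objects, api_key,
                        base_url="https://answers.tallyfy.com"):
    """Build the POST /collections/{collection_name}/batch request."""
    data = json.dumps(objects).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/collections/{collection}/batch",
        data=data,
        method="POST",
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

objects = [{
    "uid": "guide-123",
    "title": "Getting Started",
    "content": "Complete guide to getting started with our platform...",
}]
req = build_batch_request("docs", objects, os.environ.get("ANSWERS_API_KEY", ""))
# urllib.request.urlopen(req) would send the request; the JSON response body
# is the import summary shown above ("success", "count", "errors").
```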

Best practices for batch imports

  • Size limitations: Keep individual requests under 5MB
  • Chunking: For large datasets, break imports into smaller batches of 100-1000 objects[^1]
  • ID handling: You can mix objects with and without custom IDs in the same import
  • Error handling: Process the errors array in the response to identify and fix failed objects
  • Parallel imports: For very large datasets, you can make multiple parallel import requests to different collections
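The chunking advice above can be sketched as a small helper; a batch size of 500 is simply a midpoint of the suggested 100-1000 range:

```python
def chunked(objects, batch_size=500):
    """Yield successive slices of at most batch_size objects."""
    for start in range(0, len(objects), batch_size):
        yield objects[start:start + batch_size]

# Example: 1,250 objects split into batches of 500, 500, and 250.
batches = list(chunked([{"uid": f"doc-{i}"} for i in range(1250)]))
# Each batch would then be POSTed to /collections/{collection_name}/batch.
```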

When to use batch import

  • Initial data population
  • Regular data synchronization from external systems
  • Bulk updates to multiple objects
  • Periodic content refreshes

Footnotes

[^1]: Optimal range balances memory usage, network timeout risk, and API processing efficiency