Handling idempotency in webhooks and API
When integrating with Tallyfy’s webhooks and API, you need to design your systems to handle events that might occur multiple times. Idempotency means that performing the same operation multiple times produces the same result as performing it once. This is essential for building reliable integrations.
In Tallyfy, certain actions can trigger webhooks multiple times, and external systems might send duplicate API requests. Without proper idempotency handling, you could end up with duplicate records, incorrect data states, or failed operations.
Webhooks in Tallyfy can fire more than once for the same event. Here’s a common example:
- A user completes a task
- Your webhook receives the “task completed” event
- The user reopens the same task
- The user completes it again
- Your webhook receives another “task completed” event for the same task
Without idempotency handling, your system might:
- Create duplicate records in your database
- Send multiple notifications for the same event
- Process payments or orders twice
- Update inventory counts incorrectly
When you configure webhooks at the process level, they fire for every task completion within that process. This means:
- A process with 10 tasks will generate 10 webhook events
- If tasks are reopened and completed again, you’ll receive additional events
- Your receiving system needs to handle this volume efficiently
Your external systems might send duplicate events to Tallyfy through the API. For example:
- A helpdesk system sends two webhook notifications for the same ticket update
- Network issues cause a retry mechanism to send the same API request twice
- A user accidentally triggers the same action multiple times in your integrated system
Every webhook payload from Tallyfy includes unique identifiers. Always use these to detect duplicates:
{ "event": "task.completed", "task_id": "abc123", "process_id": "xyz789", "completed_at": "2024-01-15T10:30:00Z", "completed_by": "user@example.com"}
Implementation approach:
- Store the `task_id` and `completed_at` timestamp in your database (a sketch of a database-enforced variant follows this list)
- Before processing a webhook, check if this combination already exists
- If it exists, skip processing or update the existing record
- If it's new, process the webhook normally
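You can also push the check into the database itself: a unique constraint makes the insert atomic, so two concurrent deliveries can't both slip past a check-then-insert race. A minimal sketch, assuming a MySQL-style `INSERT IGNORE`, an illustrative `task_completions` table, and a hypothetical `db` query helper like the one used later in this article:

```javascript
// Sketch: a UNIQUE constraint on (task_id, completed_at) lets the database
// reject duplicate completions atomically, with no separate lookup.
async function recordCompletion(payload) {
  const result = await db.query(
    'INSERT IGNORE INTO task_completions (task_id, completed_at, completed_by) VALUES (?, ?, ?)',
    [payload.task_id, payload.completed_at, payload.completed_by]
  );
  // affectedRows === 0 means the row already existed, i.e. a duplicate event
  return result.affectedRows > 0;
}
```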
Create a deduplication table or cache to track processed events:
```sql
CREATE TABLE processed_events (
  event_id VARCHAR(255) PRIMARY KEY,
  event_type VARCHAR(100),
  processed_at TIMESTAMP,
  payload JSON
);
```
When receiving a webhook:
- Generate a unique event ID combining `task_id + event_type + timestamp`
- Check if this event ID exists in your table
- If not, process the event and store the ID
- If it exists, log it and skip processing
When handling duplicate API requests from external systems:
- Return success for duplicate requests: If an external system tries to create the same record twice, return a success response with the existing record instead of an error.
- Use conditional updates: When updating form fields via API, check the current value first:
  - If the value is already what you want to set, skip the update
  - If it's different, proceed with the update
  - Add a comment noting the update for audit trails
- Implement request IDs: Require external systems to send a unique request ID with each API call:

  ```
  X-Request-ID: unique-request-identifier-123
  ```

  Store these IDs temporarily (e.g., for 24 hours) to detect retries; a middleware sketch follows this list.
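A middleware sketch for this pattern, assuming an Express-style server and the same hypothetical `cache` helper used in the code samples later in this article:

```javascript
// Sketch: short-circuit any request whose X-Request-ID was already handled,
// replaying the cached response instead of re-running the operation.
async function requestIdMiddleware(req, res, next) {
  const requestId = req.header('X-Request-ID');
  if (!requestId) {
    return res.status(400).json({ error: 'X-Request-ID header is required' });
  }
  const cached = await cache.get(`request:${requestId}`);
  if (cached) {
    return res.status(200).json(cached); // duplicate: replay the stored result
  }
  res.locals.requestId = requestId; // downstream handlers cache under this ID
  next();
}
```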
When a task is completed and potentially reopened:
- Store the task's completion history:

  ```json
  {
    "task_id": "abc123",
    "completions": [
      { "completed_at": "2024-01-15T10:30:00Z", "completed_by": "user1@example.com" },
      { "completed_at": "2024-01-15T14:45:00Z", "completed_by": "user2@example.com" }
    ]
  }
  ```

- Decide on your business logic (a sketch follows this list):
  - Process only the first completion
  - Process all completions but track them separately
  - Process the most recent completion only
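If you choose to process only the first completion, for example, the handler might look like this sketch (`getCompletionHistory`, `saveCompletion`, and `runDownstreamActions` are hypothetical helpers standing in for your own storage and business logic):

```javascript
// Sketch: record every completion, but only run downstream side effects
// (notifications, orders, etc.) the first time a task is completed.
async function handleTaskCompleted(payload) {
  const history = await getCompletionHistory(payload.task_id);
  const isFirstCompletion = history.completions.length === 0;

  await saveCompletion(payload.task_id, {
    completed_at: payload.completed_at,
    completed_by: payload.completed_by,
  });

  if (!isFirstCompletion) {
    console.log('Task re-completed, skipping side effects:', payload.task_id);
    return;
  }
  await runDownstreamActions(payload);
}
```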
For processes with many tasks:
- Batch processing: Collect webhook events and process them in batches every few minutes
- Use queues: Implement a message queue to handle high volumes without overwhelming your system
- Filter by task type: Use the webhook payload to identify specific tasks you care about
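A sketch combining these ideas, assuming an Express-style endpoint and a generic message-queue client (the `queue` object is hypothetical; in practice it could be RabbitMQ, SQS, or similar):

```javascript
// Sketch: acknowledge webhooks immediately and enqueue them, so bursts of
// task-completion events never overwhelm the handler.
app.post('/webhooks/tallyfy', async (req, res) => {
  // Filter before enqueueing: only keep the events you care about
  if (req.body.event === 'task.completed') {
    await queue.publish('tallyfy-events', req.body);
  }
  res.status(200).json({ received: true }); // always respond 2xx (see below)
});

// A worker drains the queue at its own pace, in batches if desired
queue.subscribe('tallyfy-events', async (payload) => {
  await processWebhook(payload); // the idempotent handler shown later
});
```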
When external systems integrate with Tallyfy:
- Implement idempotency keys: Generate a unique key for each operation:

  ```
  POST /api/v1/processes/launch
  X-Idempotency-Key: ticket-12345-launch-attempt-1
  ```

- Use conditional requests: Check if an action has already been performed (a sketch follows this list):
  - Query existing processes before launching a new one
  - Check task status before attempting to complete it
  - Verify form field values before updating them
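A sketch of a conditional launch, using a hypothetical `tallyfyApi` client like the one in the samples below (the `findProcesses`/`launchProcess` methods and the `external_ref` field are illustrative, not a documented API shape):

```javascript
// Sketch: look for an existing process tied to this ticket before launching,
// so a retried request cannot start the same process twice.
async function launchProcessForTicket(ticketId, blueprintId) {
  const existing = await tallyfyApi.findProcesses({ external_ref: ticketId });
  if (existing.length > 0) {
    return existing[0]; // treat the duplicate launch as a success
  }
  return tallyfyApi.launchProcess({
    blueprint_id: blueprintId,
    external_ref: ticketId, // stored so future lookups can find this process
  });
}
```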
Putting the webhook-side pieces together, here is a complete handler that deduplicates against the `processed_events` table defined earlier (`db` and `handleEvent` stand in for your own database client and business logic):

```javascript
async function processWebhook(payload) {
  // Generate a unique event key from fields present in every payload
  const eventKey = `${payload.task_id}-${payload.event}-${payload.completed_at}`;

  // Check if this event was already processed
  const existing = await db.query(
    'SELECT * FROM processed_events WHERE event_id = ?',
    [eventKey]
  );

  if (existing.length > 0) {
    console.log('Duplicate event detected, skipping:', eventKey);
    return { status: 'duplicate', message: 'Event already processed' };
  }

  // Process the event
  await handleEvent(payload);

  // Mark as processed
  await db.query(
    'INSERT INTO processed_events (event_id, event_type, processed_at) VALUES (?, ?, NOW())',
    [eventKey, payload.event]
  );

  return { status: 'processed', message: 'Event processed successfully' };
}
```
And an idempotent field update that combines request-ID caching with a conditional update:

```javascript
async function updateTaskField(taskId, fieldName, fieldValue, requestId) {
  // Check if this request was already processed
  const cachedResult = await cache.get(`request:${requestId}`);
  if (cachedResult) {
    return cachedResult;
  }

  // Get current task state
  const task = await tallyfyApi.getTask(taskId);

  // Check if update is needed
  if (task.fields[fieldName] === fieldValue) {
    const result = { status: 'unchanged', message: 'Field already has the desired value' };
    await cache.set(`request:${requestId}`, result, 86400); // Cache for 24 hours
    return result;
  }

  // Perform update
  const updatedTask = await tallyfyApi.updateTask(taskId, {
    fields: { [fieldName]: fieldValue }
  });

  // Add comment for audit trail
  await tallyfyApi.addComment(
    taskId,
    `Field "${fieldName}" updated to "${fieldValue}" via API integration`
  );

  const result = { status: 'updated', task: updatedTask };
  await cache.set(`request:${requestId}`, result, 86400);
  return result;
}
```
To ensure your integration handles duplicates correctly:
- Simulate duplicate webhooks: Manually trigger the same webhook multiple times
- Test network retries: Use tools to simulate failed requests that retry automatically
- Verify data consistency: Check that your data remains correct after duplicate events
- Monitor logs: Look for patterns of duplicate events in production
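A duplicate-webhook test can be as simple as replaying the same payload twice and asserting that the second pass is skipped (a sketch, assuming the `processWebhook` handler above and a Jest-style test runner):

```javascript
test('duplicate webhooks are processed only once', async () => {
  const payload = {
    event: 'task.completed',
    task_id: 'abc123',
    process_id: 'xyz789',
    completed_at: '2024-01-15T10:30:00Z',
  };
  const first = await processWebhook(payload);
  const second = await processWebhook(payload);

  expect(first.status).toBe('processed');
  expect(second.status).toBe('duplicate');
});
```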
| Issue | Cause | Solution |
|---|---|---|
| Duplicate records in database | Not checking for existing records before insert | Implement unique constraints and check before insert |
| Missing webhook events | Treating duplicates as errors | Log duplicates but don't fail the webhook response |
| Inconsistent data state | Processing events out of order | Use timestamps to ensure correct ordering |
| API rate limits from retries | Not caching successful responses | Implement response caching with appropriate TTL |
Important consideration
Always respond with a 2xx status code to webhook requests, even for duplicates. Returning error codes might cause Tallyfy to retry the webhook, creating more duplicates.
After implementing idempotency handling:
- Monitor your integration logs to identify patterns of duplicate events
- Adjust your deduplication window based on actual retry patterns
- Consider implementing more sophisticated event sourcing if you need full audit trails
- Review Tallyfy’s webhook documentation for the latest payload formats