Why Event-Driven Storage?
Traditional storage is passive: you put data in and pull data out. But modern applications need more:
- Process uploads immediately after they arrive
- Trigger workflows when files change
- Notify systems when data is deleted
- Build reactive data pipelines
ElasticLake webhooks make this easy.
How Webhooks Work
When enabled, ElasticLake sends HTTP POST requests to your endpoint whenever specified events occur in your buckets.
Supported Events
| Event | Description |
|---|---|
| `object:created` | New object uploaded |
| `object:deleted` | Object removed |
| `object:updated` | Object replaced or modified |
| `multipart:completed` | Large (multipart) upload finished |
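Because every payload carries an `event` field (see the payload format below), a common pattern is to route deliveries to per-event handlers. A minimal dispatcher sketch, where the handler functions are illustrative and not part of the ElasticLake API:

```python
def handle_created(event):
    # Placeholder handler for new uploads
    return f"created:{event['object']['key']}"

def handle_deleted(event):
    # Placeholder handler for removals
    return f"deleted:{event['object']['key']}"

# Map event names to handlers; unknown events are ignored
HANDLERS = {
    'object:created': handle_created,
    'object:deleted': handle_deleted,
}

def dispatch(event):
    handler = HANDLERS.get(event['event'])
    return handler(event) if handler else None
```

This keeps each handler small and makes it easy to subscribe to new event types later by adding one entry to the map.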
Setting Up Webhooks
Via Dashboard
1. Navigate to your bucket settings
2. Click the "Webhooks" tab
3. Add your endpoint URL
4. Select the events to subscribe to
5. Save and test
Via API
```python
import requests

response = requests.post(
    'https://api.elasticlake.com/v1/webhooks',
    headers={'Authorization': 'Bearer YOUR_TOKEN'},
    json={
        'bucket': 'lake1--pond1--my-bucket',
        'endpoint': 'https://myapp.com/webhooks/storage',
        'events': ['object:created', 'object:deleted'],
        'secret': 'my-webhook-secret'
    }
)
```
Webhook Payload
When an event occurs, you'll receive:
```json
{
  "event": "object:created",
  "timestamp": "2025-10-30T14:22:33Z",
  "bucket": "lake1--pond1--my-bucket",
  "object": {
    "key": "uploads/document.pdf",
    "size": 1048576,
    "etag": "d41d8cd98f00b204e9800998ecf8427e",
    "contentType": "application/pdf"
  },
  "lake": "production",
  "pond": "documents"
}
```
Verifying Webhooks
Always verify webhook signatures:
```python
import hmac
import hashlib

from flask import Flask, request

app = Flask(__name__)
WEBHOOK_SECRET = 'my-webhook-secret'  # the secret you registered with the webhook

def verify_webhook(payload, signature, secret):
    # Recompute the HMAC-SHA256 digest and compare in constant time
    expected = hmac.new(
        secret.encode(),
        payload.encode(),
        hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(f'sha256={expected}', signature)

# In your webhook handler
@app.route('/webhooks/storage', methods=['POST'])
def handle_webhook():
    signature = request.headers.get('X-ElasticLake-Signature', '')
    if not verify_webhook(request.data.decode(), signature, WEBHOOK_SECRET):
        return 'Invalid signature', 401
    event = request.json
    # Process event...
    return 'OK', 200
```
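You can sanity-check the verification logic locally by signing a sample payload yourself, the same way the sender does, and confirming the round trip. A self-contained sketch, assuming the `sha256=` signature prefix shown above:

```python
import hmac
import hashlib

def verify_webhook(payload, signature, secret):
    # Same constant-time comparison as in the handler above
    expected = hmac.new(secret.encode(), payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(f'sha256={expected}', signature)

secret = 'my-webhook-secret'
payload = '{"event": "object:created"}'

# Simulate what the sender computes for the X-ElasticLake-Signature header
sent_signature = 'sha256=' + hmac.new(
    secret.encode(), payload.encode(), hashlib.sha256
).hexdigest()

ok = verify_webhook(payload, sent_signature, secret)        # True: signature matches
bad = verify_webhook(payload, sent_signature, 'wrong-key')  # False: wrong secret
```

Note that `hmac.compare_digest` is used instead of `==` to avoid leaking timing information to an attacker probing your endpoint.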
Use Case Examples
Image Processing Pipeline
```python
def handle_upload(event):
    if event['object']['contentType'].startswith('image/'):
        # Trigger thumbnail generation
        generate_thumbnails.delay(
            bucket=event['bucket'],
            key=event['object']['key']
        )
```
Audit Logging
```python
def handle_any_event(event):
    # Record every delivery; audit_log is your application's logging layer
    audit_log.create({
        'action': event['event'],
        'resource': f"{event['bucket']}/{event['object']['key']}",
        'timestamp': event['timestamp'],
        'metadata': event
    })
```
Data Pipeline Trigger
```python
def handle_data_upload(event):
    if event['object']['key'].endswith('.csv'):
        # Trigger ETL job
        start_etl_pipeline(
            source=f"s3://{event['bucket']}/{event['object']['key']}"
        )
```
Best Practices
- Respond quickly: return a 200 within 30 seconds
- Process asynchronously: queue heavy work for background processing
- Handle retries: failed deliveries are retried, so make handlers idempotent
- Monitor delivery: check your webhook logs for failures
- Use HTTPS: always use secure endpoints
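The "process async" and "handle retries" points can be combined in one handler pattern: deduplicate the delivery, enqueue it, and return immediately. A sketch using an in-memory set and queue, which is illustrative only; a real deployment would back the dedup set with Redis or a database so duplicates are caught across processes and restarts:

```python
import queue

processed = set()          # delivery keys we have already accepted
work_queue = queue.Queue() # consumed by a background worker

def handle_webhook_event(event):
    # The payload carries no delivery id, so derive a stable dedup key
    # from fields that together identify one event occurrence.
    dedup_key = (event['event'], event['bucket'],
                 event['object']['etag'], event['timestamp'])
    if dedup_key in processed:
        return 'duplicate'  # a retry of a delivery we already handled
    processed.add(dedup_key)
    work_queue.put(event)   # heavy work happens off the request path
    return 'queued'
```

Because the handler only enqueues, it responds well within the 30-second window, and retried deliveries become harmless no-ops.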
Get Started
Webhooks are available on all plans. Check out our API documentation for complete details.
Ready to build event-driven applications? Start your free trial today.