Ask Solem (ask)
* If you don’t care about the results of a task, be sure to set the ignore_result option, as storing results wastes time and resources:
@celery.task(ignore_result=True)
def mytask():
    something()
Results can even be disabled globally using the CELERY_IGNORE_RESULT setting.
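A minimal sketch of the global form, assuming an old-style celeryconfig.py configuration module:

# celeryconfig.py -- never store task results anywhere
CELERY_IGNORE_RESULT = True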
* Or, instead of routing it, you could rate limit the task so that only 10 tasks of this type can be processed in a minute (10/m), as in the sketch below:
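A minimal sketch of the rate-limited form, reusing the placeholder mytask and something() names from above:

@celery.task(rate_limit="10/m")
def mytask():
    # at most 10 executions of this task per minute, per worker
    something()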
rca / tasks.py (created March 27, 2012)
break a big celery job into smaller, batched chunks
"""
Celery tasks that batch a job with many tasks into smaller work sets.
The problem I'm attempting to solve is one where a job composed of many
tasks (say 100) will crowd out a job composed of only a few tasks (say 5).
It appears that, by default, Celery queues the second job's 5 tasks behind
the first job's 100, so the second job has to wait for the first job to
complete before it even begins.