celery Flashcards
celery: A broker is
the URL of your message/task queue, e.g. Redis or RabbitMQ
note: for RabbitMQ it starts like broker="amqp://…"
note: the URL that works locally is amqp://localhost//
port 5672 is RabbitMQ's default
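A minimal sketch of pointing celery at a local RabbitMQ broker (the file name here is hypothetical):

# celery_tasks.py (hypothetical file name)
from celery import Celery

# amqp://localhost// assumes RabbitMQ is running locally on the default port 5672
app = Celery("celery_tasks", broker="amqp://localhost//")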
celery: To send a function call to your message queue, so celery will execute it, type
function_name.delay("param_1")
celery: The basic flow is
You initialize the app with app = Celery('celery_file_name', broker='broker/url'). You create a function and use the @app.task decorator to give it a .delay('param_1') method. Calling that method sends the function name and parameters to your broker; a celery worker listens to the broker and executes the function.
note: from celery import Celery
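A minimal end-to-end sketch of that flow (the file, task, and argument names are hypothetical):

# celery_tasks.py (hypothetical module name)
from celery import Celery

app = Celery("celery_tasks", broker="amqp://localhost//")

@app.task
def add(x, y):
    # executed by the worker process, not by the caller
    return x + y

# From any code that imports this module:
#   add.delay(2, 3)
# sends a task message to the broker; a running
# "celery -A celery_tasks worker" process picks it up and runs add(2, 3).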
celery: To run functions from your celery task file in a Python shell, type
from celery_tasks_file import *
function_name.delay("param_1")
celery: To create a celery task, type
@app.task
def function_name(param_1):
    return param_1
celery: To start up a celery process that begins listening for messages from the broker, type
celery -A file_name worker
note: add --loglevel=info to the end to see live logging
This will start a process that listens for broker messages and runs the functions that are triggered
celery: To add a results backend to your celery instance, type
app = Celery("celery_file_name", broker="amqp://…", backend="db+postgresql://…/database_name")
note: After you send your first task, celery automatically creates the result tables. People usually use Redis or Mongo for the backend.
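A sketch of the same app with Redis as the result backend instead, assuming a local Redis server on the default port 6379:

app = Celery("celery_file_name",
             broker="amqp://localhost//",
             backend="redis://localhost:6379/0")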
celery: Adding a results backend to celery allows you to
check the status of a task instance and retrieve its result
result_var = function_name.delay("param_1")
result_var.get()    # returns the result
result_var.ready()  # returns True if the task has finished, False if it is still running or hasn't begun
result_var.status   # returns 'PENDING' or 'SUCCESS'
celery: For periodic tasks, celery requires
both a worker process and a beat process
celery: To start a celery beat process, type
celery -A celery_file_name beat -l info
celery: To run a task every 10 seconds, type
in your celery_task_file.py:
@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(10.0, task_function_name.s('param_1'), name='any_periodic_task_name')
then run the worker process and the beat process
note: You can put several sender.add_periodic_task calls in the same setup_periodic_tasks function
celery: To run a task at a specific time every week, type
from celery.schedules import crontab
@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(
        crontab(hour=7, minute=30, day_of_week=1),
        task_function_name.s('param_1'),
    )
celery: The purpose of the worker process is to
listen for messages from the broker and execute the functions when asked
celery: To run RabbitMQ, type
/usr/local/sbin/rabbitmq-server
celery: To add a celery worker and beat process to Heroku, add
to your Procfile
worker: celery worker --app=app_name.celery_tasks.app
beat: celery beat --app=app_name.celery_tasks.app
note: app_name here is likely the project package that contains celery_tasks