I have a server (Ubuntu Server) on the local network at IP address 192.168.1.9.
This server runs RabbitMQ in Docker.
I defined a basic Celery app:

```python
from celery import Celery

app = Celery(
    'tasks',
    brocker='pyamqp://<username>:<password>@localhost//',
    backend='rpc://',
)

@app.task
def add(x, y):
    return x + y
```
Connected to the server, I start the worker with:

```
celery -A tasks worker --loglevel=INFO -c 2 -E
```
On my local laptop, in a Python shell, I try to execute the task remotely by creating a new Celery instance, this time with the IP address of my remote server:
```python
from celery import Celery

app = Celery(
    'tasks',
    brocker='pyamqp://<username>:<password>@192.168.1.9//',
    backend='rpc://',
)

result = app.send_task('add', (2, 2))
# Note: I also tried app.send_task('tasks.add', (2, 2))
```
And from there nothing happens: the task stays PENDING forever. I can't see anything in the logs, and the server doesn't seem to pick up the task.
If I connect to the server and run the same commands locally (but with localhost
as the address) it works fine.
What is wrong? How can I send tasks remotely?
Thank you.
2 Answers
Actually, there is just a typo: the argument is named `brocker` instead of `broker`.

Also, the registered task name is your Celery app module's path plus the task name, because that is the file you defined the task in. It should be `tasks.add`, not `add`.

Alternatively, you can start your worker with the DEBUG log level, which will list all registered tasks.
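With the typo fixed, a minimal corrected client might look like the sketch below (the credentials and server address are the placeholders from the question; `tasks.add` assumes the worker was started from the `tasks` module as shown above):

```python
from celery import Celery

app = Celery(
    'tasks',
    broker='pyamqp://<username>:<password>@192.168.1.9//',  # 'broker', not 'brocker'
    backend='rpc://',
)

# The task is registered under '<module>.<task name>', i.e. 'tasks.add'.
result = app.send_task('tasks.add', (2, 2))

# Once a worker picks the task up, this should return 4.
print(result.get(timeout=10))
```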
But IMO you should use an API like Flower's (https://flower.readthedocs.io/en/latest/api.html) to have a more stable way of submitting tasks remotely.
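As a sketch of what a Flower-based submission could look like: Flower's HTTP API documents a `POST /api/task/async-apply/<task_name>` endpoint, and 5555 is Flower's default port; the host below assumes Flower runs on the same server. This snippet only builds the request, since actually sending it requires a running Flower instance:

```python
import json

# Assumptions: Flower running on the RabbitMQ host, default port 5555.
FLOWER_URL = "http://192.168.1.9:5555"
TASK_NAME = "tasks.add"

# Flower's documented task-submission endpoint.
url = f"{FLOWER_URL}/api/task/async-apply/{TASK_NAME}"
payload = json.dumps({"args": [2, 2]})

print(url)      # http://192.168.1.9:5555/api/task/async-apply/tasks.add
print(payload)  # {"args": [2, 2]}

# To actually send it (needs a reachable Flower instance):
# import urllib.request
# req = urllib.request.Request(
#     url,
#     data=payload.encode(),
#     headers={"Content-Type": "application/json"},
#     method="POST",
# )
# urllib.request.urlopen(req)
```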