
I’m having an issue setting up Celery to work with my Flask app. I’ve used a barebones app to test the configuration and have found that my Celery worker starts but doesn’t pick up any of the tasks, unlike in all the tutorials. Calling a task’s .delay() method is supposed to send the function call to Celery to process in the background, but instead things hang because a connection cannot be made. So either my configuration is incorrect, or there is a bug in one of the versions of software I have installed that I am unaware of.

Here’s the contents of my requirements.txt file:

amqp==5.1.0
anyjson==0.3.3
async-timeout==4.0.2
beautifulsoup4==4.10.0
billiard==3.6.4.0
celery==5.2.3
cffi==1.15.0
click==8.0.4
click-didyoumean==0.3.0
click-plugins==1.1.1
click-repl==0.2.0
colorama==0.4.4
Deprecated==1.2.13
Flask==2.0.3
Flask-SQLAlchemy==2.5.1
greenlet==1.1.2
itsdangerous==2.1.2
Jinja2==3.1.1
kombu==5.2.4
MarkupSafe==2.1.1
packaging==21.3
prompt-toolkit==3.0.28
pycparser==2.21
pyparsing==3.0.7
pytz==2022.1
redis==4.2.0
six==1.16.0
soupsieve==2.3.1
SQLAlchemy==1.4.32
typing_extensions==4.1.1
vine==5.0.0
wcwidth==0.2.5
Werkzeug==2.0.3
wrapt==1.14.0
yahoofinancials==1.6

Here’s tasks.py. Note the commented-out line: for some reason the Celery worker doesn’t launch properly unless the backend is specified, which is also weird.

from celery import Celery
from time import sleep

#app = Celery('tasks', broker='redis://localhost:6379')
app = Celery('tasks', backend='redis://localhost', broker='pyamqp://localhost')

@app.task
def add(x, y):
    return x + y

@app.task
def reverse(myString):
    sleep(5)
    return myString[::-1]

The Celery worker starts fine in the virtual environment:

C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\projectFiles>..\Scripts\activate

(testApp) C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\projectFiles>celery -A tasks worker --loglevel=INFO

 -------------- celery@DESKTOP-GHMPTB0 v5.2.3 (dawn-chorus)
--- ***** -----
-- ******* ---- Windows-10-10.0.19043-SP0 2022-03-31 12:07:03
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x24f8cfca1a0
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     redis://localhost/
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . tasks.add
  . tasks.reverse

[2022-03-31 12:07:03,550: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2022-03-31 12:07:03,565: INFO/MainProcess] mingle: searching for neighbors
[2022-03-31 12:07:04,128: INFO/SpawnPoolWorker-1] child process 240 calling self.run()
[2022-03-31 12:07:04,128: INFO/SpawnPoolWorker-4] child process 13564 calling self.run()
[2022-03-31 12:07:04,128: INFO/SpawnPoolWorker-3] child process 8584 calling self.run()
[2022-03-31 12:07:04,128: INFO/SpawnPoolWorker-2] child process 8344 calling self.run()
[2022-03-31 12:07:04,611: INFO/MainProcess] mingle: all alone
[2022-03-31 12:07:04,642: INFO/MainProcess] celery@DESKTOP-GHMPTB0 ready.

And then sending a function call to Celery gives me a connection error. This is the part that stumps me.

(testApp) C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\projectFiles>python
Python 3.10.4 (tags/v3.10.4:9d38120, Mar 23 2022, 23:13:41) [MSC v.1929 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> from tasks import *
>>> result = add.delay(2,3)
Traceback (most recent call last):
  File "C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\lib\site-packages\redis\connection.py", line 614, in connect
    sock = self.retry.call_with_retry(
  File "C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\lib\site-packages\redis\retry.py", line 45, in call_with_retry
    return do()
  File "C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\lib\site-packages\redis\connection.py", line 615, in <lambda>
    lambda: self._connect(), lambda error: self.disconnect(error)
  File "C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\lib\site-packages\redis\connection.py", line 680, in _connect
    raise err
  File "C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\lib\site-packages\redis\connection.py", line 668, in _connect
    sock.connect(socket_address)
ConnectionRefusedError: [WinError 10061] No connection could be made because the target machine actively refused it

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\lib\site-packages\celery\backends\redis.py", line 119, in reconnect_on_error
    yield
  File "C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\lib\site-packages\celery\backends\redis.py", line 169, in _consume_from
    self._pubsub.subscribe(key)
  File "C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\lib\site-packages\redis\client.py", line 1549, in subscribe
    ret_val = self.execute_command("SUBSCRIBE", *new_channels.keys())
  File "C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\lib\site-packages\redis\client.py", line 1390, in execute_command
    self.connection = self.connection_pool.get_connection(
  File "C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\lib\site-packages\redis\connection.py", line 1386, in get_connection
    connection.connect()
  File "C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\lib\site-packages\redis\connection.py", line 620, in connect
    raise ConnectionError(self._error_message(e))
redis.exceptions.ConnectionError: Error 10061 connecting to localhost:6379. No connection could be made because the target machine actively refused it.

To confirm, I am running Python 3.10.4, which is a supported version for Celery.

(testApp) C:\Users\Owner\My Drive\Documents\Scripts\virtual_envs\testApp\projectFiles>python --version
Python 3.10.4

Does anyone see what is wrong? I can’t really move forward in my real project if I can’t get background tasks to work. I’m new to Celery and trying to figure it out, but I’m willing to switch brokers or scheduling software if I can’t make this work.

2 Answers


  1. "Connection refused" means there was nothing listening on the IP:port you tried to connect to — in this case localhost:6379. You should check whether your Redis instance is actually running.

     Debugging tip: remove the backend parameter and test again. If everything then works, you can be sure the problem is the Redis connection.
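     Before involving Celery at all, a quick way to confirm whether anything is listening on the Redis port is a plain TCP connection attempt (a minimal sketch; the host and port are assumptions taken from the default redis://localhost:6379 URL):

```python
# Quick reachability check: is anything listening on the Redis port?
# Host/port defaults are assumptions based on redis://localhost:6379.
import socket


def port_is_open(host="localhost", port=6379, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


print(port_is_open())  # False until a Redis server is actually running on 6379
```

     If this prints False, the ConnectionRefusedError above is expected — start a Redis server (or point the backend at one that is running) and retest.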

  2. Try running:

    celery -A tasks worker --loglevel=INFO -P solo

    or install eventlet:

    pip install eventlet

    and then run:

    celery -A tasks worker --loglevel=INFO -P eventlet
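    A likely reason these pool flags matter (my reading, not stated in the answer): Celery's default prefork pool is built around POSIX fork(), which Windows does not have, so Python falls back to the "spawn" start method there — visible in the SpawnPoolWorker lines of the worker log. The solo and eventlet pools avoid prefork entirely. This sketch just inspects the platform's default start method:

```python
# Celery's default prefork pool assumes POSIX fork(); Windows lacks fork(),
# so multiprocessing falls back to "spawn" there (hence SpawnPoolWorker in
# the worker log), and the solo/eventlet pools sidestep the issue.
import multiprocessing

method = multiprocessing.get_start_method()
print(method)  # "spawn" on Windows, typically "fork" on Linux
```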
