
Stack:

Django 3.0.2
python 3.8.1
celery 4.4.0
redis 3.2.0

Command to start Celery: celery -A app_project worker -l info

I am using Celery to run background tasks in my Django project. On the development machine it ran perfectly without any error. Development and production run on the same stack; I have checked and matched them manually. Yet I am still facing this issue in production.

Problem:
Celery throws the error below when the task queries the database.

Traceback (most recent call last):
  File "/webapps/app/.virtualenvs/base38/local/lib/python3.8/site-packages/celery/app/trace.py", line 385, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/webapps/app/.virtualenvs/base38/local/lib/python3.8/site-packages/celery/app/trace.py", line 650, in __protected_call__
    return self.run(*args, **kwargs)
  File "/webapps/app/backend/app/tasks.py", line 22, in task_assign_photo_to_dish
    if dishq.exists():
  File "/webapps/app/.virtualenvs/base38/local/lib/python3.8/site-packages/django/db/models/query.py", line 777, in exists
    return self.query.has_results(using=self.db)
  File "/webapps/app/.virtualenvs/base38/local/lib/python3.8/site-packages/django/db/models/sql/query.py", line 537, in has_results
    return compiler.has_results()
  File "/webapps/app/.virtualenvs/base38/local/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1114, in has_results
    return bool(self.execute_sql(SINGLE))
  File "/webapps/app/.virtualenvs/base38/local/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1142, in execute_sql
    cursor = self.connection.cursor()
  File "/webapps/app/.virtualenvs/base38/local/lib/python3.8/site-packages/django/utils/asyncio.py", line 26, in inner
    return func(*args, **kwargs)
  File "/webapps/app/.virtualenvs/base38/local/lib/python3.8/site-packages/django/db/backends/base/base.py", line 260, in cursor
    return self._cursor()
  File "/webapps/app/.virtualenvs/base38/local/lib/python3.8/site-packages/django/db/backends/base/base.py", line 238, in _cursor
    return self._prepare_cursor(self.create_cursor(name))
  File "/webapps/app/.virtualenvs/base38/local/lib/python3.8/site-packages/django/db/backends/base/base.py", line 228, in _prepare_cursor
    self.validate_thread_sharing()
  File "/webapps/app/.virtualenvs/base38/local/lib/python3.8/site-packages/django/db/backends/base/base.py", line 553, in validate_thread_sharing
    raise DatabaseError(
django.db.utils.DatabaseError: DatabaseWrapper objects created in a thread can only be used in that same thread. The object with alias 'default' was created in thread id 139987604641728 and this is thread id 139987115662208.
[2020-02-05 23:04:51,621: ERROR/MainProcess] Signal handler <bound method DjangoWorkerFixup.on_task_postrun of <celery.fixups.django.DjangoWorkerFixup object at 0x7f515c6ebd30>> raised: DatabaseError("DatabaseWrapper objects created in a thread can only be used in that same thread. The object with alias 'default' was created in thread id 139987604641728 and this is thread id 139987115662208.")

Update:

from celery import shared_task
from django.db.models import Count

from .models import Dish, DishFile, Post  # model import path assumed


@shared_task()
def task_assign_photo_to_dish(id):
    dishq = Dish.objects.filter(pk=id)
    if dishq.exists():
        dish = dishq[0]
        # Drop any previously assigned photos for this dish
        DishFile.objects.filter(dish=dish).delete()
        # Posts for this dish, ordered by number of likes (most liked first)
        pq = Post.objects.filter(dish=dish).annotate(c=Count('liked_by')).order_by('-c')
        for p in pq[:5]:
            # Attach a random file from each of the top five posts
            instance = DishFile.objects.create(dish=dish, file=p.post_files.all().order_by('?')[0].file)
    return "Done"

4 Answers


  1. Are you on Windows?

    I had the same problem, but once I deployed to Heroku (which runs on Linux), it worked.

  2. The best practice is to build a dict in the view and pass it to the task as a parameter; a sketch follows below.
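    A minimal sketch of that idea (the view function and URL parameter names here are assumptions for illustration, not from the question):

    # views.py -- build a plain dict of serializable values and hand it to the task
    def update_dish(request, dish_id):
        payload = {"dish_id": dish_id}  # primitives only, never a model instance
        task_assign_photo_to_dish.delay(payload)

    # tasks.py -- the worker re-fetches the object from the database itself
    @shared_task()
    def task_assign_photo_to_dish(payload):
        dish = Dish.objects.filter(pk=payload["dish_id"]).first()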

  3. Use this command and it should work on Windows:

    celery -A your_project.celery worker --loglevel=info --pool=solo
    
  4. Use the thread pool in the Celery command:

    celery -A uploader worker  -l warning -n p80 --pool=threads --concurrency=100
    

    FYI: if you do not specify any pool, Celery will default to prefork.
