I use Celery tasks with external modules that send information to stdout with print(). How can I save all the prints of a Celery task to a predefined file? Ideally, each task would have its own separate file.
Example main.py (with Redis running on localhost:6379):
```python
import time

from celery import Celery


def long_run_func():
    print('>>> Start running long_run_func()')
    time.sleep(5)
    print('>>> End running long_run_func()')


celery = Celery('celery_task', broker='redis://localhost:6379')


@celery.task(name="long_run_celery_task")
def long_run_celery_task():
    long_run_func()


long_run_celery_task.delay()
```
Now run the Celery worker:

```
celery -A main:celery worker --loglevel=INFO
```
And if you run main.py, you will see the lines printed by long_run_func() in the Celery worker output:
```
[2024-01-11 17:30:52,746: WARNING/ForkPoolWorker-7] >>> Start running long_run_func()
[2024-01-11 17:30:57,751: WARNING/ForkPoolWorker-7] >>> End running long_run_func()
```
Is it possible to set up @celery.task to dump all these logs to some string variable or file? I mean, if I have several such tasks running at the same time, I want to be able to separate the output per task rather than mixing everything into a single log.
2 Answers
You can use the -f / --logfile command option to dump all logs to the file you want. For the full list of options, check --help, such as celery --help.
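For example (the %i variant relies on Celery's pool-process-index format specifier for log file names; verify it with celery worker --help on your version):

```
# Send all worker output, including the print() lines that Celery
# re-emits as WARNING log records, to a single file:
celery -A main:celery worker --loglevel=INFO --logfile=worker.log

# One log file per pool child process (not per task):
celery -A main:celery worker --loglevel=INFO --logfile=worker-%i.log
```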
There is a simple but dirty way, but the better way is to get a logger by name and set a handler on it; a sketch of that follows this paragraph. To capture print() output from external modules, all you have to do is write a logger class and redirect the stdout and stderr file descriptors to that class. I am using the pandas lib for timestamps, so I recommend it.
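A minimal sketch of the logger-by-name idea, assuming the task code can call the logger instead of print(); the per-task-id file naming is a hypothetical choice, not something Celery does for you:

```python
import logging

from celery import Celery
from celery.utils.log import get_task_logger

celery = Celery('celery_task', broker='redis://localhost:6379')
logger = get_task_logger(__name__)


@celery.task(name="long_run_celery_task", bind=True)
def long_run_celery_task(self):
    # One file per task invocation, named after the task id.
    handler = logging.FileHandler(f'task-{self.request.id}.log')
    handler.setFormatter(logging.Formatter('%(asctime)s %(message)s'))
    logger.addHandler(handler)
    try:
        logger.info('>>> Start running long_run_func()')
        # ... the actual work ...
        logger.info('>>> End running long_run_func()')
    finally:
        # Detach so the handler does not leak into the next task
        # executed by the same worker process.
        logger.removeHandler(handler)
        handler.close()
```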
The code could look as follows.
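A minimal sketch, assuming pandas.Timestamp for the time prefix as the answer suggests; StreamToLogger and redirect_output_to_file are illustrative names, not Celery API:

```python
import logging
import sys

import pandas as pd


class StreamToLogger:
    """File-like object that forwards write() calls to a logger."""

    def __init__(self, logger, level=logging.INFO):
        self.logger = logger
        self.level = level

    def write(self, message):
        # print() calls write() with trailing newlines; log each
        # non-empty line with a pandas timestamp prefix.
        for line in message.rstrip().splitlines():
            self.logger.log(self.level, '%s %s', pd.Timestamp.now(), line.rstrip())

    def flush(self):
        pass  # logging handlers flush themselves


def redirect_output_to_file(task_id):
    """Route stdout/stderr of the current process to a per-task file."""
    logger = logging.getLogger(f'task.{task_id}')
    logger.setLevel(logging.INFO)
    logger.propagate = False  # avoid re-entering stderr via the root logger
    logger.addHandler(logging.FileHandler(f'task-{task_id}.log'))
    sys.stdout = StreamToLogger(logger, logging.INFO)
    sys.stderr = StreamToLogger(logger, logging.ERROR)
```

Note that assigning sys.stdout affects the whole worker child process, not just one task, so with a prefork pool this separates output per process while a given task runs; save and restore the original streams (or use contextlib.redirect_stdout around the task body) if other tasks share the process.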