
The Airflow scheduler process crashes when we turn on a DAG and trigger it from the Airflow webserver.

Airflow version: v1.10.4

Redis server: v5.0.7

executor = CeleryExecutor

broker_url = 'redis://:password@redis-host:2287/0'
sql_alchemy_conn = postgresql+psycopg2://user:password@host/dbname

result_backend = 'db+postgresql://user:password@host/dbname'
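
One way to check exactly what string Celery receives for the result backend (and therefore which backend scheme it tries to import) is to print the executor's Celery configuration on the scheduler host. This is only a diagnostic sketch, assuming the Airflow 1.10.x module layout shown in the traceback below:

    # print the broker and result backend exactly as the Celery app sees them
    $ python -c "from airflow.executors.celery_executor import app; print(app.conf.broker_url)"
    $ python -c "from airflow.executors.celery_executor import app; print(app.conf.result_backend)"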

The scheduler crashes with the error message below:

scheduler_job.py:1325} ERROR - Exception when executing execute_helper
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/airflow/jobs/scheduler_job.py", line 1323, in _execute
    self._execute_helper()
  File "/usr/lib/python2.7/site-packages/airflow/jobs/scheduler_job.py", line 1412, in _execute_helper
    self.executor.heartbeat()
  File "/usr/lib/python2.7/site-packages/airflow/executors/base_executor.py", line 132, in heartbeat
    self.trigger_tasks(open_slots)
  File "/usr/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 203, in trigger_tasks
    cached_celery_backend = tasks[0].backend
  File "/usr/lib/python2.7/site-packages/celery/local.py", line 146, in __getattr__
    return getattr(self._get_current_object(), name)
  File "/usr/lib/python2.7/site-packages/celery/app/task.py", line 1037, in backend
    return self.app.backend
  File "/usr/lib/python2.7/site-packages/kombu/utils/objects.py", line 44, in __get__
    value = obj.__dict__[self.__name__] = self.__get(obj)
  File "/usr/lib/python2.7/site-packages/celery/app/base.py", line 1223, in backend
    return self._get_backend()
  File "/usr/lib/python2.7/site-packages/celery/app/base.py", line 940, in _get_backend
    self.loader)
  File "/usr/lib/python2.7/site-packages/celery/app/backends.py", line 74, in by_url
    return by_name(backend, loader), url
  File "/usr/lib/python2.7/site-packages/celery/app/backends.py", line 54, in by_name
    cls = symbol_by_name(backend, aliases)
  File "/usr/lib/python2.7/site-packages/kombu/utils/imports.py", line 57, in symbol_by_name
    module = imp(module_name, package=package, **kwargs)
  File "/usr/lib64/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
ImportError: No module named 'db

Why is the scheduler crashing when a DAG is triggered?
I’ve tried running pip install DB, but it didn’t solve the issue.

3 Answers


  1. As the error states, you most likely have not set up your db properly.

  2. Did you run

    $ airflow initdb

    before trying to start the webserver?
    Also, you seem to be using Python 2.7; are you sure it is compatible with the latest version of Airflow you are using?
    I was using Python 3.5.2 with the latest Airflow, and it did not work for me, so I had to downgrade my Airflow version a little.

  3. Newer Airflow releases are not compatible with Python 2.7.
    Run Airflow with Python 3.6, then create a database user and grant it privileges, and then run the command “airflow initdb” (a combined sketch of these steps follows the answers). This will initialize Airflow's metadata database.

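
Taken together, answers 2 and 3 describe a typical first-time setup of the metadata database. The sketch below is only illustrative and assumes a local PostgreSQL instance; the database name, user, and password are placeholders, and airflow initdb is the Airflow 1.10.x command (renamed to airflow db init in Airflow 2.x):

    # create a dedicated metadata database and user (placeholder names and credentials)
    $ sudo -u postgres psql -c "CREATE USER airflow WITH PASSWORD 'password';"
    $ sudo -u postgres psql -c "CREATE DATABASE airflow OWNER airflow;"
    $ sudo -u postgres psql -c "GRANT ALL PRIVILEGES ON DATABASE airflow TO airflow;"

    # initialize the metadata tables pointed to by sql_alchemy_conn, then start the services
    $ airflow initdb
    $ airflow webserver -p 8080
    $ airflow scheduler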