
I am testing a setup where I need to load my own Postgres DDL into the Airflow Postgres DB. When I run docker-compose up, it should initialize automatically one time only, since the data is persisted afterwards, just as the Airflow DB normally works. Thanks

2 Answers


  1. Chosen as BEST ANSWER

    I have found a solution that works and runs your init scripts when you docker-compose up.

    Pro tip: if you want to add more files after you have already initialized the Airflow DB (or your own DB), run docker-compose down --volumes. This removes all the data in the data directory, which matters because the init scripts only run when the Postgres data directory is empty.

    postgres:
        image: postgres:13
    
        environment:
          POSTGRES_USER: airflow
          POSTGRES_PASSWORD: airflow
          POSTGRES_DB: airflow
    
    ports:
      - "5432:5432"
    
        volumes:
          - postgres-db-volume:/var/lib/postgresql/data
          - /path/to/my/host/folder/filename.sql:/docker-entrypoint-initdb.d/filename.sql
    
        healthcheck:
          test: ["CMD", "pg_isready", "-U", "airflow"]
          interval: 5s
          retries: 5
        restart: always
    
    volumes:
      postgres-db-volume:
    
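    For reference, every *.sql file mounted into /docker-entrypoint-initdb.d is executed once against the POSTGRES_DB database on first startup, and only when the data directory is empty. A minimal sketch of what filename.sql could contain (the table and columns here are hypothetical, not from the original setup):

        -- Hypothetical DDL; executed once on first container start.
        CREATE TABLE IF NOT EXISTS my_table (
            id   SERIAL PRIMARY KEY,
            name TEXT NOT NULL
        );

        -- Seed data is optional; init files may mix DDL and DML.
        INSERT INTO my_table (name) VALUES ('example row');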

  2. As requested in the last comment: Adding your own database to the Airflow Docker-compose file:

    Put this piece of code as a service somewhere amongst the other services:

      mypostgres:
        image: postgres:13
        environment:
          POSTGRES_USER: mydbuser
          POSTGRES_PASSWORD: securepassword
          POSTGRES_DB: mydb
        volumes:
          - ./database:/var/lib/postgresql/data
          - ./init-database.sh:/docker-entrypoint-initdb.d/init-database.sh
        restart: always
        
    

    Make sure you have a database directory and an init-database.sh file in the current directory; otherwise the volume mappings will fail.
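
    A minimal sketch of what init-database.sh could look like — the extra database and table names are hypothetical. The official postgres image runs *.sh files from /docker-entrypoint-initdb.d on first start (empty data directory) and exposes POSTGRES_USER and POSTGRES_DB as environment variables:

        #!/bin/bash
        # Runs once, inside the container, on first start only.
        set -e

        # Connect as the superuser from the compose file and create a
        # hypothetical extra database alongside the default one.
        psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" --dbname "$POSTGRES_DB" <<-EOSQL
            CREATE DATABASE extra_db;
        EOSQL

        # Create a hypothetical table inside the new database.
        psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" --dbname extra_db <<-EOSQL
            CREATE TABLE IF NOT EXISTS my_table (id SERIAL PRIMARY KEY, name TEXT);
        EOSQL

    Make the script executable on the host (chmod +x init-database.sh); non-executable .sh files are sourced by the entrypoint instead of run.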
