I’m trying to run a local Postgres database with the PostGIS extension, and then populate the database with a shapefile, without having to load the data in manually.
I have a docker-compose.yml where I first use the postgis Docker image and expose port 5432. I've already tested this part: while that container is running I can connect successfully.
The second service, import-shapefile, uses the postgres image, installs the needed packages, converts the shapefile with shp2pgsql, and should load the result into the database running in the postgis container.
docker-compose.yml:
services:
  postgis:
    image: postgis/postgis
    restart: always
    env_file:
      - .env
    ports:
      - 5432:5432
    platform: linux/amd64
    volumes:
      - ./dataset:/dataset
  import-shapefile:
    image: postgres
    depends_on:
      - postgis
    volumes:
      - ./dataset:/dataset
    entrypoint: sh
    command: -c "apt-get update && apt-get install -y postgis && shp2pgsql -s 4326 -I -D -W UTF-8 dataset/my_shapefile.shp my_table | psql -U username -d my_db"
    environment:
      - POSTGRES_USER
      - POSTGRES_DB
Looking at the container logs, I see this error from the import-shapefile container:
psql: error: connection to server on socket "/var/run/postgresql/.s.PGSQL.5432" failed: No such file or directory
Is the server running locally and accepting connections on that socket?
Does anyone know how to resolve this within the docker-compose.yml file?
2 Answers
There is an easier way to achieve this with just one container. You can use Kartoza's PostGIS image, which comes with handy features such as running scripts when the container starts. The image also has shp2pgsql installed. Read about it here.
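If you go that route, a minimal single-container sketch could look like the snippet below. The variable names (POSTGRES_USER, POSTGRES_PASS, POSTGRES_DBNAME) and the init-script directory are assumptions based on my reading of the Kartoza image; check its README for the exact names before relying on them.

services:
  postgis:
    image: kartoza/postgis
    restart: always
    environment:
      - POSTGRES_USER=username
      - POSTGRES_PASS=password
      - POSTGRES_DBNAME=my_db
    ports:
      - 5432:5432
    volumes:
      - ./dataset:/dataset
      # assumption: scripts mounted here run when the database is first initialised
      - ./init/import_shapefile.sh:/docker-entrypoint-initdb.d/import_shapefile.sh

A hypothetical init/import_shapefile.sh would then just be the conversion pipe; it runs inside the same container, so psql can use the local socket:

#!/bin/sh
# convert the shapefile and load it into the database created at startup
shp2pgsql -s 4326 -I -D -W UTF-8 /dataset/my_shapefile.shp my_table \
  | psql -U "$POSTGRES_USER" -d "$POSTGRES_DBNAME"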
Your Postgres host is the name of the postgis service, hence postgis. You need to add it as an environment variable to your second container (or hard-code it) and point the psql client inside that container at it, for example:
psql -h $POSTGRES_HOST -d $POSTGRES_DB -U $POSTGRES_USER
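Applied to the compose file in the question, the import-shapefile service could look roughly like this. It is a sketch, not a tested file: it assumes your .env defines POSTGRES_PASSWORD, and the $$ escapes make the shell inside the container expand the variables instead of docker-compose.

import-shapefile:
  image: postgres
  depends_on:
    - postgis
  volumes:
    - ./dataset:/dataset
  environment:
    - POSTGRES_HOST=postgis            # the service name is the hostname on the compose network
    - POSTGRES_USER
    - POSTGRES_DB
    - PGPASSWORD=${POSTGRES_PASSWORD}  # psql reads PGPASSWORD automatically (assumes .env sets it)
  entrypoint: sh
  command: -c "apt-get update && apt-get install -y postgis && shp2pgsql -s 4326 -I -D -W UTF-8 dataset/my_shapefile.shp my_table | psql -h $$POSTGRES_HOST -U $$POSTGRES_USER -d $$POSTGRES_DB"

Keep in mind that depends_on only waits for the postgis container to start, not for Postgres to accept connections, so the import can still race the server. A small retry loop around psql, or a healthcheck on postgis combined with depends_on's condition: service_healthy, avoids that.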