
I use an nginx Docker container as a proxy server, and there are other containers running: Nuxt.js and php-fpm.

This is my FPM site's nginx conf file:

server {
  server_name fpm.me.com;
  root /var/www/fpm/html;

  location / {
    # try to serve file directly, fallback to index.php
    try_files $uri /index.php$is_args$args;
  }

  #location ~ ^/index.php(/|$) {
  # https://stackoverflow.com/questions/68350978/nginx-serving-only-but-not-any-other-files
  location ~ \.php(/|$) {
    fastcgi_pass fpm:9000;
    fastcgi_split_path_info ^(.+\.php)(/.*)$;
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
    fastcgi_param DOCUMENT_ROOT $realpath_root;
    #internal;
  }

  location ~ \.php$ {
    return 404;
  }
  access_log off;
  error_log /dev/stderr;
}

But when I open fpm.me.com, I get this error in nginx's log file:

2021/08/03 09:31:18 [error] 31#31: *13 FastCGI sent in stderr: "Primary script unknown" while reading response header from upstream, client: MY_IP, server: fpm.me.com, request: "GET / HTTP/1.1", upstream: "fastcgi://172.20.0.8:9000", host: "fpm.me.com"

I have looked at other answers on Stack Overflow, Server Fault and other sites, but none of them helped.
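For anyone hitting the same thing: "Primary script unknown" means PHP-FPM accepted the request but could not open the file at the SCRIPT_FILENAME path nginx sent it. One way to see exactly which path nginx is computing is a temporary debug location — a sketch only, where `/debug-path` is a made-up URI and the block is assumed to sit inside the same `server` block:

```nginx
# Temporary debug endpoint (hypothetical URI): prints the path nginx would
# hand to PHP-FPM for the root document. Remove it after debugging.
location = /debug-path {
    default_type text/plain;
    return 200 "SCRIPT_FILENAME would be: $realpath_root/index.php\n";
}
```

If the printed path does not exist inside the fpm container's filesystem, the problem is the mounts, not the fastcgi directives.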

2 Answers


  1. Chosen as BEST ANSWER

The issue was with the volumes I mounted for nginx and fpm, which did not map the code to the same path in both containers.

    This is my compose file:

    services:
      nginx:
        image: nginx
        volumes:
          - './fpm/:/var/www/fpm/'
      fpm:
        image: custom
        volumes:
          - './fpm/:/var/www/'
    

Whereas, for it to work, it should have been like this:

    services:
      nginx:
        image: nginx
        volumes:
          - './fpm/:/var/www/fpm/'
      fpm:
        image: custom
        volumes:
          - './fpm/:/var/www/fpm/'
    
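A variant that makes this mistake harder to repeat, sketched under the assumption that the code can live in a named volume (`app-code` is a made-up name): mount one volume at the same path in both services, so any path nginx computes via `$realpath_root` is automatically valid inside the fpm container as well.

```yaml
services:
  nginx:
    image: nginx
    volumes:
      - app-code:/var/www/fpm   # same mount point in both containers
  fpm:
    image: custom
    volumes:
      - app-code:/var/www/fpm   # identical path, so SCRIPT_FILENAME resolves

volumes:
  app-code:
```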

  2. This doesn’t look valid:

fastcgi_pass fpm:9000;
    

    You have 2 options:

    • use a unix socket:

      fastcgi_pass unix:/run/php/php7.4-fpm.sock;

    • use TCP port

      fastcgi_pass 127.0.0.1:9000;

Option 1 is slightly faster and doesn't have the risk of port exhaustion if you receive many concurrent connections.
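Worth noting for the Docker case, though: `fastcgi_pass` also accepts a hostname, so `fastcgi_pass fpm:9000;` is syntactically fine when `fpm` is a Compose service name resolvable through Docker's embedded DNS — and a Unix socket only works across containers if the socket file sits on a shared volume. A sketch of the TCP variant using an `upstream` block (`php_backend` is a made-up name; `fpm` is assumed to be the Compose service):

```nginx
# Name the FPM backend once; Docker's embedded DNS resolves "fpm".
upstream php_backend {
    server fpm:9000;
}

server {
    # ...
    location ~ \.php(/|$) {
        include fastcgi_params;
        fastcgi_pass php_backend;
    }
}
```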
