I’m using nginx and pm2 to serve a Nuxt app (SSR).
I tried to use the sitemap module and the robots.txt module.
The sitemap module works fine and is accessible at /sitemap.xml.
However, the robots.txt file is not accessible through nginx. Running nuxt build and nuxt start locally, it works fine and I have access to /robots.txt. But through nginx I get a 404: not the Nuxt 404, but the nginx one. All other nonexistent URLs show the Nuxt 404.
With Nuxt SSR, the robots module does not generate a robots.txt file. It is a middleware, and the response to the /robots.txt URL is dynamically generated on each request.
Why do I get that 404 with the robots module but not with the sitemap?
Thanks for your help.
2 Answers
To quickly serve a robots.txt from nginx without having access to a physical file, you can define the content of robots.txt directly in the nginx config file.
Allow access to all User-agents:
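For example, a minimal sketch (assuming your server block proxies everything else to the Nuxt app; adjust the location to your setup):

```nginx
# Serve a permissive robots.txt straight from nginx, no file on disk needed.
location = /robots.txt {
    add_header Content-Type text/plain;
    return 200 "User-agent: *\nAllow: /\n";
}
```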
Disallow access to every User-agent:
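A sketch of the restrictive variant, under the same assumptions:

```nginx
# Block all crawlers by returning a disallow-everything robots.txt from nginx.
location = /robots.txt {
    add_header Content-Type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
}
```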
I’m using Google Cloud App Engine (GAE), which also uses Nginx to host my Nuxt.js app, and I had the same problem as yours: I could access my sitemap.xml but not robots.txt.
For GAE configuration I use an app.yaml file where, among other settings, you can define handlers for static files. You can explicitly specify that static file requests will be served from Nuxt’s ‘static’ folder. As usual, the types of potentially requested files are set as a regexp listing file extensions (e.g. .(gif|png|txt)$).
In my case I had a ‘.txt’ file among the other static files served from Nuxt’s ‘static’ folder. That’s why Nginx was looking for my robots.txt directly in the ‘static’ folder and never gave Nuxt a chance to generate it.
So when I removed the ‘.txt’ extension, everything worked fine and now I can get my robots.txt.
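As a hypothetical app.yaml excerpt illustrating the fix (the folder name and extension list are assumptions, not the exact config from the answer): with ‘txt’ removed from the static handler’s regexp, requests for /robots.txt fall through to the app handler and Nuxt can generate the response.

```yaml
# app.yaml sketch: static handler no longer matches .txt,
# so /robots.txt reaches the Nuxt server instead of the static folder.
handlers:
  - url: /(.*\.(gif|png|jpg|ico|css|js))$
    static_files: static/\1
    upload: static/.*\.(gif|png|jpg|ico|css|js)$
  - url: /.*
    script: auto
```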
I’m not familiar with Nginx configuration, but maybe you should check which static file extensions it serves directly from Nuxt’s ‘static’ folder and exclude the ‘.txt’ extension, so those requests reach Nuxt instead.
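If your nginx config looks anything like the following (a sketch of a common pattern, not the asker’s actual config), the equivalent fix would be to drop ‘txt’ from the extension regexp so /robots.txt is proxied to the Nuxt server:

```nginx
# Hypothetical static-file location: "txt" removed from the regexp
# so /robots.txt is no longer looked up on disk.
location ~* \.(gif|png|jpg|ico|css|js)$ {
    root /var/www/app/static;
}

# Everything else, including /robots.txt, goes to the Nuxt SSR server.
location / {
    proxy_pass http://127.0.0.1:3000;
}
```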