Robots.txt with Jigsaw
Tomas Norre
Setting up a robots.txt
file can help tell search engines what to crawl on your website and what not.
Setting it up within Jigsaw
is fairly simple.
This is how I did it.
Add a robots.txt
file to source/assets/
with the desired content. I have the following in mine:
User-Agent: *
Allow: /
Sitemap: https://blog.tomasnorre.dk/sitemap.xml
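My file allows everything, but robots.txt also supports Disallow rules if you want to keep crawlers out of parts of the site. A hypothetical variant (the /drafts/ path is just an example, not something from my site) could look like:

```text
User-Agent: *
Disallow: /drafts/
Allow: /

Sitemap: https://blog.tomasnorre.dk/sitemap.xml
```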
To make sure this gets published, I have added a small copy instruction to my npm run prod
build step.
"scripts": {
"prod": "mix --production && cp source/assets/robots.txt build_production/robots.txt"
},
This ensures that the robots.txt
file gets copied to the build_production
folder, which is the one that I publish in the end.
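If you want to double-check that the copy step actually works, here is a minimal sketch (not part of the post's setup) that simulates it in a throwaway directory. The folder names mirror my setup; adjust source/assets and build_production to match yours.

```shell
set -e

# Simulate the project layout in a temporary directory.
tmp=$(mktemp -d)
mkdir -p "$tmp/source/assets" "$tmp/build_production"
printf 'User-Agent: *\nAllow: /\n' > "$tmp/source/assets/robots.txt"

# The same copy that runs at the end of "npm run prod".
cp "$tmp/source/assets/robots.txt" "$tmp/build_production/robots.txt"

# Confirm the file landed in the publish folder before deploying.
if [ -f "$tmp/build_production/robots.txt" ]; then
  result="copied"
else
  result="missing"
fi
echo "$result"

rm -rf "$tmp"
```

After deploying, you can also just request /robots.txt on the live site to confirm it is being served.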
Reflection
I honestly don't know if this is the most elegant way to solve this, but at least it's simple. I would be happy to hear if you have better suggestions.
If you find any typos or incorrect information, please reach out on GitHub so that we can have the mistake corrected.