Robots.txt with Jigsaw
Setting up a robots.txt can help tell search engines what to crawl on your website and what not.
Setting it up within Jigsaw is fairly simple.
This is how I did it:
Add a robots.txt file to source/assets/ with the desired content. I have the following in mine:
User-Agent: *
Allow: /
Sitemap: https://blog.tomasnorre.dk/sitemap.xml
To make sure this gets published, I have added a small copy instruction to my npm run prod build step.
"scripts": {
"prod": "mix --production && cp source/assets/robots.txt build_production/robots.txt"
},
This ensures that the robots.txt file gets copied to the build_production folder, which is the one I publish in the end.
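A possible alternative would be to let Laravel Mix do the copying instead of chaining a cp command onto the npm script. This is just a sketch, assuming Laravel Mix's copy() helper and the same source/assets/robots.txt and build_production paths as above:

// webpack.mix.js (sketch): copy robots.txt into the production build output
// as part of the Mix run. Adjust build_production if you publish a different
// environment folder.
const mix = require('laravel-mix');

mix.copy('source/assets/robots.txt', 'build_production/robots.txt');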
Reflection
I honestly don't know if this is the most elegant way to solve this, but at least it's simple. I would be happy to hear if you have better suggestions.
Hire Me?
I work as a freelancer in my company 7th Green, specializing in PHP development and DevOps. My main strengths include TYPO3, PHP in general, DevOps and Automation.
Please reach out; I will be happy to talk about your project.