I just came across a robots.txt file that looks like this:
    User-agent: *
    Disallow: /foobar

    User-agent: badbot
    Disallow: *
So does this mean that everyone else is only disallowed from /foobar, while the Disallow: * rule applies only to badbot?
Note: this question is only about understanding the above set of rules. I know that robots.txt is not an appropriate security mechanism, and I am not using it to protect anything.
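
To make what I am asking concrete, here is a rough Python sketch (my own simplification, not any crawler's actual implementation) of how I understand group selection: a crawler obeys the group whose User-agent line names it, and everyone else falls back to the * group.

    # Rough sketch, not any real crawler's code: each crawler uses the group
    # whose User-agent line names it; everybody else falls back to the "*" group.
    # (Real matching is looser -- case-insensitive matching of the product
    # token -- and "*" as a Disallow value only means "block everything" for
    # crawlers that support wildcards at all.)

    ROBOTS_TXT = """\
    User-agent: *
    Disallow: /foobar

    User-agent: badbot
    Disallow: *
    """

    def parse_groups(text):
        """Map each User-agent token to the Disallow values of its group."""
        groups, current_agents, expecting_agents = {}, [], True
        for line in text.splitlines():
            line = line.split('#', 1)[0].strip()
            if not line:
                continue
            field, _, value = line.partition(':')
            field, value = field.strip().lower(), value.strip()
            if field == 'user-agent':
                if not expecting_agents:      # a new group starts here
                    current_agents = []
                expecting_agents = True
                current_agents.append(value.lower())
                groups.setdefault(value.lower(), [])
            elif field == 'disallow':
                expecting_agents = False
                for agent in current_agents:
                    groups[agent].append(value)
        return groups

    def rules_for(groups, agent):
        """A group naming the crawler wins; '*' is only the fallback."""
        agent = agent.lower()
        return groups[agent] if agent in groups else groups.get('*', [])

    groups = parse_groups(ROBOTS_TXT)
    print(rules_for(groups, 'somebot'))  # ['/foobar'] -> only /foobar is off limits
    print(rules_for(groups, 'badbot'))   # ['*']       -> everything is off limits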
user857990