Robots.txt: disallow a subdirectory but allow the directory

I want to allow crawling of files in:

/directory/ 

but not crawling of files in:

 /directory/subdirectory/ 

Is the correct robots.txt instruction:

    User-agent: *
    Disallow: /subdirectory/

I am afraid that if I disallowed /directory/subdirectory/ I would also block crawling of all the files in /directory/, which I do not want to do, so am I correct in using:

    User-agent: *
    Disallow: /subdirectory/
2 answers

You've overthought it:

    User-agent: *
    Disallow: /directory/subdirectory/

is correct.
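
A quick way to sanity-check the rule outside of any real crawler is Python's standard urllib.robotparser module; the host and page names below are placeholders:

    from urllib.robotparser import RobotFileParser

    # The rule from this answer: block only the subdirectory.
    rules = "User-agent: *\nDisallow: /directory/subdirectory/"

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # /directory/ itself stays crawlable...
    print(rp.can_fetch("*", "https://example.com/directory/page.html"))               # True
    # ...while anything under /directory/subdirectory/ is blocked.
    print(rp.can_fetch("*", "https://example.com/directory/subdirectory/page.html"))  # False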

    User-agent: *
    Disallow: /directory/subdirectory/

Spiders aren't stupid, they can work out the path :)
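
Conversely, the rule the question was leaning toward, Disallow: /subdirectory/, would not block the subdirectory at all, because Disallow values are matched as prefixes of the URL path starting at the root. A minimal check with the same standard-library parser (placeholder URL again):

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.parse("User-agent: *\nDisallow: /subdirectory/".splitlines())

    # The URL path does not start with /subdirectory/, so the rule never
    # matches and the page stays crawlable, i.e. it would NOT block
    # /directory/subdirectory/ as hoped.
    print(rp.can_fetch("*", "https://example.com/directory/subdirectory/page.html"))  # True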

