Limit robot access to specific query string parameter values?

Is it possible to use robots.txt to restrict robot access based on specific query string (parameter) values?


 http://www.url.com/default.aspx        # allow
 http://www.url.com/default.aspx?id=6   # allow
 http://www.url.com/default.aspx?id=7   # disallow
2 answers
 User-agent: *
 Disallow: /default.aspx?id=7
 Disallow: /default.aspx?id=9
 Disallow: /default.aspx?id=33
 # etc...

You only need to list the URLs that are prohibited. Everything else is allowed by default.
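You can verify rules like these with Python's standard-library `urllib.robotparser`. A minimal sketch, assuming the robots.txt content from the answer above and the example host `www.url.com` from the question:

```python
from urllib.robotparser import RobotFileParser

# robots.txt content mirroring the answer above
rules = """\
User-agent: *
Disallow: /default.aspx?id=7
Disallow: /default.aspx?id=9
Disallow: /default.aspx?id=33
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Anything not matched by a Disallow line is allowed by default
print(rp.can_fetch("*", "http://www.url.com/default.aspx"))       # True
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=6"))  # True
print(rp.can_fetch("*", "http://www.url.com/default.aspx?id=7"))  # False
```

Keep in mind that Disallow lines are prefix matches, so `Disallow: /default.aspx?id=7` also blocks `?id=70`, `?id=71`, and so on.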


You can also match just the query variable, e.g.

 Disallow: /default.aspx?id=*

or better yet

 Disallow: /*?id=

Note that the `*` wildcard is an extension to the original robots.txt convention; it is honored by major crawlers such as Googlebot and Bingbot, but not by all robots.

