Can robots.txt be used to restrict robot access based on specific query string (parameter) values?
For example:

    http://www.url.com/default.aspx          # allow
    http://www.url.com/default.aspx?id=6    # allow
    http://www.url.com/default.aspx?id=7    # disallow
    User-agent: *
    Disallow: /default.aspx?id=7
    Disallow: /default.aspx?id=9
    Disallow: /default.aspx?id=33
    etc...
You only need to specify the URLs that are prohibited; everything else is allowed by default.
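A quick way to sanity-check rules like these locally is Python's standard-library urllib.robotparser (a minimal sketch; the URLs are just the examples from above):

    from urllib.robotparser import RobotFileParser

    # The rules from the example above; parse() takes an iterable of lines.
    rules = [
        "User-agent: *",
        "Disallow: /default.aspx?id=7",
        "Disallow: /default.aspx?id=9",
        "Disallow: /default.aspx?id=33",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    for url in (
        "http://www.url.com/default.aspx",
        "http://www.url.com/default.aspx?id=6",
        "http://www.url.com/default.aspx?id=7",
    ):
        print(url, "->", "allowed" if parser.can_fetch("*", url) else "disallowed")

Note that a Disallow value is matched as a prefix, so /default.aspx?id=7 would also block /default.aspx?id=70.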
Or is it possible to match on just the query variable, e.g.

    Disallow: /default.aspx?id=*

or better yet

    Disallow: /?id=
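For context: the original robots.txt standard only does prefix matching, so a trailing * is redundant there, and a * anywhere else in the value relies on the wildcard extension that major crawlers such as Googlebot and Bingbot honor. A rough, illustrative sketch of how a wildcard-aware crawler might evaluate such a rule (not any crawler's actual implementation):

    import re

    # Illustrative wildcard semantics: '*' matches any run of characters,
    # and the rule otherwise matches as a prefix of the URL path + query.
    def rule_matches(rule: str, path_and_query: str) -> bool:
        pattern = ".*".join(re.escape(part) for part in rule.split("*"))
        return re.match(pattern, path_and_query) is not None

    print(rule_matches("/default.aspx?id=*", "/default.aspx?id=7"))  # True  -> disallowed
    print(rule_matches("/default.aspx?id=*", "/default.aspx"))       # False -> allowed
    # A plain prefix like "/?id=" only matches when the path itself is "/":
    print(rule_matches("/?id=", "/default.aspx?id=7"))               # False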