How to stop bots from scanning my AJAX URLs?

I have several pages on my ASP.NET MVC 3 website (though the question isn't specific to that technology) where I emit a URL inside a <script> tag so that my JavaScript (kept in an external file) can make AJAX calls to the server.

Something like this:

<html>
   ...
   <body>
      ...
      <script type="text/javascript">
         $(function() {
            myapp.paths.someUrl = '/blah/foo'; // not hardcoded in reality, but N/A here
         });
      </script>
   </body>
</html>

Now, on the server side, most of these URLs are protected by action-filter attributes that enforce two rules:

a) Only AJAX requests can reach them (i.e. requests made via XMLHttpRequest)

b) They can only be accessed via HTTP POST (they return JSON, and allowing GET on JSON responses is a security risk)

The problem is that, for some reason, bots crawl these URLs and try to perform HTTP GET requests on them, which results in 404s.
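For illustration, the two server-side checks can be sketched framework-agnostically in JavaScript (the real site uses ASP.NET MVC action filters; `isAllowed` and the request shape here are hypothetical):

```javascript
// Hypothetical gate mirroring the two checks from the question:
// (a) AJAX only (X-Requested-With: XMLHttpRequest), (b) POST only.
function isAllowed(request) {
  const isAjax =
    (request.headers['x-requested-with'] || '') === 'XMLHttpRequest';
  const isPost = request.method === 'POST';
  return isAjax && isPost;
}

// A crawler's plain GET fails both checks:
console.log(isAllowed({ method: 'GET', headers: {} })); // false

// The site's own AJAX POST passes (jQuery sets the
// X-Requested-With header automatically on AJAX requests):
console.log(isAllowed({
  method: 'POST',
  headers: { 'x-requested-with': 'XMLHttpRequest' }
})); // true
```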

These URLs are only ever referenced from JavaScript, so how are bots picking them up at all?

And what should I do about it?

One option is to stop embedding these URLs in the page and instead have the script build the URLs itself (hiding them from crawlers).

Another option is to return HTTP 410 (Gone) when these URLs are requested by anything other than an AJAX POST, so that crawlers drop them. Still, that only treats the symptom.
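A minimal sketch of the 410 idea, again framework-agnostic (`statusFor` is a hypothetical helper; in the real site this would live in an MVC action filter): answer anything that isn't an AJAX POST with 410 Gone, which, unlike 404, tells well-behaved crawlers to drop the URL permanently.

```javascript
// Hypothetical status chooser for the AJAX-only endpoints:
// real AJAX POSTs proceed (200), everything else gets 410 Gone.
function statusFor(method, headers) {
  const isAjaxPost =
    method === 'POST' &&
    (headers['x-requested-with'] || '') === 'XMLHttpRequest';
  return isAjaxPost ? 200 : 410;
}

console.log(statusFor('GET', {})); // 410 for a crawler's plain GET
```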

Any other ideas or suggestions?

Disallow these URLs in robots.txt.
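For example, assuming the AJAX endpoints share a common prefix like the `/blah/` path from the question, a robots.txt entry could look like:

```
User-agent: *
Disallow: /blah/
```

Note that robots.txt is only honored by well-behaved crawlers; misbehaving bots will ignore it, so the server-side POST/AJAX checks should stay in place regardless.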
