Google Indexing Guidelines for ASP.NET Pages

I am working on a course booklet system for the college where I work. Booklets are stored in a database keyed by a course_code primary key. Ideally, I would like the booklets to be indexed by Google. Assume I am developing the system in ASP.NET 2.0.

I understand that one approach is to pass variables in the link (in my case, the course_code), which also conveniently allows bookmarking of individual course booklets. What are the best practices for getting Googlebot to crawl the system?

+4
5 answers

+3

If the Google bot can crawl your pages and reach everywhere on your site just by following links, without filling out any forms or running any JavaScript, you should be good to go.

(Disclaimer: Although I work for Google, I have not looked at what the crawler does, and little about it is public.)

+1

One big thing is to use a URL rewriting scheme, if you can, to avoid URLs like

http://www.yoursite.com/default.aspx?course_code=CIS612

With rewriting, you could instead get something like

http://www.yoursite.com/courses/CIS612.aspx

URLs like this really help, because search engines have historically handled query strings less reliably than plain paths.

UrlRewriting.net is a good place to start with rewriters.
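As a minimal sketch, ASP.NET 2.0 also ships with a built-in `urlMappings` section in web.config that can map a friendly URL onto the real querystring URL (the paths below reuse the example URLs above). It only supports static, one-to-one mappings, which is why a library like UrlRewriting.net is worth looking at for pattern-based rules.

```xml
<!-- web.config: static URL mapping, built into ASP.NET 2.0 -->
<configuration>
  <system.web>
    <urlMappings enabled="true">
      <!-- friendly URL on the left, actual querystring URL on the right -->
      <add url="~/courses/CIS612.aspx"
           mappedUrl="~/default.aspx?course_code=CIS612" />
    </urlMappings>
  </system.web>
</configuration>
```

One `add` entry is needed per course, so this scales poorly for a large catalogue; a rewriting library lets you express the whole `/courses/{course_code}.aspx` pattern as a single rule.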

0

Drop in a sitemap.xml file - they help a lot.
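For reference, a minimal sitemap.xml following the sitemaps.org protocol might look like this (the URL is an assumption, reusing the rewritten course URL from the earlier answer):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- one <url> entry per course booklet -->
    <loc>http://www.yoursite.com/courses/CIS612.aspx</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Since the booklets live in a database, generating this file from the course table (one `<url>` entry per course_code) keeps it in sync automatically; it can then be submitted via Google's webmaster tools.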

0

Use wget to crawl your site.

wget -r www.example.com 

If wget can't reach some of your URLs by following links alone, Google is unlikely to reach them either.

0
