Not indexing parts of HTML

Is there a way to limit HTML indexing so that the page content is more relevant, for instance by excluding menus and similar boilerplate from robots? I remember there used to be special tags for this, but I can no longer find the information.

Do the major search engines (Google / Bing) support any such method?

2 answers

[include here Nicholas Wilson’s answer]

The only hint I found is "The Russian search engine Yandex introduces a new tag that only prevents indexing of the content between the tags, and not the whole web page."

http://en.wikipedia.org/wiki/Noindex#Russian_version
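Based on that Wikipedia description, Yandex recognizes a non-standard `<noindex>` tag, plus a comment-based variant that keeps the page valid HTML. A sketch of how it is typically used (Yandex-only; the class names here are illustrative):

```html
<!-- Main content: indexed by Yandex as usual -->
<p>Article text that should appear in search results.</p>

<!-- Non-standard Yandex tag: content between the tags is not indexed -->
<noindex>
  <div class="sidebar">Menu, ads, and other boilerplate</div>
</noindex>

<!-- Comment form: same effect, but the document stays valid HTML -->
<!--noindex-->
<div class="footer">Repeated footer links</div>
<!--/noindex-->
```

Keep in mind this is Yandex-specific; Google and Bing simply ignore the tag.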


Put the less important content at the bottom of the page. Beyond that, there isn't much you can do. Serve the same content to search spiders as to browsers and let the engines do their job. There is no magic markup (that search engines honor) that does exactly what you want, although you should at least use the `role` attribute and the `<nav>` element where appropriate. Putting the menu after the main content in source order is common, and it often helps.
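As a sketch of the semantic markup mentioned above (an HTML5 `<nav>` element with an ARIA `role`, with the menu placed after the main content in source order; CSS would reposition it visually):

```html
<body>
  <!-- Main content comes first in source order -->
  <main role="main">
    <h1>Page title</h1>
    <p>The content you want search engines to weight most heavily.</p>
  </main>

  <!-- Menu after the content; role/nav label it as navigation,
       but note that engines are not obliged to de-weight it -->
  <nav role="navigation">
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/about">About</a></li>
    </ul>
  </nav>
</body>
```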

