I have a website with a huge number (well, thousands or tens of thousands) of dynamic URLs, as well as several static URLs.
In theory, thanks to some careful SEO linking on the home page, any spider should be able to crawl the site and discover all of the dynamic URLs via a spider-friendly search.
With that in mind, do I really need to go to the effort of creating a dynamic sitemap index that includes all of these URLs, or do I just need to make sure all the main static URLs are in there?
The actual way in which I would generate the sitemap isn't the concern here; I'm just questioning whether it needs doing at all.
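(For concreteness, here's a minimal sketch of the kind of thing I mean by a dynamic sitemap index. The file names, the example.com base URL, and the incoming list of URLs are all hypothetical; the 50,000-URLs-per-file cap does come from the sitemaps.org protocol.)

```python
# Sketch: split a large URL list into child sitemaps and write an
# index file pointing at them, per the sitemaps.org protocol.
from xml.sax.saxutils import escape

BASE = "https://example.com"     # hypothetical site root
MAX_URLS_PER_SITEMAP = 50_000    # per-file limit from sitemaps.org

def write_sitemaps(urls):
    """Write sitemap-1.xml .. sitemap-N.xml plus sitemap-index.xml."""
    chunks = [urls[i:i + MAX_URLS_PER_SITEMAP]
              for i in range(0, len(urls), MAX_URLS_PER_SITEMAP)]
    for n, chunk in enumerate(chunks, start=1):
        with open(f"sitemap-{n}.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in range(1, len(chunks) + 1):
            f.write(f"  <sitemap><loc>{BASE}/sitemap-{n}.xml</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

# e.g. write_sitemaps([f"{BASE}/item/{i}" for i in range(100_000)])
```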
I ask because the Google FAQ (and yes, I know they're not the only search engine!) recommends including URLs in the sitemap that cannot otherwise be discovered by crawling. By that reasoning, if every URL on your site is reachable from another, isn't the home page the only URL you really need as a baseline in your sitemap for a well-designed site?
seo sitemap
Andras Zoltan