jQuery load() and SEO - did anyone get a decent answer?

Many aspects of my site are dynamic. I am using jQuery.

I have a div that, when the DOM is ready, is populated with load().

Then, when a button is pressed, this content is replaced with other content, again using load().
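A minimal sketch of that setup (the element ids and fragment URLs here are hypothetical placeholders, not from my actual site):

```javascript
// Pure helper: pick which fragment URL to load for a given state.
// URLs are hypothetical examples.
function fragmentUrl(updated) {
  return updated ? '/fragments/updated.html' : '/fragments/default.html';
}

// Browser-only wiring (needs jQuery and a DOM); skipped elsewhere.
if (typeof window !== 'undefined' && window.jQuery) {
  jQuery(function ($) {
    $('#content').load(fragmentUrl(false));    // populate on DOM ready
    $('#swap-button').on('click', function () {
      $('#content').load(fragmentUrl(true));   // replace on button press
    });
  });
}
```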

This type of setup is common on my site. My main page is, in fact, largely dynamically loaded, frequently updated, volatile content.

What are the implications of this for SEO?

I have seen sites where every page is loaded using load() and then revealed with animation functions... It looks awesome!

People have asked this question before, but no one answered it properly.

So, any ideas? jQuery and SEO?

thanks

EDIT

Very interesting points. I don't want to overdo the JavaScript on my site, just use it where it needs to look good. My homepage, however, raises one concern.

As it stands, when the DOM is ready it loads the content into a div, and when you click a tab that content changes. I.e. no JS, no content.

The beauty of this for me is that there is no duplicate code. Is the suggestion here that I should just "print" the default content, and then have the tabs link to real pages (with the same content) for when JS is disabled? I.e. tolerate a little duplicate code for the sake of SEO?

Regarding graceful degradation, my only other area of concern is the tabs on one page. I have three divs containing content; two of them are hidden until a tab is clicked. I used this method before I started playing with JS. Perhaps it would be better to load() those tabs instead, and have the tab buttons link to the pages the content is pulled from?
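For reference, the show/hide tab behaviour I mean is roughly this (class names are hypothetical; the fallback without JS would be plain links to separate pages):

```javascript
// Pure helper: given the clicked tab index, which panel indices
// should be hidden?
function hiddenPanels(activeIndex, total) {
  var hidden = [];
  for (var i = 0; i < total; i++) {
    if (i !== activeIndex) hidden.push(i);
  }
  return hidden;
}

// Browser-only wiring (needs jQuery and a DOM); skipped elsewhere.
if (typeof window !== 'undefined' && window.jQuery) {
  jQuery(function ($) {
    $('.tab-button').on('click', function () {
      var i = $(this).index();
      // Hide all three panels, then show the one for this tab.
      $('.tab-panel').hide().eq(i).show();
    });
  });
}
```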

thanks

4 answers

Content loaded with JavaScript will not be crawled.

The general and correct approach is to use Progressive Enhancement: all links should be normal <a href="..."> links to actual pages, so that your site "makes sense" to the search spider; the click() event then overrides the normal behaviour with load(), so regular users with JavaScript enabled see the "enhanced" version of your site.
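A sketch of that pattern (the link class and container id are hypothetical; passing "url selector" to load() is jQuery's documented page-fragment form):

```javascript
// Pure helper: turn a link's href into a jQuery .load() target
// that pulls only the matching fragment out of the full page.
// ".load(url + ' ' + selector)" fetches url and inserts only the
// part matching selector.
function loadTarget(href, fragmentSelector) {
  return href + ' ' + fragmentSelector;
}

// Browser-only wiring (needs jQuery and a DOM); skipped elsewhere.
// Without JS, the <a href="..."> links still work normally, so
// spiders and no-JS users get real pages.
if (typeof window !== 'undefined' && window.jQuery) {
  jQuery(function ($) {
    $('a.nav-link').on('click', function (e) {
      e.preventDefault(); // stop the full page load
      $('#content').load(loadTarget(this.href, '#content'));
    });
  });
}
```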


If your content is navigable with JavaScript turned off, you'll be well on the way to being visible to search engines.

Note that search crawlers will not submit any forms on your site, so if navigation between pages relies on form submission, that content will be unreachable for search engines.


Here are some suggestions on how to get Google to crawl content loaded with AJAX: http://code.google.com/web/ajaxcrawling/docs/getting-started.html


I use jQuery load() for asynchronous loading. It greatly improves the user experience, but it is not SEO-friendly. Here is the only solution I have found so far:

  • On the first load, I do not use jQuery load() and instead try to write a cookie with JavaScript: document.cookie = 'checkjs=on';

  • On the next page load, if the PHP script finds this cookie, JavaScript is enabled and jQuery load() can be used. If there is no such cookie, JavaScript is disabled (probably a spider has arrived), so jQuery load() is not used.

    if (!isset($_COOKIE['checkjs']) || $_COOKIE['checkjs'] != 'on') {
        echo 'js off, hello Google!';
    } else {
        echo 'js enabled, can use jquery load';
    }
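The client side of this check might look like the following sketch. The cookie name 'checkjs' is from the answer above; the helper function is a hypothetical illustration of how the marker could be tested:

```javascript
// Pure helper: decide from a raw cookie string whether the
// 'checkjs' marker is present and set to 'on'.
function jsMarkerSet(cookieString) {
  return cookieString
    .split(';')
    .map(function (part) { return part.trim(); })
    .indexOf('checkjs=on') !== -1;
}

// Browser-only: write the marker so the next request (and the PHP
// check) sees it. Skipped where there is no document object.
if (typeof document !== 'undefined') {
  document.cookie = 'checkjs=on; path=/';
}
```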

This way, most users benefit from asynchronously loaded page blocks (except on the very first load), and spiders still get all the content.

In your case, you can simply load the same page with a new parameter that activates another tab. The spider will be happy.
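For the tab case, the "same page with a new parameter" idea could be sketched like this, with the active tab read from the query string so that /page?tab=2 renders with the right tab open whether or not JS runs (the parameter name 'tab' is a hypothetical example):

```javascript
// Pure helper: read the active tab index from a query string,
// e.g. '?tab=2' -> 2. Defaults to the first tab (index 0).
function activeTabFromQuery(search) {
  var match = /[?&]tab=(\d+)/.exec(search);
  return match ? parseInt(match[1], 10) : 0;
}
```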

