This is a mixed bag. From some tests I've run, GoogleBot can index some AJAX-loaded content in some cases. To be safe and keep all search engines happy, though, use prerender.io or self-host their open source server (which uses PhantomJS) so that your site is easily indexable. It saves a snapshot of your page for a given URL after its asynchronous operations have completed, and you then add middleware to your server that directs any search engine bots it detects to the pre-rendered page (a sketch of such middleware is below). It sounds complicated, but it's not that hard to set up if you follow the instructions on their site, and if you don't want to pay prerender.io to serve cached copies of your pages to search engines, you can run the server component yourself.
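
Here is a minimal sketch of that middleware pattern for Express, assuming a prerender server reachable at `PRERENDER_URL` (either the hosted service or a self-hosted instance of the open source server). The bot list, env variable name, and function names are illustrative; this is not prerender.io's official middleware, which is published as the `prerender-node` package. Requires Node 18+ for the built-in `fetch`.

```typescript
import express, { Request, Response, NextFunction } from "express";

// Hypothetical: point this at your self-hosted prerender server or the hosted service.
const PRERENDER_URL = process.env.PRERENDER_URL ?? "http://localhost:3000";
const BOT_AGENTS = [
  "googlebot", "bingbot", "yandex", "baiduspider",
  "twitterbot", "facebookexternalhit", "linkedinbot",
];

function isBot(req: Request): boolean {
  const ua = String(req.headers["user-agent"] ?? "").toLowerCase();
  // Crawlers using the old AJAX crawling scheme request
  // ?_escaped_fragment_= in place of the #! fragment.
  return "_escaped_fragment_" in req.query || BOT_AGENTS.some((b) => ua.includes(b));
}

async function prerenderMiddleware(req: Request, res: Response, next: NextFunction) {
  if (req.method !== "GET" || !isBot(req)) return next();

  // Ask the prerender server for the fully rendered HTML of this URL.
  const target = `${PRERENDER_URL}/${req.protocol}://${req.get("host")}${req.originalUrl}`;
  try {
    const rendered = await fetch(target);
    res.status(rendered.status).send(await rendered.text());
  } catch {
    next(); // fall back to the normal (unrendered) response
  }
}

const app = express();
app.use(prerenderMiddleware);
app.use(express.static("public")); // serve the normal SPA to real users
app.listen(8080);
```

Regular visitors still get the normal single-page app; only requests that look like crawlers are proxied through the prerender server, so there's no rendering cost on the ordinary path.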