React Router + AWS Backend for SEO

I am using React and React Router in my single-page web application. Since the rendering happens client-side, I would like to serve all of my static files (HTML, CSS, JS) from a CDN. I use Amazon S3 to host the files and Amazon CloudFront as the CDN.

When the user requests /css/styles.css, the file exists, so S3 serves it. When a user requests /foo/bar, that is a dynamic URL, so S3 redirects to a hashbang version: /#!/foo/bar, which serves index.html. On the client side I then remove the hashbang so my URLs stay pretty.
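
For reference, that S3 behavior is typically configured with a redirection rule on the bucket's website endpoint. A minimal sketch in the XML the S3 console accepts, with a placeholder hostname:

    <!-- On a 404 (path not found in the bucket), redirect the browser to
         /#!/<original-path>, so index.html is served and the client-side
         router takes over. -->
    <RoutingRules>
      <RoutingRule>
        <Condition>
          <HttpErrorCodeReturnedEquals>404</HttpErrorCodeReturnedEquals>
        </Condition>
        <Redirect>
          <HostName>www.example.com</HostName>
          <ReplaceKeyPrefixWith>#!/</ReplaceKeyPrefixWith>
        </Redirect>
      </RoutingRule>
    </RoutingRules>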

All this works great for 100% of my users.

  • All static files are served from the CDN
  • Dynamic URLs are redirected to /#!/{...}, which serves index.html (my single-page application)
  • My client side removes the hashbang so the URLs are pretty again (see the sketch after this list)
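
That client-side cleanup can be a few lines of JavaScript run before the router boots; a minimal sketch, assuming nothing beyond the History API:

    // If the page was loaded via the /#!/foo/bar redirect, rewrite the
    // address bar back to the pretty /foo/bar form before rendering.
    if (window.location.hash.indexOf('#!') === 0) {
      var path = window.location.hash.substring(2); // drop the leading "#!"
      window.history.replaceState(null, '', path);
    }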

Problem

The problem is that Google will not crawl my site. Here is why:

  • Google requests /
  • It sees a bunch of links, e.g. to /foo/bar
  • Google requests /foo/bar
  • It is redirected to /#!/foo/bar (302 Found)
  • It removes the hashbang and requests /

Why is the hashbang removed? My application works great for 100% of my users, so why do I need to reconfigure it just so Google can crawl it correctly? It's 2016, just follow the hashbang ...

</rant>

Am I doing something wrong? Is there a better way to get S3 to serve index.html when it doesn't recognize the path?

Setting up a Node server to handle these paths is not the right solution, because that defeats the purpose of the CDN.

In this thread, Michael Jackson, a principal contributor to React Router, says: "Fortunately, the hashbang is no longer in widespread use." How would you change my setup so as not to use the hashbang?
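
For what it's worth, the hashbang-free setup in React Router (the 2.x API current at the time) means using browserHistory instead of hashHistory; a minimal sketch, where App stands in for your root component:

    import React from 'react';
    import { render } from 'react-dom';
    import { Router, Route, browserHistory } from 'react-router';
    import App from './App';

    // browserHistory uses real URLs (/foo/bar) via the HTML5 History API
    // instead of hash-based ones, but it requires the server/CDN to serve
    // index.html for every unknown path -- which is what the answers
    // below configure.
    render(
      <Router history={browserHistory}>
        <Route path="/" component={App} />
      </Router>,
      document.getElementById('root')
    );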

3 answers

You can also check out this trick. You need to set up a CloudFront distribution for the site, then change the 404 behavior in the Error Pages section of the distribution so that 404s return /index.html with a 200 status. That way you can use domain.com/foo/bar links again :)
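
If you manage the distribution as code rather than through the console, the equivalent setting looks roughly like this CloudFormation excerpt (resource name is a placeholder; origins and cache behaviors omitted):

    # Map CloudFront 404s to index.html with a 200 status, so deep links
    # like /foo/bar load the single-page app directly from the CDN.
    SiteDistribution:
      Type: AWS::CloudFront::Distribution
      Properties:
        DistributionConfig:
          # ... Origins, DefaultCacheBehavior, Enabled, etc. omitted ...
          CustomErrorResponses:
            - ErrorCode: 404
              ResponseCode: 200
              ResponsePagePath: /index.html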


I know it has been a few months, but for those who have hit the same problem: you can simply specify index.html as the error document in S3. The error document setting can be found under the bucket's Properties => Static Website Hosting => Enable website hosting.
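
The same setting can be applied from the AWS CLI; a sketch, with a placeholder bucket name:

    # Serve index.html both as the index document and as the error
    # document, so unknown paths fall back to the app shell.
    aws s3 website s3://my-bucket \
      --index-document index.html \
      --error-document index.html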

Please keep in mind that with this approach you become responsible for handling HTTP errors, such as 404, inside your own application, along with any other HTTP errors.
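
With React Router (v2/v3-era syntax) that usually means adding a catch-all route; a sketch extending the browserHistory example from the question above, where NotFound is a hypothetical component you would write yourself:

    // "*" matches any path the other routes don't, so unknown deep links
    // render a 404 view. Note the HTTP status is still 200, because S3
    // answered with index.html; a true 404 status is not possible here.
    <Router history={browserHistory}>
      <Route path="/" component={App} />
      <Route path="*" component={NotFound} />
    </Router>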


The hashbang is not recommended when you want to build an SEO-friendly website; even if the site is indexed by Google, only thin, minimal content will show up for each page.

The best way to build your site is to follow the current technique of "progressive enhancement"; search for it on Google and you will find many articles about it.

Basically, you give each page its own link, and when a user clicks a link, they are taken to that page with whatever transition effect you want, even though everything is hosted as a single website.

This way, Google has a unique link for each page, and the user gets a nice effect and a great UX.

Example:

<a href="http://www.example.com/contact-us" onclick="fancyEffects();">Contact Us</a> 
