How to add a robots.txt file to AEM / CQ?

How do I add a robots.txt file to an AEM server to provide rules for web crawlers?

1 answer

Most search results point to the same link for implementing this.

Although the approach it describes may seem to work, you will notice one thing that is a little off.

Uploading the robots.txt file directly in CRXDE creates a node of type nt:file at the root level.
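If you want to confirm this, here is a minimal sketch against the JCR API (assuming you already have a Session, for example adapted from a ResourceResolver; the class and method names are just for illustration):

    import javax.jcr.Node;
    import javax.jcr.RepositoryException;
    import javax.jcr.Session;

    public class RobotsNodeCheck {

        // For a robots.txt uploaded through CRXDE this prints "nt:file".
        public static void printNodeType(Session session) throws RepositoryException {
            Node robots = session.getNode("/robots.txt");
            System.out.println(robots.getPrimaryNodeType().getName());
        }
    }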

So when you hit http://localhost:4502/robots.txt, instead of being displayed in the browser, the file is downloaded as an attachment.

This is because of the Sling default GET servlet. It identifies that the node type is nt:file and sends the response with the following headers:

    Content-Type: application/octet-stream
    Content-Disposition: attachment;filename=robots.txt

To overcome this, write a Sling filter as shown below. By doing so, the request bypasses the default GET servlet and you can provide your own content type.

    package com.hds.exp.filters;

    import java.io.IOException;
    import java.io.PrintWriter;

    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;

    import org.apache.felix.scr.annotations.Properties;
    import org.apache.felix.scr.annotations.Property;
    import org.apache.felix.scr.annotations.sling.SlingFilter;

    @SlingFilter(order = 1)
    @Properties({
        @Property(name = "service.pid", value = "com.hds.exp.filters.RobotsFilter", propertyPrivate = false),
        @Property(name = "service.description", value = "Provides Robots.txt", propertyPrivate = false),
        @Property(name = "service.vendor", value = "DD Exp", propertyPrivate = false),
        @Property(name = "pattern", value = "/.*", propertyPrivate = false)
    })
    public class RobotsFilter implements javax.servlet.Filter {

        @Override
        public void destroy() {
            // Unused
        }

        @Override
        public void doFilter(ServletRequest request, ServletResponse response,
                FilterChain chain) throws IOException, ServletException {
            HttpServletRequest httpServletRequest = (HttpServletRequest) request;
            // Answer requests for /robots.txt directly, bypassing the
            // default GET servlet; everything else passes through.
            if (httpServletRequest.getRequestURI().equals("/robots.txt")) {
                response.setContentType("text/plain");
                PrintWriter writer = response.getWriter();
                writer.print("User-agent: *");
                writer.print("\n");
                writer.print("Disallow: /");
                writer.print("\n");
                writer.flush();
            } else {
                chain.doFilter(request, response);
            }
        }

        @Override
        public void init(FilterConfig arg0) throws ServletException {
            // Unused
        }
    }
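Once the bundle containing the filter is deployed, you can verify the fix with a small standalone client. This is a sketch, assuming a local author instance on the default port 4502 with the default admin:admin development credentials; the class name is just for illustration:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Base64;

    public class RobotsFilterCheck {

        public static void main(String[] args) throws Exception {
            URL url = new URL("http://localhost:4502/robots.txt");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();

            // Basic auth for the local author instance (assumed
            // admin:admin, the default development login).
            String credentials = Base64.getEncoder()
                    .encodeToString("admin:admin".getBytes("UTF-8"));
            conn.setRequestProperty("Authorization", "Basic " + credentials);

            // With the filter active this should report text/plain
            // rather than application/octet-stream.
            System.out.println("Content-Type: " + conn.getContentType());

            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }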