Serving robots.txt in a Spring Boot application

What is a robots.txt file?

The robots.txt file is a plain-text standard widely used on websites. It tells search engine robots and other crawlers which pages and areas of your website they are allowed or disallowed to crawl.

Serving robots.txt in Spring Boot

Depending on your project configuration, there are a couple of ways to serve a robots.txt file from Spring Boot. In my case I was using Thymeleaf as the default template engine, but to serve robots.txt you'll need a simple method in your controller that returns a plain-text response.

Let’s say we don’t want robots to visit our /admin path, so we need to specify that in our robots.txt file.

You can do it like this:

User-agent: *
Disallow: /admin

One of the quickest ways to serve a robots.txt file is to create a method in your controller class that returns a String with the proper content. Please note that robots.txt must be served from the root path of your website.

    // Serve robots.txt as plain text from the root of the site;
    // /robot.txt is mapped as well to catch the common misspelling.
    @RequestMapping(value = {"/robots.txt", "/robot.txt"}, produces = "text/plain")
    @ResponseBody
    public String getRobotsTxt() {
        return "User-agent: *\n" +
                "Disallow: /admin\n";
    }
robots.txt method

Now, when you hit the /robots.txt path, you should see the following result:

Robots.txt served from Spring Boot app
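If you'd like to verify the endpoint without opening a browser, a minimal MockMvc test sketch could look like the one below. It assumes spring-boot-starter-test is on the classpath; the test class and method names are arbitrary choices for this example.

    import static org.hamcrest.Matchers.containsString;
    import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
    import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content;
    import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

    import org.junit.jupiter.api.Test;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.test.web.servlet.MockMvc;

    @SpringBootTest
    @AutoConfigureMockMvc
    class RobotsTxtTest {

        @Autowired
        private MockMvc mockMvc;

        @Test
        void robotsTxtIsServedFromRoot() throws Exception {
            // Hit the root-level path and check the rule we defined is present
            mockMvc.perform(get("/robots.txt"))
                    .andExpect(status().isOk())
                    .andExpect(content().string(containsString("Disallow: /admin")));
        }
    }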

Improvements

You can improve this solution by annotating a String field with the @Value annotation in your controller, so you can inject the robots.txt content from your application.properties configuration file. This approach is easier to maintain in the future.
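As a rough sketch of that approach (the property name robots.content and the class name RobotsController are arbitrary choices for this example), you could define the content in application.properties:

    robots.content=User-agent: *\nDisallow: /admin

and inject it into the controller:

    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.stereotype.Controller;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.ResponseBody;

    @Controller
    public class RobotsController {

        // Injected from application.properties; properties-file loading
        // expands the \n escape into a real newline.
        @Value("${robots.content}")
        private String robotsContent;

        @RequestMapping(value = "/robots.txt", produces = "text/plain")
        @ResponseBody
        public String getRobotsTxt() {
            return robotsContent;
        }
    }

With this setup you can change the robots.txt rules without touching the controller code.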
