I needed a dynamic robots.txt because I was serving two different sites from one Rails application. Another common case is blocking robots in a staging environment. In this article we will set this up.

First, we set up a route for robots.txt:


# config/routes.rb
get '/robots.:format' => 'pages#robots'
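
The .:format segment lets a request for /robots.txt pick the text format automatically. If you would rather pin the path and format explicitly, a variant like the following also works; the defaults option is my addition, not part of the original setup:


# config/routes.rb — optional variant that hard-codes the path and format
get '/robots.txt' => 'pages#robots', defaults: { format: :text }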

Then, in the pages controller (or whichever controller you pointed the route at), add the action:


# app/controllers/pages_controller.rb
class PagesController < ApplicationController
  def robots
    # Cache the response for six hours so crawlers do not hammer the app
    expires_in 6.hours, public: true
    # Renders app/views/pages/robots.text.erb for the text format
    respond_to :text
  end
end
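
If you want to verify the behaviour, a small request spec covers it. This is only a sketch; it assumes RSpec request specs and Rails 6+ (for response.media_type), neither of which the article itself requires:


# spec/requests/robots_spec.rb — sketch, assumes RSpec and the route above
require 'rails_helper'

RSpec.describe 'robots.txt', type: :request do
  it 'renders plain text with a public cache header' do
    get '/robots.txt'

    expect(response).to have_http_status(:ok)
    expect(response.media_type).to eq('text/plain')
    expect(response.headers['Cache-Control']).to include('public')
  end
end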

Then create the ERB (or Slim/Haml) view that becomes your robots.txt. Because the action responds with the :text format, the template needs a .text extension:


# app/views/pages/robots.text.erb
<% if Rails.env.production? %>
  User-Agent: *
  Allow: /
  Disallow: /admin
  Sitemap: http://www.mysite.com/sitemap.xml
<% else %>
  User-Agent: *
  Disallow: /
<% end %>
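
For reference, in production the rendered file comes out roughly like this (exact whitespace depends on your ERB trim settings, and mysite.com is just the example domain from the template above). Most parsers tolerate leading indentation, but you can dedent the template if you prefer:


# Rendered /robots.txt in production (approximate)
User-Agent: *
Allow: /
Disallow: /admin
Sitemap: http://www.mysite.com/sitemap.xml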

Now robots are disallowed from the entire app unless it's running in the production environment.
