Make /robots.txt aware of the Rails environment

You probably don't want Google crawling your development or staging app. Here's how to fix that.

$ mv public/robots.txt config/robots.production.txt
$ cp config/robots.production.txt config/robots.development.txt

Now edit config/routes.rb to add a route for /robots.txt (shown at the end), and add the controller action below.

# Controller action behind the route (home#robots in config/routes.rb below):
def robots
  robots = File.read(Rails.root + "config/robots.#{Rails.env}.txt")
  render :text => robots, :layout => false, :content_type => "text/plain"
end
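
Note that render :text was removed in Rails 5.1, so on a newer app the same action would look roughly like this. This is just a sketch assuming the file layout from the steps above; render plain: serves text/plain without a layout by default.

# Rails 5+ variant of the same action (render :text no longer exists there).
def robots
  robots = File.read(Rails.root.join("config", "robots.#{Rails.env}.txt"))
  render plain: robots
end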

config/robots.development.txt (all spiders banned outside production):

# (moved from public/robots.txt)
#
# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
User-Agent: *
Disallow: /

config/robots.production.txt (the ban is left commented out):

# (moved from public/robots.txt)
#
# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
# User-Agent: *
# Disallow: /

config/routes.rb:

get '/robots.txt' => 'home#robots'
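
To sanity-check the wiring, boot the app in a given environment and fetch the file; the host and port below are the usual defaults, so adjust as needed. The response should match the corresponding config/robots.<environment>.txt, comments included.

$ RAILS_ENV=development bin/rails server
$ curl http://localhost:3000/robots.txt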