@patrickdavey
Created October 11, 2013 03:14
simple dynamic robots.txt
# config/routes.rb
require 'robots_txt_generator'

AppName::Application.routes.draw do
  [snip]
  # Route /robots.txt to a plain Rack endpoint (Rails 4's `match`
  # requires an explicit HTTP verb via the :via option).
  match '/robots.txt' => RobotsTxtGenerator, via: :get
end
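The route above works because Rails can dispatch to any Rack-compatible object, i.e. anything that responds to call(env) and returns a [status, headers, body] triple. A minimal illustration, runnable without Rails (the HelloApp name is ours, purely for demonstration):

```ruby
# The smallest possible Rack endpoint: a lambda honoring the
# call(env) -> [status, headers, body] contract that Rails routes to.
HelloApp = ->(env) { [200, { 'Content-Type' => 'text/plain' }, ['hello']] }

status, headers, body = HelloApp.call({})
# status => 200, body => ['hello']
```

RobotsTxtGenerator below follows exactly this contract, just with a class method instead of a lambda.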
# lib/robots_txt_generator.rb
class RobotsTxtGenerator
  # In production, point crawlers at the sitemap;
  # disallow everything for all other environments.
  def self.call(env)
    body = if Rails.env.production?
      'Sitemap: http://s3.amazonaws.com/degreestory-production/sitemaps/sitemap.xml.gz'
    else
      "User-agent: *\nDisallow: /"
    end
    headers = { 'Content-Type' => 'text/plain' }
    [200, headers, [body]]
  rescue Errno::ENOENT
    headers = { 'Content-Type' => 'text/plain' }
    body = '# A robots.txt is not configured'
    [404, headers, [body]]
  end
end
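Since the class only touches Rails.env.production?, it can be exercised outside a Rails app by stubbing that one call. A sketch, assuming a minimal Rails stand-in (the FakeEnv module here is ours, not part of the gist):

```ruby
# Stand-in for Rails.env so the generator runs outside a Rails app
# (assumption: the generator only ever calls Rails.env.production?).
module Rails
  class FakeEnv
    def production?
      false
    end
  end

  def self.env
    @env ||= FakeEnv.new
  end
end

class RobotsTxtGenerator
  # Same logic as the gist's class: sitemap in production,
  # disallow everything elsewhere.
  def self.call(env)
    body = if Rails.env.production?
      'Sitemap: http://s3.amazonaws.com/degreestory-production/sitemaps/sitemap.xml.gz'
    else
      "User-agent: *\nDisallow: /"
    end
    [200, { 'Content-Type' => 'text/plain' }, [body]]
  end
end

status, headers, body = RobotsTxtGenerator.call({})
puts status      # 200
puts body.first  # "User-agent: *\nDisallow: /" (non-production stub)
```

With the stub reporting a non-production environment, the endpoint returns the blanket Disallow rule; flipping production? to true would return the Sitemap line instead.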