@mspivak
Created November 29, 2012 00:13
Different robots.txt files for HTTP and HTTPS
.htaccess:
RewriteEngine On
# Serve /robots.txt through robots.php instead of the static file.
# (The dot is escaped so the pattern matches only "robots.txt".)
RewriteRule ^robots\.txt$ /robots.php [L]
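If the two variants are static files anyway, PHP can be skipped entirely. A mod_rewrite-only sketch, assuming the same `robots.txt.http`/`robots.txt.https` file naming used below:

```apache
RewriteEngine On
# HTTPS requests get the restrictive file; [L] stops further rewriting.
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ /robots.txt.https [L]
# Everything else (plain HTTP) gets the permissive file.
RewriteRule ^robots\.txt$ /robots.txt.http [L]
```

Since a RewriteCond only applies to the RewriteRule immediately following it, the second rule runs unconditionally for requests that fell through the first.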
robots.php:
<?php
header('Content-Type: text/plain; charset=utf-8');
// $_SERVER['HTTPS'] is unset on plain HTTP (and 'off' on some servers),
// so test it defensively instead of reading it directly.
$protocol = (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off') ? 'https' : 'http';
echo file_get_contents(__DIR__ . '/robots.txt.' . $protocol);
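Note that behind a TLS-terminating proxy or load balancer, Apache itself sees plain HTTP and `$_SERVER['HTTPS']` is never set. A sketch of scheme detection for that setup; the `X-Forwarded-Proto` header name is an assumption about how the proxy is configured, and the header should only be trusted when the proxy is known to set it:

```php
<?php
// Sketch: detect the client's scheme when TLS may be terminated upstream.
function request_is_https()
{
    // Direct TLS to this server: HTTPS is set, and not 'off'.
    if (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off') {
        return true;
    }
    // Assumed proxy header; trust only behind a known, trusted proxy.
    return isset($_SERVER['HTTP_X_FORWARDED_PROTO'])
        && strtolower($_SERVER['HTTP_X_FORWARDED_PROTO']) === 'https';
}

header('Content-Type: text/plain; charset=utf-8');
echo file_get_contents(__DIR__ . '/robots.txt.' . (request_is_https() ? 'https' : 'http'));
```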
robots.txt.http (served over HTTP; allows all crawling):
User-agent: *
Disallow:
robots.txt.https (served over HTTPS; disallows all crawling):
User-agent: *
Disallow: /
mspivak commented Nov 29, 2012

This tells crawlers to index every path over plain HTTP and nothing over HTTPS.
