This is a snippet of code that prevents bots from POSTing if they do not appear to behave like humans. It assumes that a human browser will download assets such as JavaScript and CSS, and it sets a cookie when one of those assets is requested. If the cookie is not sent back with a POST request, the POST is denied with a 403.
# Requires mod_rewrite; in an .htaccess context the engine must be enabled first.
RewriteEngine On

# If they are requesting static resources, then they're probably not bots.
# Set a "dude" cookie whenever such an asset is downloaded. Note that the
# CO flag's lifetime is measured in minutes, so 86400 here is roughly 60 days.
RewriteCond %{REQUEST_FILENAME} \.(gif|png|jpe?g|ico|swf|js|css)$ [NC]
RewriteRule .* - [L,co=dude:abides:%{HTTP:Host}:86400]

# Check if this is a POST method; if so, the human cookie must be set.
# Trackback URLs and /index.php are exempted, since those POSTs do not
# come from ordinary browser sessions.
# If the dudes don't abide, they get a 403 for their POST.
RewriteCond %{REQUEST_METHOD} =POST
RewriteCond %{REQUEST_URI} !.*trackback/?$ [NC]
RewriteCond %{REQUEST_URI} !=/index.php [NC]
RewriteCond %{HTTP_COOKIE} !dude [NC]
RewriteRule .* - [F]
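
On Apache 2.4+, the deny half of this check could also be expressed with an <If> block, which some find easier to read. This is a rough sketch of the 403 logic only; the cookie-setting rewrite rule above is still needed, and the trackback//index.php exemptions are omitted for brevity:

# Deny POSTs that do not carry the "dude" cookie (Apache 2.4+ only)
<If "%{REQUEST_METHOD} == 'POST' && %{HTTP_COOKIE} !~ /dude/">
    Require all denied
</If>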
@ryanaslett (author) commented:
Typically you would want to choose an explicit resource, like somerandomname.png, that is unique to your site and explicitly non-cacheable (it should bypass Cloudflare/Varnish/etc. so that humans always get their cookie).
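
A minimal sketch of that approach, assuming mod_headers is available and using a hypothetical somerandomname.png as the dedicated cookie-setting asset:

# Set the cookie only when the dedicated (hypothetical) asset is fetched
RewriteCond %{REQUEST_URI} /somerandomname\.png$ [NC]
RewriteRule .* - [L,co=dude:abides:%{HTTP:Host}:86400]

# Mark it non-cacheable so CDNs/proxies always pass the request through
# to the origin, where the cookie is set (requires mod_headers)
<Files "somerandomname.png">
    Header set Cache-Control "private, no-store, max-age=0"
</Files>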

If you really want to get clever and force the bots to do extra work, you could embed another explicitly non-cached image inside an SVG file, since SVG allows embedded images.
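
A minimal SVG along these lines could do it (the inner tracker.png name is hypothetical); only clients that actually parse and render the SVG will fetch the second, non-cached resource:

<svg xmlns="http://www.w3.org/2000/svg"
     xmlns:xlink="http://www.w3.org/1999/xlink"
     width="1" height="1">
  <!-- A real browser rendering this SVG will also request the inner image -->
  <image xlink:href="/tracker.png" width="1" height="1"/>
</svg>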

Basically, this forces bots to (a) request all of your resources, slowing them down, and (b) actually parse those resources, slowing them further.
