petskratt/robots.txt

WordPress robots.txt
# robots.txt for WordPress / v1.01 2020-01-27 / Peeter Marvet
# v1.01 - changed Crawl-delay to the maximum allowed by Yandex, i.e. 2.0
#
# Uncomment and specify the hostname (the sitemap can be on a different domain):
#Sitemap: https://example.com/sitemap_index.xml
#
# User-agent groups are not cumulative: if a bot finds a group matching its name, the * group is ignored.
User-agent: *
Crawl-delay: 2.0
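# Note: Googlebot ignores Crawl-delay; the directive is honoured mainly by Bing, Yandex and similar crawlers.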
# Block all requests with query parameters (search, WooCommerce filters, etc.):
Disallow: /*?*
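# The *?* wildcard is understood by Googlebot, Bingbot and Yandex. To let a
# specific parameter through, an Allow rule like the one below could be
# uncommented and adapted (the parameter name here is only an illustration):
#Allow: /*?s=*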
# Add every crawler to be blocked here (consecutive User-agent lines share the Disallow below):
User-agent: AhrefsBot
User-agent: AhrefsSiteAudit
User-agent: SemrushBot
User-agent: MJ12bot
Disallow: /
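# To block an additional crawler, append its token to the group above or add a
# new group like this (the bot name below is only an example):
#User-agent: DotBot
#Disallow: /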