@hobgoblina
Last active May 31, 2024 11:34
Update robots.txt GitHub Action via Dark Visitors

A workflow that fetches a robots.txt blocking known AI crawlers from the [Dark Visitors](https://darkvisitors.com/) API once a week and opens a pull request with the updated file.
name: Update robots.txt

on:
  schedule:
    # runs weekly, at 00:00 UTC on Sundays; can also be triggered manually
    - cron: '0 0 * * 0'
  workflow_dispatch:

env:
  # the path to your robots.txt file
  ROBOTS_PATH: robots.txt

jobs:
  update-robots:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Update robots.txt
        env:
          # create a secret named ROBOTS in your repo's settings
          # and set its value to your Dark Visitors access token
          API_KEY: ${{ secrets.ROBOTS }}
        run: |
          # for including other static stuff in your robots.txt
          # cp robots-base.txt $ROBOTS_PATH
          # --fail keeps an API error response from being written into robots.txt
          curl --fail --location 'https://api.darkvisitors.com/robots-txts' \
            --header 'Content-Type: application/json' \
            --header "Authorization: Bearer $API_KEY" \
            --data '{ "agent_types": [ "AI Data Scraper", "AI Assistant", "AI Search Crawler" ], "disallow": "/" }' \
            > $ROBOTS_PATH
          # use this instead if using the cp command above
          # >> $ROBOTS_PATH

      - name: Create pull request
        uses: peter-evans/create-pull-request@v6
        with:
          branch: robots.txt-update
          title: Update robots.txt
          commit-message: Update robots.txt
          labels: robots.txt
          body: This PR was generated by the `Update robots.txt` action and contains updates to your robots.txt file, pulled from [Dark Visitors](https://darkvisitors.com/).
          # reviewers: your-user-name
          add-paths: ${{ env.ROBOTS_PATH }}
          token: ${{ secrets.GITHUB_TOKEN }}
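
Note: when the pull request is created with the default `GITHUB_TOKEN`, the `peter-evans/create-pull-request` action generally needs write access to contents and pull requests, and the repository setting "Allow GitHub Actions to create and approve pull requests" (Settings → Actions → General) usually has to be enabled. A minimal sketch of the top-level permissions block you may need to add, depending on your repository's defaults:

permissions:
  contents: write
  pull-requests: write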
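If you enable the commented-out `cp` line to prepend your own static rules, `robots-base.txt` is a file you create in the repo yourself; the contents below are purely illustrative placeholders:

Sitemap: https://example.com/sitemap.xml

User-agent: *
Disallow: /admin/

Remember to also switch the curl redirection from `>` to `>>`, as noted in the workflow, so the rules fetched from Dark Visitors are appended after your static block instead of overwriting it.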