
@stansidel
Created June 25, 2013 10:52
A command to copy a default robots.txt into every site directory under /www/ that is missing one, preserving the file's mode and ownership
find /www/ -mindepth 1 -maxdepth 1 -type d '!' -exec test -e "{}/robots.txt" ';' -print0 | sudo xargs -0 -I target_folder cp /www/0default/robots.txt target_folder/ --preserve=mode,ownership
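The pipeline can be tried safely before touching /www/: the sketch below runs the same find/xargs combination against a throwaway directory created with mktemp. The $WWW variable and the site-a/site-b directories are stand-ins invented for this demo, not part of the original command; sudo is dropped since the sandbox is owned by the current user, and --preserve=mode,ownership assumes GNU cp.

```shell
#!/bin/sh
# Build a sandbox mimicking the assumed /www/ layout:
# a 0default dir holding the master robots.txt, plus two site dirs.
WWW=$(mktemp -d)
mkdir -p "$WWW/0default" "$WWW/site-a" "$WWW/site-b"
printf 'User-agent: *\n' > "$WWW/0default/robots.txt"
printf 'custom\n' > "$WWW/site-a/robots.txt"   # site-a already has one; site-b does not

# Same pipeline as above, pointed at the sandbox instead of /www/:
# find prints (NUL-terminated) each top-level directory lacking robots.txt,
# and xargs copies the default file into each of them.
find "$WWW" -mindepth 1 -maxdepth 1 -type d '!' -exec test -e "{}/robots.txt" ';' -print0 |
  xargs -0 -I target_folder cp "$WWW/0default/robots.txt" target_folder/ --preserve=mode,ownership
```

Note that 0default itself is skipped automatically: it already contains robots.txt, so the negated test excludes it, and site-a's existing file is left untouched.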