The robots.txt file is a plain-text file, defined by the Robots Exclusion Protocol, that tells search engine crawlers which parts of your site they may crawl and which they should skip, which in turn influences what appears in search results.
The file is served at the root URL of your site, e.g. https://domain.com/robots.txt, and can easily be added to your Django project.
Be sure to use the exact lowercase filename robots.txt and to spell its directives correctly, or crawlers will ignore the file.