Robots.txt File

We suggest creating a robots.txt file in the web root of the domain to address two issues:

  • Control the rate at which the website is crawled, which helps prevent a bot/spider from opening a massive number of database connections at once.

  • Prevent specific bots from crawling the website.

We use the following defaults; however, you may want to add or remove the denied user agents and adjust the crawl rate.

User-agent: *
Crawl-delay: 2

User-agent: Baiduspider
Disallow: /

User-agent: Sosospider
Disallow: /
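To see how crawlers that respect robots.txt interpret these defaults, you can check them with Python's standard-library `urllib.robotparser`. This is a minimal sketch; the user agent "Googlebot" is used only as an example of a bot not listed in the deny rules.

```python
from urllib.robotparser import RobotFileParser

# The default rules shown above.
rules = """\
User-agent: *
Crawl-delay: 2

User-agent: Baiduspider
Disallow: /

User-agent: Sosospider
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Baiduspider is denied everywhere; unlisted agents fall back to the * entry.
print(parser.can_fetch("Baiduspider", "/index.html"))  # False
print(parser.can_fetch("Googlebot", "/index.html"))    # True
print(parser.crawl_delay("Googlebot"))                 # 2
```

Note that `Crawl-delay` is advisory: only bots that honor the directive will slow down, which is why persistently aggressive bots are denied outright instead.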

Additional Resources

The links below provide additional information, as well as instructions for more refined usage of the robots.txt file.
