Robots.txt File
Updated over 2 months ago

We suggest creating a robots.txt file in the web root of the domain for two reasons:

  • To control the rate at which the website is crawled, which helps prevent a bot/spider from opening a massive number of database connections at once.

  • To prevent specific bots from crawling the website.

We use the following defaults; you may want to add or remove the denied user agents and adjust the crawl delay.

User-agent: *
Crawl-delay: 2

User-agent: Baiduspider
Disallow: /

User-agent: Sosospider
Disallow: /
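If you want to confirm the rules behave as intended before deploying them, Python's standard-library robots.txt parser can evaluate them locally. This is a minimal sketch; the example.com URLs are placeholders for your own domain.

```python
from urllib import robotparser

# Parse the default rules directly, without fetching them from a server.
rp = robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Crawl-delay: 2

User-agent: Baiduspider
Disallow: /

User-agent: Sosospider
Disallow: /
""".splitlines())

# Baiduspider is denied everywhere; other crawlers are allowed
# but asked to wait 2 seconds between requests.
print(rp.can_fetch("Baiduspider", "https://example.com/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/page"))    # True
print(rp.crawl_delay("Googlebot"))                              # 2
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but nothing technically enforces it.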

Additional Resources

The links below provide additional information, as well as instructions for more refined use of the robots.txt file.
