The robots.txt file is used to control how search engines crawl your site. It can restrict crawlers to certain pages and directories, or ask all robots to stay away entirely (though blocking crawling does not guarantee that pages stay out of the index). Note that Google does not follow the crawl-delay directive in robots.txt; Googlebot's crawl rate is managed through Google Search Console instead. If your site runs on a blogging platform, the platform usually generates a robots.txt file for you, so you rarely need to edit it yourself.
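The directives above can be sketched in a minimal robots.txt file. This is an illustrative example, not a recommended configuration; the directory names and the sitemap URL are placeholders:

```
# Applies to all crawlers
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Googlebot-specific rules override the general block above
User-agent: Googlebot
Allow: /

# Optional: point crawlers at the sitemap (hypothetical URL)
Sitemap: https://example.com/sitemap.xml
```

The file must live at the root of the domain (e.g. `https://example.com/robots.txt`); crawlers that honor the standard read the most specific `User-agent` group that matches them.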