Robots.txt is a file on your website that contains instructions on how it should be crawled. It is also known as the robots exclusion protocol, and most sites use it to control which parts of the site crawlers may visit. The file is easy to create, offers a handful of directives, and can be modified at any time later. The main directives used in this file are:
1. Crawl-delay - When a crawler sends too many requests in a short time, the server can become overloaded. This directive asks the crawler to wait between requests so it does not overwhelm the host.
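A minimal sketch of a crawl-delay rule (the 10-second value is illustrative; note that Crawl-delay is a non-standard directive and some crawlers, such as Googlebot, ignore it):

```
User-agent: *
Crawl-delay: 10
```

This asks all crawlers that honor the directive to wait roughly 10 seconds between successive requests.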
2. Allow and Disallow - Any number of URLs can be listed, though the file grows as the list does. These directives are useful when the site contains pages that you do not want crawled or indexed.
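A short sketch of Allow and Disallow rules (the paths are hypothetical examples, not from the original text):

```
User-agent: *
Disallow: /private/
Allow: /private/public-page.html
```

Here all crawlers are blocked from the /private/ directory, while the more specific Allow rule carves out a single page within it that may still be crawled.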