Robots.txt is a plain-text file placed in the root folder of a website. The role of this file is to tell search engines which parts of the site their crawlers may visit. Google and other search engines use robots, also called website crawlers, which go through and index the content on a website.
If you want to exclude some pages of a website from crawling, such as admin pages, you can very well do that. The Robots Exclusion Protocol defines the rules a robots.txt file uses to ask crawlers to skip those pages; well-behaved crawlers honor these rules, although the file is advisory and not a security control. This free robots.txt generator tool will easily generate the file for you based on the pages you want to exclude.
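As a minimal sketch, a generated robots.txt that excludes admin and login pages might look like the following (the paths and the example.com sitemap URL are placeholders, not output from any specific tool):

```
# Rules apply to all crawlers
User-agent: *
# Ask crawlers to skip these sections
Disallow: /admin/
Disallow: /login/

# Optionally point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The `User-agent: *` line applies the rules to every crawler; you can add separate groups (for example, `User-agent: Googlebot`) to give individual crawlers different rules.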