"The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engines) which pages on your site to crawl. It also tells web robots which pages not to crawl." - Neil Patel
What do you mean by 'the best' robots.txt file? Please give more details on what you need help with. If you are unsure how to create a robots.txt file, please see this tutorial or use the robots.txt generator that can be found here.
If you would like to block a page from search engines, simply add its path to a Disallow: line in your robots.txt file. For example:
# The * means these rules apply to every crawler
User-Agent: *
# Block any URL whose path starts with /admin or /example
Disallow: /admin
Disallow: /example
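If you only want to block one specific crawler, name it in the User-Agent line instead of using the * wildcard, and you can carve out exceptions with Allow (supported by Google and most other major engines). The bot name and paths below are just placeholders, swap in your own:

User-Agent: Googlebot
Disallow: /private
Allow: /private/press-release

You can also point crawlers at your sitemap from the same file:

Sitemap: https://www.example.com/sitemap.xml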
Keep in mind that robots.txt is advisory: reputable search engine crawlers obey it, but badly behaved bots can simply ignore it, so don't rely on it to hide sensitive pages. It's still best practice to have one. If you would like to learn more about the robots.txt file, just do a Google search for "robots.txt file and how it works".
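To double-check what your finished robots.txt actually allows, you can test it with Python's built-in urllib.robotparser module. A minimal sketch, assuming your site lives at https://www.example.com (swap in your own domain and paths):

from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt

# can_fetch(user_agent, url) is True when that bot is allowed to crawl the URL
print(parser.can_fetch("*", "https://www.example.com/admin"))  # False if /admin is disallowed
print(parser.can_fetch("*", "https://www.example.com/blog"))   # True if nothing blocks it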
Hope this helps!