Urged by a search engine expert colleague, Doug Karr of Compendium Blogware, I did some investigating into the robots.txt file and how it can help SEO. Here’s a good summary:

What is a robots.txt file?
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol. https://www.robotstxt.org/faq.html

Doug also said to create a sitemap page and point to it from the robots.txt file. This helps Google find new pages and index them accurately and promptly.
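To make that concrete, here’s a minimal sketch of what such a robots.txt file might look like. The domain, the disallowed path, and the sitemap location are all placeholders, not anything from Doug’s actual setup:

```
# Hypothetical robots.txt for example.com:
# allow all crawlers, keep them out of /admin/,
# and point them to the sitemap.
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the site (e.g. example.com/robots.txt), and the `Sitemap:` line is what lets crawlers discover the sitemap without you submitting it anywhere.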

What are Sitemaps?
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
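For illustration, a simplest-form XML sitemap along the lines the description above suggests might look like this (the URL and the metadata values are made-up placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page; only <loc> is required. -->
    <loc>https://www.example.com/</loc>
    <lastmod>2008-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Each `<url>` entry carries the metadata mentioned above: `<lastmod>` for when the page last changed, `<changefreq>` for how often it usually changes, and `<priority>` for its importance relative to your other pages.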

More info (source): Sitemaps.org

Doug says on his site about sitemaps: “This may be the most important thing you can do for your site!”

Thanks, Doug, for the advice.