What is a robots.txt file?

Urged by a search engine expert colleague, Doug Karr of Compendium Blogware, I did some investigating into the robots.txt file and how it can help SEO. Here’s a good summary:

What is a robots.txt file?
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol. (Source: https://www.robotstxt.org/faq.html)
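
To give a rough idea, a very simple robots.txt might look like this (the /private/ path is just a placeholder for whatever directory you want to keep crawlers out of):

User-agent: *
Disallow: /private/

The first line says the rules apply to all robots, and the second asks them not to crawl anything under /private/.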

Doug also suggested creating a sitemap page and pointing to it from the robots.txt file. This helps Google find new pages and index them accurately and promptly.
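
Pointing robots to your sitemap takes just one extra line in robots.txt (substitute your own domain for example.com):

Sitemap: https://www.example.com/sitemap.xml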

What are Sitemaps?
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
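
As a rough sketch of the format, a one-page Sitemap might look like this (example.com, the date, and the values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2009-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Each url entry lists one page, and the optional lastmod, changefreq, and priority tags supply the extra hints described above.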

More info (source): Sitemaps.org

Doug says this about sitemaps on his own site: “This may be the most important thing you can do for your site!”

Thanks, Doug, for your advice.
Tatum Hindman

About the author | Tatum Hindman

Tatum is the president of TBH Creative and is responsible for building long-term client relationships. She enjoys the strategy behind web design and collaborating with clients to define and execute online marketing goals. She likes to blog about hot topics in web design and digital marketing, as well as share tips for strengthening your online presence.
