Robots.txt Tester - Google Webmaster Tools

By Steven Spencer

Google has released another feature in Google Webmaster Tools that I'm very excited about: the Robots.txt Tester. Open up Webmaster Tools and look under 'Crawl'; you will see it listed there. When I perform an SEO audit on a website, the robots.txt file is one of the first five things I check. Novice webmasters will make changes to this file in an attempt to sculpt where Google places its link authority, and in doing so block vital resources. This new tool lets us easily check each page for errors. It's good for quick debugging page by page, but if you have hundreds of pages you should get an expert to review your robots.txt file.

Why Robots.txt is important for Search Engine Optimization

Improper usage of directives in this file can result in certain pages and page components of your website being blocked. This prevents Google from indexing them. Need I say more?
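For example, a single overly broad Disallow line can hide far more than you intended. The paths below are hypothetical, but the pattern is a common one:

```
User-agent: *
# Intended to hide internal search result pages...
Disallow: /search/
# ...but this line also blocks the directory holding themes and scripts.
Disallow: /sites/
```

A rule like `Disallow: /sites/` on a typical CMS install can block the stylesheets and scripts Google needs, not just the files you meant to hide.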

Page Layout Algorithm Update and Google Panda 4.0

In early 2012, Google began releasing the first of a series of Page Layout Algorithm Updates. This change was meant to penalize websites with too many ads above the fold. A robots.txt file that blocks access to web resources, such as CSS stylesheets, prevents Google from determining the layout of a page. Google Panda 4.0 also brought algorithm updates related to blocked web resources. If your rankings were impacted on or around May 19th, 2014, you may need to take technical action to fix these issues. Give us a call and we can help you!
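One way to address this is to explicitly allow stylesheets and scripts even when other directories are disallowed. This is a sketch with placeholder paths; Googlebot supports the `*` and `$` pattern characters shown here, but support varies among other crawlers:

```
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
Disallow: /private/
```

With rules like these, Googlebot can still fetch the CSS and JavaScript it needs to render the page and evaluate its layout.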

Fetch and Render

In addition to the robots.txt tester, there is another tool under the Crawl section of Google Webmaster Tools (select Fetch as Google) named 'Fetch and Render.' It allows you to not only fetch the source code of a webpage but also render it. Rendering a webpage is the process of visually compiling the user interface from code, graphics, and colors; in other words, loading all the components that make up the visual side a visitor will see. If you use the 'Fetch and Render' tool and Google returns a result that looks like a bare outline, or just not what your website should look like, there is a good chance you have blocked resources.

Blocked CSS and JS Web Resources

If the results of your Fetch and Render request show 'Googlebot couldn't get all resources for this page,' you will see a list of resources that need to be addressed. This is a good time to look at every line of your robots.txt file. It is fine to have no robots.txt file at all, so you should remove as many lines as possible. Take a look at the default robots.txt file that your CMS provides (there are exceptions here; not all CMSs are created SEO friendly) and use it as a reference. If you need some in-depth technical analysis of your robots.txt file, give us a call and we will be glad to help you out.
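If you want to audit a list of resource URLs yourself, Python's standard library includes a robots.txt parser. This is a minimal sketch; the rules and URLs are hypothetical examples, not taken from any real site:

```python
from urllib import robotparser

# Rules as they might appear in a problematic robots.txt file.
rules = """\
User-agent: *
Disallow: /assets/
Disallow: /scripts/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# CSS and JS under the disallowed directories are blocked for Googlebot,
# which can prevent Google from rendering the page correctly.
for url in ("https://example.com/assets/site.css",
            "https://example.com/scripts/menu.js",
            "https://example.com/index.html"):
    status = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, status)
```

Running a script like this against your real robots.txt and a list of the resources reported by Fetch and Render is a quick way to confirm exactly which rule is doing the blocking.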
