Insufficient indexing of a site is a typical problem for ecommerce stores.
Site owners struggle to attract even a small number of new visitors to their store, while the underlying problem is that search engines simply cannot index the store's pages because they cannot be crawled. As a result, the site ranks lower in SERPs and receives almost no traffic.
Identifying blocked resources with Google Webmaster Tools
Robots.txt is a very powerful tool for controlling which pages of your site search engines can access and which they cannot. Imprudent use of robots.txt directives can lead to serious problems, up to the exclusion of the entire site from search results. To help site owners and webmasters quickly see which pages are closed to Google's bots, the Blocked Resources report was created.
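As an illustration of how a single imprudent directive can do this kind of damage, here is a hypothetical robots.txt (the paths are invented for the example, not taken from any real site):

```
# This blocks the ENTIRE site from every crawler -- a common accidental
# misconfiguration that can drop the whole store out of search results
User-agent: *
Disallow: /

# What was probably intended: block only a private area, e.g.
# User-agent: *
# Disallow: /admin/
```

The difference is a single path: `Disallow: /` matches every URL on the host, while `Disallow: /admin/` matches only URLs under that directory.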
The report was announced to webmasters and developers on March 11th, so it is quite new. Here is how it looks:
The report lists the resources on your website that Googlebot cannot reach. Each line is clickable and shows the blocked resources on a particular host, along with the pages of your site that reference them. The system also suggests possible ways to unblock these resources.
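One typical unblocking option can be sketched as follows (the directory names here are assumptions for the example): if a blanket Disallow also covers your CSS and JavaScript, more specific Allow rules can re-open those directories, since Googlebot gives precedence to the most specific matching rule:

```
User-agent: Googlebot
Allow: /assets/css/
Allow: /assets/js/
Disallow: /assets/
```

With these rules, Googlebot can fetch the style sheets and scripts it needs to render pages, while the rest of `/assets/` stays closed.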
Since webmasters cannot always influence the behavior of external resources, Google shows only the hosts that webmasters themselves can change. The list of problem areas will not include URLs of external resources, such as popular web analytics services.
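You can also check locally which of your own resources a crawler is allowed to fetch, using Python's standard `urllib.robotparser` module. A minimal sketch, assuming a made-up rule set and example URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that accidentally blocks the scripts directory
rules = """\
User-agent: *
Disallow: /scripts/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A style sheet outside the blocked path is crawlable...
print(parser.can_fetch("Googlebot", "https://example.com/css/main.css"))   # True
# ...but the JavaScript under /scripts/ is not
print(parser.can_fetch("Googlebot", "https://example.com/scripts/app.js")) # False
```

Running a quick check like this against your own robots.txt before launch can catch blocked style sheets and scripts before they ever show up in the Blocked Resources report.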
Fetch as Google: two screenshots instead of one
We use browsers to see how our sites are displayed to users. But how do search engine bots see them? Quite differently, in fact: they cannot fully scan content and links concealed in images, style sheets, or scripts, so those parts of the site will not make it into the Google index.
Google has updated the "Fetch as Google" feature to address this specific issue. Previously, Google Webmaster Tools showed you only how Googlebot saw your website; now you receive two screenshots: one rendering of the page as search engines see it, and another as site visitors see it. By comparing these images, you can easily identify potential problems.
For ecommerce this is a great feature, because parts of a site that are invisible to Googlebot can be identified at the early stages of a site launch, helping owners avoid the extra expense of implementing features and then removing them after launch.
International Online Marketing Consultant at Promodo
Stan’s expertise covers search engine optimization, search engine marketing, website analytics and competitor analysis. He has worked on over 50 international projects in multiple niches, including ecommerce stores, gambling projects, software companies, and large ad and Q&A portals. Stan is an experienced speaker and has been a reader at online marketing schools since 2014.