A free extension that lets the user view the data search engines collect about a page and receive an alert when a page is blocked.
The product is easy to use and requires no prior knowledge of SEO.
Scanning of the Robots.txt File:
Search engines look for a robots.txt file in order to determine which pages not to crawl.
• An alert when a page is blocked
• Last scan data
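As a rough illustration of how robots.txt rules are interpreted, here is a sketch using Python's standard robotparser; the rules and URLs are made-up examples, not the extension's own code.

```python
import urllib.robotparser

# Hypothetical robots.txt rules for illustration only.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# A page under /private/ is blocked; anything else is allowed.
print(rp.can_fetch("*", "https://example.com/private/page"))  # False
print(rp.can_fetch("*", "https://example.com/public/page"))   # True
```

A blocked result for a page the site owner wants indexed is exactly the situation the extension's alert is meant to catch.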
Meta Robots Tags:
An alert when a tag blocks indexing (noindex) or instructs search engines not to follow the links on a page (nofollow).
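A hedged sketch of what detecting such a tag might look like, using Python's standard HTML parser on a made-up page fragment; the RobotsMetaFinder class is hypothetical, not the extension's implementation.

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the directives of any <meta name="robots"> tag (hypothetical helper)."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower()
                                for d in a.get("content", "").split(",")]

finder = RobotsMetaFinder()
finder.feed('<head><meta name="robots" content="noindex, nofollow"></head>')
print(finder.directives)  # ['noindex', 'nofollow']
```

Finding "noindex" or "nofollow" in the collected directives is the condition that would trigger the alert.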
Checking and displaying the redirect path, including:
• The full redirect chain
• HTTP status codes
• The redirecting URL
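To illustrate what a redirect path report contains, here is a hypothetical sketch; the trace_redirects helper and the URL chain are invented for illustration and are not the extension's implementation.

```python
# Walk a redirect chain, recording each URL and the HTTP status
# that sent the browser onward (made-up data, no network calls).
def trace_redirects(start_url, responses):
    """responses maps each URL to (http_status, next_location_or_None)."""
    path, url = [], start_url
    while url is not None:
        status, location = responses[url]
        path.append((url, status))
        url = location
    return path

chain = trace_redirects("http://example.com/old", {
    "http://example.com/old":  (301, "https://example.com/old"),
    "https://example.com/old": (302, "https://example.com/new"),
    "https://example.com/new": (200, None),
})
for url, status in chain:
    print(status, url)
```

The printed list mirrors the three items above: every hop in the route, its status code, and the URL that issued the redirect.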
Website IP Address:
The extension shows the IP address of the website's server.
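For illustration, a server's IP address can be looked up with Python's standard socket module; "localhost" is used here only so the example resolves everywhere, and this is a sketch rather than the extension's method.

```python
import socket

def server_ip(hostname):
    """Resolve a hostname to its server's IPv4 address (illustrative helper)."""
    return socket.gethostbyname(hostname)

print(server_ip("localhost"))  # typically 127.0.0.1
```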
No Follow Links:
This attribute instructs search engines not to follow the link.
• The extension scans the page and highlights in yellow all the links that contain rel="nofollow" (images and other non-textual elements are supported as well)
• There is an option to disable the highlighting of those links
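A minimal sketch of the detection step (not the highlighting itself), using Python's standard HTML parser; the NoFollowFinder class and sample links are hypothetical.

```python
from html.parser import HTMLParser

class NoFollowFinder(HTMLParser):
    """Collects the href of every <a> whose rel attribute contains nofollow
    (hypothetical helper for illustration)."""
    def __init__(self):
        super().__init__()
        self.nofollow_hrefs = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # rel can hold several tokens, e.g. rel="nofollow sponsored".
        if tag == "a" and "nofollow" in a.get("rel", "").lower().split():
            self.nofollow_hrefs.append(a.get("href"))

links = NoFollowFinder()
links.feed('<a href="/a" rel="nofollow">ad</a><a href="/b">editorial</a>')
print(links.nofollow_hrefs)  # ['/a']
```

The collected hrefs are the links the extension would highlight in yellow.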
Title and Meta Description Tags:
These tags are displayed in Google's search results.
• Displaying the tags' details
• An alert when a tag appears more than once
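To show what "appears more than once" means in practice, here is a hypothetical sketch that counts title and meta description tags in a made-up head fragment; it is not the extension's code.

```python
from html.parser import HTMLParser

class HeadTagAudit(HTMLParser):
    """Counts <title> and <meta name="description"> tags so duplicates
    can be flagged (hypothetical helper)."""
    def __init__(self):
        super().__init__()
        self.titles = 0
        self.descriptions = 0

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self.titles += 1
        elif tag == "meta" and a.get("name", "").lower() == "description":
            self.descriptions += 1

audit = HeadTagAudit()
audit.feed('<head><title>A</title><title>B</title>'
           '<meta name="description" content="An example page."></head>')
print(audit.titles)  # 2 -> the tag appears more than once, so an alert would fire
```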
Canonical Tag:
This tag prevents the same content from appearing twice.
When two or more URLs display identical or similar content, we do not want the search engine to crawl all of the duplicated pages, so we choose one URL to declare on all of them. For example, one URL sorts products by price from lowest to highest, and another sorts by price from highest to lowest. The content in that example is almost identical, so if we wish to prevent search engines from crawling both, we present the same single URL on both pages.
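As an illustration, the single chosen URL is declared with a canonical link element; the sketch below extracts it with Python's standard HTML parser, and the class and URL are made up for the example.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extracts the href of a <link rel="canonical"> element
    (hypothetical helper with a made-up URL)."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and "canonical" in a.get("rel", "").lower().split():
            self.canonical = a.get("href")

cf = CanonicalFinder()
cf.feed('<head><link rel="canonical" '
        'href="https://example.com/products?sort=price-asc"></head>')
print(cf.canonical)
```

In the sorting example above, both the low-to-high and high-to-low pages would carry this same canonical URL.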