When a new website is launched, the first thing to consider is whether the search engines have indexed your site. If they haven't, your pages will not appear in search results at all.
So, let's check how to find out.
Identifying Crawling Problems
The very first step is to type site:example.com into the Google search box and check how many of your pages have been indexed. If Google shows far fewer pages than your site actually contains, your site has a crawling problem.

The second step is to look at the crawl errors report in Google Webmaster Tools; any crawl errors on the site are displayed in the dashboard. A 404 error means a linked page cannot be found. A 301 is a permanent redirect rather than an error, but if your internal links still point at the old URLs, visitors and crawlers are bounced through redirects instead of reaching your pages directly.
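The status-code checks above can be sketched with Python's standard library. This is an illustrative helper, not Google's actual tooling; the URLs are placeholders for your own pages, and redirects are deliberately not followed so that 301s show up as their raw codes.

```python
# Sketch: fetch each page WITHOUT following redirects, so 301s and 404s
# surface as raw status codes the way a crawl-error report would show them.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # surface 3xx responses as HTTPError instead of following

def crawl_status(url: str) -> int:
    """Return the raw HTTP status code for a URL (placeholder URLs below)."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        with opener.open(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 404 or 301

def classify_status(code: int) -> str:
    """Map a status code to the action a crawl report would suggest."""
    if code == 404:
        return "not found: fix or remove the broken link"
    if code == 301:
        return "moved permanently: update internal links to the new URL"
    if 200 <= code < 300:
        return "ok"
    return "needs investigation"

if __name__ == "__main__":
    for url in ("https://example.com/", "https://example.com/old-page"):
        code = crawl_status(url)
        print(url, code, classify_status(code))
```

Running this over a list of your site's URLs gives a quick local approximation of the dashboard's error list.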
Fixing Crawling Errors
Crawling errors are often caused by the following issues.

Robots.txt
The robots.txt file is a plain text file that sits in the root of your website and tells search engine spiders/bots which directories or files they may or may not crawl. Its rules can apply to all bots or only to specific ones, identified by user agent.

Structure of a Robots.txt File
User-agent: *
Disallow:
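You can test how a given robots.txt file will be interpreted with Python's standard-library parser. The rules below are illustrative only (they block a hypothetical /private/ directory); swap in your own file's contents.

```python
# Sketch: check which URLs a robots.txt file allows, using the
# standard-library parser. The rules here are illustrative only.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://example.com/index.html"))  # True: allowed
print(parser.can_fetch("*", "https://example.com/private/x"))   # False: blocked
```

This lets you confirm you haven't accidentally disallowed pages you want indexed before the change goes live.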