Tuesday, January 29, 2013

8 Reasons Why Your Site Might Not Get Indexed

When a new website is launched, one of the first things to check is whether the search engines have indexed it. If they haven't, your site will not appear in the search results at all.
So, let's see how to check.

Identifying Crawling Problems 

The first step is to type site:example.com into the Google search box and compare the number of results with the number of pages your site actually has. If Google is showing far fewer pages than you actually have, your site has an indexing problem.
The second step is to look at the Google Webmaster Tools report. Any crawl errors on the site are displayed in the dashboard: a 404 error means a linked page cannot be found, while a 301 status means the page is being permanently redirected, so visitors and crawlers never reach the original URL.

Fixing Crawling Errors

Indexing problems are usually caused by one of the following issues:

Robots.txt

Robots.txt is a plain text file placed in the root of your website that tells search engine spiders/bots which parts of the site they may crawl and index. Rules inside the file can target specific bots and allow or block individual directories and files; if the file accidentally blocks everything, search engines will not index your pages.

Structure of Robots.txt file

User-agent: *
Disallow:

This tells all bots (the * wildcard) that nothing is disallowed, so the whole site may be crawled. By contrast, Disallow: / blocks the entire site from being indexed.
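For example, to let bots crawl the site but keep them out of a single directory (the directory name here is only an illustration), the file would look like this:

User-agent: *
Disallow: /private/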

.htaccess

.htaccess is a configuration file used by Apache web servers to control server behaviour on a per-directory basis. It is commonly used to set up redirects and control access to pages. Because a misconfigured .htaccess file can block crawlers or take the whole site down, keep a saved copy of the file so you can restore it quickly and recover from a site crash.
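As a minimal sketch (the file names and domain are placeholders), a permanent 301 redirect from an old page to a new one can be set up in .htaccess like this:

Redirect 301 /old-page.html http://www.example.com/new-page.html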

Meta Tags:

Meta tags are a special set of HTML tags in the head of a page that contain information about the page and its contents. Search engine robots are programmed to read these tags; the main ones spiders look at are the description and keywords meta tags, and the robots meta tag can explicitly tell spiders not to index a page. If a page is not getting indexed, check the meta tags in its source code.
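For instance, a page whose head contains the following tag will be kept out of the index, so make sure it is not there by accident:

<meta name="robots" content="noindex, nofollow">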

Site Maps:

A sitemap is an XML file that lists the URLs of your website so that search engines can discover every page. It also helps to link to a site map from all important pages of the website, including the homepage, so that it can be easily detected by the search engines and PageRank can flow through it to the rest of the site.
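A minimal XML sitemap, using example.com as a placeholder domain and an illustrative date, looks like this and is normally uploaded to the site root as sitemap.xml:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-29</lastmod>
  </url>
</urlset>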

Poor Internal Linking Structure:

This is a fairly rare cause, but a poor internal linking structure can leave pages with no path for crawlers to reach them, and those pages will not be indexed. A good internal linking structure, where every page can be reached through links from other pages, helps the whole website get crawled.

Page Rank

The number of pages Google crawls on a site is related to its PageRank, so a site with very low PageRank may only have a fraction of its pages crawled and indexed.

Connectivity or DNS Issues:

When a site is down for maintenance or has just been moved to a new host, DNS changes can temporarily block crawler access until the new delegation has propagated. Check that the domain resolves correctly and that the server responds before looking for other causes.
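A quick way to confirm that the domain is resolving (example.com is a placeholder) is to run a DNS lookup from the command line:

nslookup www.example.com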
