Wednesday, January 30, 2013

Tactics To Increase Your Website Crawl Rate

Google Webmaster Tools lets you view the crawl status of your website. It also allows you to submit a sitemap, which helps Googlebot index your pages more easily. The Crawl Stats report graphs Googlebot's activity on your site over the past 90 days, and the crawl errors the spider reports help you find problems on your site.

Crawl stats give you statistical data about Googlebot's activity on your website. Googlebot decides how often to crawl your site based on main factors such as PageRank, inbound links, traffic, and domain age.
Search engines keep their index of your website fresh according to its crawl rate. If the crawl rate of your site is low, traffic tends to drop as well, because new and updated pages take longer to be indexed.

A few suggestions to increase your crawl rate:


1. Update Content Frequently: Search engine crawlers treat fresh, unique content as a primary signal. Updating content on a regular basis is hard work, but it pays off: update your blog a couple of times a week and the bot will crawl it more frequently. A blog is the simplest way to produce new content regularly.
2. Optimize Load Time: Keep page load time short, because the crawler has a limited budget per visit; heavy images and JavaScript eat into it, leaving no time for the crawler to visit other pages. Use well-compressed PNG or JPG images, and consider a caching plugin such as W3 Total Cache for WordPress to serve cached pages to users quickly.
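
As a quick sanity check on load time, you can time how long a fetch takes. A minimal sketch; the stand-in callable below simulates a slow page, and the commented-out fetch is just one possible real implementation:

```python
import time

def measure_seconds(fetch):
    """Time a zero-argument callable (e.g. a page fetch) and return
    (elapsed_seconds, result)."""
    start = time.perf_counter()
    result = fetch()
    return time.perf_counter() - start, result

# In practice you would pass a real fetch, for example:
#   lambda: urllib.request.urlopen("https://example.com").read()
# Here we time a stand-in that simulates a slow page.
elapsed, _ = measure_seconds(lambda: time.sleep(0.1))
print(f"page took {elapsed:.2f}s")
```

Run it against your slowest pages before and after enabling caching to see whether the change actually helped.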

3. Add Sitemap: Submitting a sitemap helps Googlebot crawl your site easily and increases the crawl rate. A sitemap is the best way to get your new pages and updated posts crawled, and in WordPress it is easy to submit sitemaps for images and videos as well.
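
A sitemap is just an XML file listing your URLs in the sitemaps.org format. A minimal sketch that generates one (the URLs are placeholders):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Example with placeholder URLs:
xml = build_sitemap(["https://example.com/", "https://example.com/blog/"])
print(xml)
```

The protocol also supports optional `lastmod`, `changefreq`, and `priority` tags per URL, which can be added as further sub-elements in the same way.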

4. Get Good Page Rank: A website's PageRank is a main criterion that determines its crawl rate. Sites with high PageRank get crawled by the bot regularly. So, to increase the PageRank of your website, build quality inbound links, create unique content, and improve your overall SEO.

5. Avoid Duplicate Content: Website content must be unique. Many bloggers take content from another site, modify it slightly, and publish it as their own. Google's algorithm is very strict on this point: if it finds duplicate content, it penalizes the site and reduces its rank. Duplicate content also harms your crawl rate.

6. Use Ping Services: Pinging is the best way to let Googlebot know when your site's content has been updated. Use ping services like Ping-O-Matic and Pingoat to increase your crawl rate. WordPress can be configured with additional ping services, so search engines are notified automatically whenever a new post is published.
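
Under the hood, these services accept the standard weblogUpdates.ping XML-RPC call. A sketch of the request body (the endpoint in the comment is Ping-O-Matic's historical XML-RPC URL; check the service's own documentation before relying on it):

```python
import xmlrpc.client

def ping_payload(blog_name, blog_url):
    """Build the XML-RPC body for the standard weblogUpdates.ping call
    that services like Ping-O-Matic accept."""
    return xmlrpc.client.dumps((blog_name, blog_url),
                               methodname="weblogUpdates.ping")

# Actually sending it is one call (endpoint is an assumption here):
#   proxy = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
#   proxy.weblogUpdates.ping("My Blog", "https://example.com/")
print(ping_payload("My Blog", "https://example.com/"))
```

WordPress does exactly this for every URL listed under Settings → Writing → Update Services.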

If you want the bot to visit your website on a regular basis, keep updating it with the factors above in mind, and your crawl rate will increase too.


Search engine optimization is the process of increasing a site's visibility in search results and attracting traffic from free, natural listings. Negative SEO is the complete opposite: its aim is to bring your website down in the search engine rankings. Competitors can target your website and hurt your rankings by building bad backlinks and using other negative SEO tactics.

Negative SEO Tactics

Following are some of the negative SEO tactics that your competitors can use to bring down your website's ranking in search engine results.

1. Paid Linking: Buying and selling links in order to pass PageRank violates Google's quality guidelines, and Google has asked all webmasters to remove paid links. For links you have no control over or are unsure about, submit a reconsideration request that lists them.

2. Stealing Content Before It Gets Indexed: New content added to your website is not indexed immediately, and competitors can steal it before it is. You can reduce the risk by updating and re-submitting your sitemap consistently whenever new content is published on your site. You can check whether your content is being duplicated on other websites by running a Google search for a sentence from your page enclosed in quotes.
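
Building that quoted, exact-phrase search is easy to automate. A small sketch (the sentence is a placeholder):

```python
from urllib.parse import quote_plus

def quoted_search_url(sentence):
    """Build a Google search URL for an exact-phrase query, i.e. the
    sentence wrapped in double quotes."""
    return "https://www.google.com/search?q=" + quote_plus(f'"{sentence}"')

url = quoted_search_url("a distinctive sentence from your page")
print(url)
```

Pick a sentence that is unlikely to appear anywhere else; generic phrases will match thousands of unrelated pages.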

3. Fake Reviews: A competitor can add fake reviews to a business listing that appear as if you created them. Prevent this by monitoring your reviews and using the "Report a problem" link so that Google is notified about such reviews.

4. Site Speed: Heavy, malicious crawling can overload your server and cause loading problems for real visitors. You can block malicious crawlers from accessing your site while still allowing trusted ones like Googlebot and Bingbot. Knowing how to verify Google's and Bing's crawler IP addresses is useful for identifying and stopping bad crawlers.
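
Both Google and Bing document reverse DNS as the reliable way to verify their crawlers, since user-agent strings are trivially spoofed. A sketch (the trusted-suffix list reflects their published domains; confirm against current documentation):

```python
import socket

TRUSTED_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def hostname_is_trusted(hostname):
    """Check whether a reverse-DNS hostname belongs to a trusted crawler."""
    return hostname.rstrip(".").endswith(TRUSTED_SUFFIXES)

def crawler_ip_is_trusted(ip):
    """Full check (requires network): reverse DNS on the IP, then confirm
    the hostname resolves back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    return (hostname_is_trusted(hostname)
            and ip in socket.gethostbyname_ex(hostname)[2])
```

A request claiming to be Googlebot whose IP fails this check can be safely rate-limited or blocked.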

5. DMCA Removal Requests: After identifying your most valuable backlinks, a competitor can email the linking webmaster claiming that the link to your website infringes copyright and should be taken down. You can guard against this by establishing a relationship with the linking sites, for example by emailing them to thank them for the link.

6. Monitor Your Backlink Profile: Monitoring your backlink profile with tools such as Majestic SEO and Open Site Explorer plays a crucial role in spotting suspicious new links early, and the same tools help you find opportunities for quality backlinks.

7. Monitor Your On-site SEO Activities: Apart from the backlinks your site is receiving, competitors can also gather information about your press releases, social media optimization, and so on. Check on your competitors frequently to see whether they are using any strategy to pull your website down in the SERPs.

Off Page Optimization Techniques

Off page optimization is the work done away from the page itself, using a number of factors to improve the page's ranking for targeted keywords.

The major factors that affect off page optimization are listed below:

1. Social networking
2. Directory submission
3. Article submission
4. Press release
5. Blog submission
6. Forum postings
7. Social bookmarking
8. Link popularity
9. Link exchanging
10. Link building, and a few more.

1. Social Networking: Sign up on social networking sites like Facebook, Twitter, and Google+, where anyone can post updates. Sharing your website and communicating with friends on these networks builds up its promotion.

2. Directory Submission: Directory submission is one of the major factors that can improve the PageRank of a site. Submit information about your business or services to well-known directories like Yahoo, DMOZ, and Pegasus, as well as niche directories.

3. Article Submission: Write content-rich articles, and never copy them; unique content is what earns quality backlinks and increased traffic. Sign up on popular article submission sites such as Ezine and Squidoo and submit your articles there.

4. Press Release: Press release sites such as Google News, Yahoo News, and PR Newswire let you reach people globally. Submitting quality press releases can improve organic traffic and earn a better position in search results.

5. Blog Submission: This is one of the major ways to promote your business online. Blogs written to promote your business should have unique content so that they can attract heavy traffic. Submit your blog to RSS directories, WordPress, and similar platforms where readers can leave comments, and use that feedback to refine your posts.

6. Forum Postings: Forums are online discussions where anyone can post and share their views. Always post comments with a signature link. Forum posting builds relationships with other members and helps promote your sites.

7. Social Bookmarking: Social bookmarking is a way to store links to useful documents online. Bookmarking lets users save useful information instantly and share it with others, which increases online visibility.

8. Link Popularity: Link popularity is determined by the number of quality backlinks to your site; the more backlinks point to it, the more popular it is. It comes in two forms: internal link popularity, the number of links between pages within a site, and external link popularity, the number of links pointing back to the site from elsewhere.

9. Link Exchange: Link exchange means trading links between mutually cooperating website owners to improve rank positions and drive traffic. There are three types: one-way, two-way, and three-way link exchanges.

10. Link Building: Building quality links with external sites improves how search engines rank the page. If the links use keyword-rich anchor text, the site's position in search results improves.

How To Get Traffic From Pinterest

Pinterest is the latest social media network people are using for advertising. Social network users may dismiss it as just another site, but in reality it is quite different from similar sites, and you can use Pinterest as a free promotional channel for your business.

What is Pinterest, and how is it different from other social networks?

Pinterest is an online virtual pinboard: a place where you can pin images and other interesting things and share them with the world. As on other social media networks, you follow people and have followers. In addition, users can repin your pins, which is how your posts spread.

Though you can pin any type of content on Pinterest, it is not suitable for every brand. It works best if your product or service can be presented visually; in that case, publicity on Pinterest can bring you real traffic.

Tips to make the most out of Pinterest for the benefit of your business

To make the most of Pinterest rather than wasting your time or damaging your brand, you need to know how to use it. Here are some tips:

Be active on Pinterest

As with other social media networks, you need to spend time on Pinterest on a daily basis, following what others are pinning, in order to get real benefits from it.

Use high quality images

Pinterest is all about visual appeal, so you can't get away with poor-quality images. Use quality images or you will do your product more harm than good. That doesn't mean you must hire a professional photographer; use your own skills to create good images.

Post in series

When you keep your followers interested, they will visit your board regularly. Post to your board in a series rather than all at once. For example, if you have 10 images to post, don't post them all at a time; post them one by one.

Add a Pin-It button to your site

If you want repins for your posts, make it easy for users to repin them. Do this by adding a clearly visible Pin It button to your site. This is a simple way to increase the pins and repins for your website.

Post interesting stuff not limited to your products/services only

People rarely visit your board just to see your products, so post interesting pictures that attract pins. This draws new visitors to your account: when they see a good image from you, chances are they will visit your profile and your products. A board made up only of product shots won't attract many visitors.

Write interesting and meaningful descriptions

Good images are the attraction, but even the most striking photo benefits from a good description. A good description also improves how the image ranks, which brings you traffic from search engines. Keep your descriptions brief and specific, and users will appreciate it.

Use hashtags (wisely) and categorize your boards

Using hashtags and categorizing your pins makes them look organized and helps users find them easily. On the other hand, don't use hashtags every time; use them only when needed, otherwise your pins will look spammy and your visitors may decrease.

Follow users with large followers groups

When you choose whom to follow, consider not only whether their pins are interesting but also whether they have a large follower base. If such users follow you back, you may gain a huge follower base yourself, which is a sure way to increase pins and repins of your posts. So choose power users for whom one repin can bring many more repins to you.

Repin interesting stuff

As on any social media network, interaction with other users is of crucial importance on Pinterest. It makes sense to like and comment on other people's pins; in return you can get repins, likes, and comments from other users. However, don't repin, comment on, or like every single pin you see, because that dilutes the value of your shares.

Use videos, charts, infographics

Pinterest is not only about good images; it has a video section as well. It is not YouTube, but you can still upload videos about your business. In addition to images, you can post charts and infographics, which are popular among users.

Pinterest is an interesting place, and its main benefit is publicity: valuable promotion that means more traffic and more sales for your site.

Helping Webmasters with Hacked Sites

Having a website hacked is an unexpected shock for site owners. Webmasters need help both to prevent their sites from being hacked and to recover afterwards. When a site does get hacked, clean-up steps and resources are what a webmaster needs most.

The important thing is that the pages users reach from search results are safe. When a site is hacked, it may serve malicious content, and Google alerts users by displaying a warning next to such sites in its search results.

If you act on the information provided to webmasters for cleaning up your website, it can recover quickly. In the Webmaster Tools report you may see a message that your site has been hacked, and Google's search results can alert visitors with a label such as "This site may harm your computer". In some cases the details appear in the malware section of your Webmaster Tools report.

Injected Content

Hackers inject spam content and links into sites at will, and the injected links are often hidden, which makes them difficult for webmasters to detect. The site may even be compromised in such a way that the spam content is only displayed when crawlers visit the site.

When you find this sort of thing, you can check what Googlebot actually sees by using the Fetch as Google tool in Webmaster Tools. Look for the injected source in your .php files and template files.

Redirecting Users

The hackers' main goal is often to redirect users to malicious sites, either targeting all users or specifically those arriving from search engines. If you visit your site and get redirected to another page, you can be fairly sure it has been hacked.

Checking the server configuration files can also reveal useful information, because hackers often modify them to serve different content to different users.

The malicious behavior can also be accomplished through links injected into the JavaScript of your source code. Such JavaScript is usually designed to hide itself from view, so look for terms like eval, decode, and escape.

Cleanup and Prevention

You can manage the clean-up through Google Webmaster Tools. After you request a review, Google re-checks the site to confirm the problem has been fixed; this may take some time, but once the review is processed, the malware warning is removed.

Ecommerce Content Doesn't Have to Suck

Just because we run an ecommerce website doesn't mean we have an excuse to neglect quality content. Ecommerce sites offer plenty of opportunities for content that pays off from both an SEO and a sales perspective.

Write Product Descriptions That Don't Suck

The majority of ecommerce websites don't write descriptions specifically for their products. They reuse the "back of the box" copy, combine it with specifications, and call it good; it isn't. Product descriptions are perhaps the easiest place for a company to write quality content. Speak to customers in a way that helps from an SEO perspective and expresses your brand's personality.

In our case, besides an increase in entrances, there was an increase in the time customers spent on product pages and a decrease in the exit rate.

Ultimately it's all about understanding the needs of our audience. Because our audience consists primarily of pool players, we had pool players write our product descriptions, and we also tapped our own organization's resources to write them.

Give Customers Content they'll Actually Find Useful and Interesting

The primary purpose of our site is to sell product, but that doesn't excuse us from creating content our customers will find useful, amusing, or thought-provoking. For example, we are in the business of selling pool cues, so one of the first non-sales pieces of content we created was our "Anatomy of a Pool Cue" graphic. We didn't create it as link bait; we created it because many customers had asked for it. If other sites pick it up, great, but our aim was simply to help our customers help themselves. "Does this help our customers?" is the question we should ask before adding any new non-product content to our site.

Do Research and Report Your Findings

When you sell products in a niche, finding useful data can be a problem, and research can solve it. For example, every pool cue tip has a different hardness, but most manufacturers simply list them with vague labels that don't really help customers. Since we couldn't find better data, we set up our own test parameters and measured all of our tips. The results let us attach a specific number to each tip, which allowed our customers to make a more educated decision. And because new products arrive constantly, we can continually update the content on this page.

Give Your Customers Articles with Actionable Points

To increase sales, talk to your customers a lot. We do, so we know what matters to them from a pool-playing perspective. Our customers are always looking for ways to improve their game, so we work with expert players to write quality, easily searchable articles that give customers actionable advice.

Use Your Blog for More Than Just Self-Promotion

Blogging is not easy, especially corporate blogging. We don't blog as much as we should, but when we do, we post information our customers actually want. The posts vary quite a bit, but our most popular ones show off our company's personality. In the end, it all depends on what the audience searches for and needs; whenever we write anything, we should think about what readers will like. If our content is unique, the backlinks and shares will come on their own. Even if they don't, we know the content creates a quality experience for our customers, and the visits and business will follow.

Importance of Quality Website Content

Quality content on your website plays a vital role in promoting your site. It should be unique, user friendly, and relevant to users' search queries. To produce high quality content, website content writers should use relevant keywords and rich information related to the site. When ranking a website, search engines treat quality content as one of the most valuable signals.

From an SEO point of view, website content plays a key role because it drives traffic to your site. Better PageRank and more quality backlinks both depend on unique content, and content must be attractive to the internet marketers who link to your site.

Why Create Quality Website Content?

Quality content is one of the key factors in making your website succeed and improve. By building quality content, you increase search traffic, gain more conversions, and earn more natural, relevant, quality links.

Factors for Creating Quality Website Content

The following factors are to be considered to generate high quality content:

The main aim of creating quality, unique content is to provide specific, in-depth information of value to readers.

Through keyword research and analysis, you get more ideas on how to provide quality content.
The content should be relevant to the main topic of the website. Choose the right tone for your content and benefit readers by providing unique information.

The information you provide on the website should be interesting to visitors and customers. By keeping the information up to date, you bring customers back to the site and gain their trust.

You can convert visitors into customers by providing valuable content. It is important to focus on visitors' needs and offer effective ways to meet them. Make sure to describe the benefits of your website and the information on it, and how they relate to visitors' needs.

Your content should invite communication with your audience through questions and comments on your website. This helps visitors contact you and understand the services you offer.

Recheck your website's content to find grammar mistakes, broken links, and any information that was left out. This also helps keep the content up to date.
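
The broken-link part of that check is easy to automate. A minimal sketch, where `get_status` is whatever fetcher you use (stubbed here with a dictionary; a real one might call `urllib.request`):

```python
def find_broken_links(links, get_status):
    """Return the links whose HTTP status indicates an error (>= 400).
    `get_status` maps a URL to its status code, so a real HTTP client
    can be plugged in without changing this logic."""
    return [url for url in links if get_status(url) >= 400]

# Stubbed example; a real fetcher could be:
#   def get_status(url):
#       import urllib.request
#       return urllib.request.urlopen(url).status
statuses = {"https://example.com/": 200, "https://example.com/old": 404}
print(find_broken_links(statuses, statuses.get))  # → ['https://example.com/old']
```

Running this over the URLs in your sitemap on a schedule catches broken links before visitors or crawlers do.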

Benefits of Quality Content

1. Greater Chance of Organic Links: A hyperlink from someone else's website to yours is called an organic link. Quality content helps here in many ways; for example, when more backlinks point to your site, it earns a higher PageRank.

2. Improved Visitor Experience: Visitors spend more time reading valuable content and get a better idea of your products. There is a higher chance that a visitor who spends time on your site converts into a customer.

3. Increase in Conversions: As quality content brings more traffic, your customer base grows, and with it the chance of conversions through purchases of your products.

4. Improved ROI: Quality content plays a crucial role in improving your rankings and increasing ROI. After the Google Panda and Penguin updates, quality content has become essential. When your content is unique, you can expect higher traffic and sales, which in turn increase ROI.

Is Hiding Ads from Google Considered as Cloaking?


Cloaking is a practice in search engine optimization in which different content or URLs are presented to humans and to search engine bots: what users see is completely different from what search engines read. It is considered a spam technique because it violates Google's Webmaster Guidelines by giving Google's users completely different results than they expect.

Examples of Cloaking:

Serving search engines an HTML page of text while showing users the same page built from images and Flash.
Inserting text or keywords into a page only when the User-Agent requesting the page is a search engine rather than a human visitor.

Cloaking Ads:

Hiding ads from Google sounds similar to cloaking, where content and URLs are hidden, and cloaking violates Google's terms and policies.
Consider the common case where a webmaster shows ads only to users who have signed up and logged in. Users who are not logged in see no ads, whether they are registered or not; only logged-in users see them. Googlebot cannot log in, so it sees exactly what a non-logged-in visitor sees. This is not cloaking: Googlebot and non-logged-in users are served the same thing, and there is no issue, because Googlebot never logs in and doesn't even know that logged-in users are served different content.
If you show Googlebot one thing and non-logged-in users something different, that can be considered cloaking; it really revolves around your purpose in serving different content. If you show Googlebot one version in order to rank for a particular keyword phrase and then serve visitors something totally different, you will be penalized.
So hiding ads from Google and from non-logged-in members is not cloaking, because both Googlebot and non-logged-in users see the same content; only logged-in members see something different.
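
The safe rule above can be sketched as a tiny request handler: content varies by login state only, never by whether the visitor is a bot (the function and strings are illustrative, not any real framework's API):

```python
def page_content(is_logged_in):
    """Vary content by login state only. Googlebot never logs in, so it
    always gets the same page as any other non-logged-in visitor --
    which is why this pattern is not cloaking."""
    base = "article text"
    return base + " + ads" if is_logged_in else base

# Googlebot and an anonymous visitor see the identical page:
assert page_content(is_logged_in=False) == "article text"
# Only logged-in members see something different:
assert page_content(is_logged_in=True) == "article text + ads"
print("same page for bot and anonymous visitor")
```

The cloaking version of this function would branch on the user-agent instead of the login state; that is the line not to cross.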

Better not to ‘Depend’ on Google Traffic

Google algorithm updates have a significant effect on the traffic driven to sites. Most online businesses depend on Google for traffic; it is often estimated that around 70% of users turn to Google when searching for anything.

Google algorithm effects:

The latest Penguin, Panda, and EMD updates have hit some online businesses very badly, leaving many with no traffic, no sales, and no customers. Whenever Google runs an algorithm refresh or update, there is a high chance of a website dropping in the SERPs, and these updates happen very frequently.

How to know how dependent your site is on Google:

The short answer is that you are better off not depending largely on Google for traffic. It is worth analyzing the site with Google Analytics to see how much traffic comes through the Google search engine. There you can note traffic from Google sources such as Google Base, Google AdWords, Google Maps, and more, which makes it simple to compare traffic from Google with Yahoo, Bing, and other sources.
This evaluation helps you determine which search engine or source you depend on most. If it is Google, you need to take steps to protect yourself from updates; you should also identify the sources that bring you the most traffic after Google and concentrate on them.

Suggestions to receive more traffic ‘apart’ from Google:

The best option for any online business is to find other sources and promote through them as heavily as through Google, to protect traffic and sales. Optimize your site properly so you no longer need to worry about every Google algorithm update or refresh. Most major online businesses have started diversifying because they don't want to risk their online presence on one channel. Some examples of promotion beyond Google: build mobile applications, pursue social media traffic, and cultivate word of mouth, that is, promotion through person-to-person conversation, which is another major source of traffic.
So the best option is to advertise and promote on social media, through which you can earn traffic, customers, and sales again without depending so much on Google traffic.

Site Architecture & Search Engine Ranking Factors

Website architecture plays a key role among search engine ranking factors; it is categorized as an on-page optimization factor. The right site architecture or structure can help your search engine optimization efforts.

Site Crawlability:

Search engine crawlers move from one page to another incredibly quickly, acting like hyperactive text scanners. They make copies of website pages and store them in the search engine's database. This process is called indexing; in plain terms, the index is like a big book of the web.

When anyone searches for a keyword or query, the search engine flips through the big book it has already built in its database. It finds the indexed pages, works out which ones are closest to the search keyword, and shows the best ones first, in priority order.

Generally, most websites don't have crawling issues, but some things can cause problems.

For example, links hidden inside JavaScript or Flash can make pages invisible to search engines.

Most of these problems can be easily avoided. A good practice is to make use of sitemaps (HTML and XML sitemaps).

Remember one important point that “search engine friendly design (structure)” is also “human friendly design (structure)”.

Site Speed:

Google and other search engines want to make the web a faster place, so they have declared that fast websites get a ranking boost over slower ones.

Boosting your website's speed isn't a guaranteed express ride to the top of Google and other search engine results; it is a minor factor.

Are Your URLs Descriptive?

Having your target keywords in your domain name or internal page URLs can help your search engine ranking prospects, but it is not a major factor.

Some Important Site Structure Points:

Given below are some important site structure points in terms of search engine ranking factors.

1. Site crawlability.
2. Target keywords in the domain name or internal page URLs.
3. Easy navigation from one page to another.
4. Sitemaps (HTML and XML).
5. Website speed (quick loading time).
6. Descriptive URLs.
7. Internal navigation.
8. No JavaScript or Flash pitfalls.
9. Site structure based on visitor language.
10. Site navigation integrated everywhere (home, about, contact, product pages).
11. Content hierarchy.

Six Common Google Adwords Mistakes

In online advertising, Google AdWords is the first port of call. As the number of AdWords users grows, the resulting competition on the same keywords makes good advertising more difficult.
Below are the most common mistakes advertisers make with their AdWords campaigns.

1. Not setting up individual ad groups: This is the most common mistake. When creating a campaign, focus on relevance. The best approach is to group closely related keywords into one ad group, which gives you more control over the campaign.
For example, if you run an online mobile shop, you will target many keywords related to mobile phones, like 'buy mobiles online', 'online mobile store', and more; each theme deserves its own ad group.

2. Writing a 'one size fits all' advert: Here too, relevance is important. When writing ads for your ad groups, write two or more relevant ads per ad group; a single ad for the whole campaign may not attract relevant traffic.
In the example above, if someone searches for 'mobile broadband' and clicks your generic ad, you get traffic but not useful traffic. Even a single word's difference can change an ad's performance significantly, so write different ads and compare which works better.

3. Targeting the wrong keywords: People don't always search with the same keywords, and your targeted keywords may change over time. You can use the Google Keyword Tool as a guide to find targeted keywords.
But don't use every keyword Google recommends: clicks aren't free, and the more keywords you bid on, the more you pay. Choose the right keywords to get targeted traffic.

4. Lack of monitoring: Creating a campaign is not enough. Check daily which keywords are doing well and adjust your bids based on keyword performance. A campaign left alone may produce no traffic or sales.

5. Not having any tracking in place: Whenever you do paid marketing, ROI is an important measure, so you need conversion tracking to measure your results. Every campaign should have conversion tracking in place, and it is very easy to set up in Google AdWords. Without conversion tracking, you cannot judge the effectiveness of a campaign.

6. Getting caught up in a bidding war: In PPC advertising, bidding wars favor only the deep-pocketed. It is not profitable to bid high on a keyword that is not worth the cost. Some search engines provide features like 'autobid', which automatically raises your bid on a keyword to maintain a position. It is better to avoid getting caught up in bidding wars over particular keywords.

Tuesday, January 29, 2013

Latest On Page Optimization Strategies

On-page optimization plays a major role in SEO. It is one of the important factors affecting a website's position in organic search results, and it should be every webmaster's very first step: if you do on-page optimization correctly, your site can rank better in search results where visitors will actually see it.

The most important on-page optimization factors are content optimization, title tags, meta tags, header tags, alt and title attributes, and anchor text links. By placing keywords in the title and description tags, you help Google understand what the page is about. But although these are the major on-page factors, there is still more you can do to rank better in search results.

User Intent

The first thing to think about is which keywords you are optimizing for and why you want to rank for them. Target your keywords with users in mind: consider what a user is searching for and which page you are delivering in response. Once you understand user intent, you can satisfy the user by answering the query with the right page.

Content Keyword Optimization

Exact-match keywords in content used to be what mattered, but lately Google has become strict about exact-match optimization. You need to focus more on the keyword's theme than on keyword instances: target closely related concepts and search out appropriate keyword variations. This helps Google understand that your page is relevant to the targeted keyword.

Optimize Search Listings

Ranking at the top of the search results is not enough; the listing has to sell itself. It is not just the keyword that shows in the SERP but everything in the title, URL, and description. There are many ways to develop your search engine listing; things like a video XML sitemap can make a listing more informative and more clickable.


When you rank well in the SERP listings, you can take the opportunity to highlight your brand. When visitors look at your site and find good content, they may become interested in your brand. Later, when searching for something related, they may recognize your site among the other results and be more likely to click on it. In this way you improve with users as well as with search engines.

Make it Shareable

Good content is naturally shareable. If people like your webpage, give them a chance to share it with others. These days social media has a growing impact on search engines, and sharing is how your content travels around the globe.

Page Speed

Page speed is also a ranking factor. When your pages take a long time to load because of images or server problems, visitors lose interest, leave the page, and go elsewhere. You can check your site's performance in Google Webmaster Tools.

SEO Tips for E-Commerce Websites

E-commerce websites provide information about products, and the main aim of an e-commerce website is to get the visitor to buy its products and services. Most e-commerce sites are conceptually identical, and each has the ability to perform well in search results and rapidly increase sales through more customers. An e-commerce website needs to be user friendly, which makes crawling and indexing easier; its SEO success ultimately depends on its visitors.

The following are the tips to improve e-commerce seo efforts:

1. Keyword Research

Product pages are the most important pages of an e-commerce website. Keyword research plays a vital role here, since people may reach a product page by searching for a specific brand of product. Start by finding appropriate keywords to target, along with their variations, and understand the search volume for each. Choose the right keywords and target them at the right pages; if a page is relevant to its keyword, that leads to higher conversions and sales.

2. Product Descriptions
Informative and unique product descriptions contribute to the success of an e-commerce business. Provide both a description and related images for every product. Avoid using the description provided by the manufacturer or copying from another website, as the duplicate content could lead to a Google penalty for the page.

3. Product Reviews
Encourage product reviews and testimonials on your site; they build the trust needed to complete a purchase, and online shoppers read product reviews while researching products and making purchase decisions. A further benefit of reviews is that user-generated content is added to the product pages, which can improve your search engine rankings.

4. Optimize In-Demand Pages First
Because e-commerce websites have numerous product pages, optimizing every page takes a great deal of time. Give the popular product pages priority and optimize them first.

5. Develop a Secure Site That Is Easy to Navigate
If your site is neither secure nor easy to navigate, you won't get any sales. It should be easy for people to find the page or product they are looking for. Make sure each page is clear and has a navigation bar so that visitors can shop online securely.

Link Building Tools for Tracking Inbound Links

Quality inbound links are essential elements of SEO for increasing website traffic and online sales. If you have relevant and authoritative links, your website will get higher rankings and more traffic.

Tools for Tracking Inbound Links:

Majestic SEO: Majestic SEO is a backlink checking tool that makes a large amount of data accessible to webmasters and marketers. It lets you run a report for free, but a premium account is needed to track competitor websites. Majestic SEO introduced two metrics to measure the quality of a domain, citation flow and trust flow, both updated daily. The tool is useful in many ways, including link building, competitor analysis, and SEO services.

SEOmoz Linkscape: SEOmoz's tracking tool ranks a page by the number of high-quality links pointing to it, and it also reports how many different domains link to your webpage. It comes in two versions, free and advanced. You can identify quality links to your site as well as to your competitors', judge which keywords competitors are targeting in their anchor text, and identify competitors' quality backlinks.

BuzzStream: BuzzStream makes the link building process efficient. It helps you identify the contact information and backlinks of websites, saving link builders the time otherwise spent hunting down email addresses and details for other sites. By placing a site on the BuzzStream dashboard, you get information on factors like page rank, links, and contact details. BuzzStream also keeps track of the link requests you have sent.

HubSpot Website Grader: Website Grader is a free marketing tool that rates websites based on inbound links, traffic, popularity, and a few other factors. It also provides tips for improving your website in terms of internet marketing, helping you generate more online sales.

Open Site Explorer: Open Site Explorer, built by SEOmoz, has free and paid options. It gives an accurate picture of the inbound links to your site and reports the page and domain authority of websites. You can use it to check how many inbound links your site and your competitors' sites have; it provides good data such as the total number of backlinks, domain and page authority, and the number of sites linking to your domain and subdomains.

8 Reasons Why Your Site Might Not Get Indexed

When a new website is launched, the first thing to check is whether the search engines have indexed it. If they haven't, your site will not appear in the search results.
So, let's check.

Identifying Crawling Problems 

The very first step is to type site:yourdomain.com into the search box and see how many of your pages are indexed. If Google shows far fewer pages than your site actually has, your site has a problem.
The second step is to look at the Google Webmaster Tools report, where any crawl errors on the site are displayed in the dashboard. A 404 error means a linked page cannot be found; a 301 means the page permanently redirects, so the original URL is no longer what your visitors see.

Fixing Crawling Errors

Crawling issues are typically caused by the following:


Robots.txt:

Robots.txt is a text file that tells search engine spiders/bots whether they may crawl or index a website. It lives in the root of your website. The rules in a robots.txt file apply to specific bots, allowing or hiding particular directories or files.

Structure of Robots.txt file

User-agent: *
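Building on the User-agent line above, a complete (and purely hypothetical) robots.txt might look like this; the disallowed paths are placeholders:

```
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

Sitemap: http://www.example.com/sitemap.xml
```

Be careful: a lone "Disallow: /" under "User-agent: *" blocks every compliant bot from the entire site, which is a common reason pages never get indexed.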


.htaccess:

The .htaccess file is a configuration file used by web servers to control features of the server. It lets you control and redirect pages easily, giving you fine control over your site; keeping a saved copy of your .htaccess file also helps you recover from a site crash.
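For example, a couple of hypothetical .htaccess lines (the paths and domain are placeholders) can 301-redirect a moved page and serve a proper error page for missing URLs:

```
# Permanently (301) redirect an old URL to its new location
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Serve a custom page, with a real 404 status, for missing URLs
ErrorDocument 404 /not-found.html
```

These are standard Apache directives; a misconfigured redirect rule here can just as easily block crawlers as help them, so test changes carefully.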

Meta Tags:

Meta tags are a special set of HTML tags that contain information about the webpage and its contents. Search engine robots are programmed to read a page's meta tags; the main ones spiders read are the description and keywords meta tags. If a page is not getting indexed, check the meta tags in its source code.
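As an illustration, a page's head might carry meta tags like these (all the values are placeholders); the robots tag deserves special attention, because a stray noindex is a frequent cause of missing pages:

```html
<head>
  <title>Example Page Title</title>
  <meta name="description" content="A short summary of what this page offers.">
  <meta name="keywords" content="example, sample, keywords">
  <!-- If this tag is present, the page will NOT be indexed: -->
  <meta name="robots" content="noindex, nofollow">
</head>
```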

Site Maps:

A sitemap is an XML file that lays out the structure of the website's links, through which a visitor can navigate to any part of the site. Try to link to the sitemap from all important pages of the website, including the homepage, so that it can be easily detected by the search engines and more Google PageRank can flow to it.
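A minimal sitemap file simply lists one entry per URL, in the standard sitemaps.org format (example.com and the date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
  </url>
</urlset>
```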

Poor Internal Linking Structure:

This is a very rare case, but one must be aware that a poor internal linking structure can cause indexing problems. A good internal linking structure contributes to a website's success.

Page Rank

The number of pages Google crawls on your site is related to your PageRank: higher-ranked sites tend to be crawled more deeply and more often.

Connectivity or DNS Issues:

Whenever the site is under maintenance or has moved to a new home, DNS or connectivity problems can temporarily block crawler access.

Keyword Rank is Still Important

Keywords play the key role when you perform a search on a search engine; by some estimates, a quarter of the search algorithm depends on keywords alone. So a smart business owner must be able to choose the keywords that are valuable to his or her business.

Having a way to separate keywords by their rankings is important for a business. Every page should contain the relevant terms that will bring search engine traffic, because it is on those keywords that your page will be displayed in the search results.

The Google AdWords Keyword Tool is the best tool for finding keywords to target. Using it, you can easily determine which keywords you have to target.

Important factors to keep in mind while you are doing keyword research:

Relevancy: The keyword must be relevant to your service. This brings the right people to click through to your website.

Traffic: Choose keywords that many people actually enter into the search engines; these generate good traffic.

Competition: Look at the keywords of other sites that rank well in the search engine result pages.

Commerciality: Choose keywords with commercial value; tools such as Market Samurai report commerciality alongside keyword rank.

High-commerciality keywords generate larger profits than low-commerciality keywords.

It is better to write a website's content after doing the keyword research; that is how you improve keyword rank, whereas without it you may or may not improve at all. You can check your keyword rankings from your Google Analytics account.

Ways To Build High Quality Backlinks To Your Website

Backlinks play a key role in SEO success. A link on another site's page that points to a page on your website is termed a backlink. The more quality backlinks you have, the more traffic you will get, and this will boost your ranking.

While building backlinks to your site, focus on quality, not quantity. The best quality backlinks have your target keyword in the anchor text of the link, and they must be Penguin friendly. From an SEO point of view, building high-quality backlinks is what matters most.

Steps to build high quality one way backlinks:

Article Marketing: Article marketing is an effective way to build quality backlinks to your site. Submit quality articles to article sites such as Ezine, HubPages, and Squidoo to boost your traffic. If you provide quality information, other sites may link to your article, which in turn gives you more backlinks.

Social Bookmarking & Networking Sites: Nowadays, social bookmarking and networking sites have become one of the best ways to build inbound links. High-page-rank sites like Digg, Tumblr, StumbleUpon, Reddit, and MySpace let you get more quality backlinks to your website or blog.

Press Release: A press release is a good way to get backlinks to your site. Write press releases and distribute them effectively to press release distribution sites. Always use target keywords in the title, because that boosts your traffic, and place anchor text in the body of the content. The information must be newsworthy and effective; add images and videos that relate to your product. Also place a link to your website in the body of the content, because some people may scrape your content, and the link travels with it.

Guest Posting: A website can earn quality backlinks from one or two guest posts per month. Make sure the content is effective, so that you get the backlinks you deserve. Contact other blog owners and ask whether they accept guest posts on their sites; even if a website has no information about guest posting, simply send them an email.

Blog Commenting: PostRank is a tool for finding high-traffic blogs related to your field. Keep comments brief and effective, and post only relevant comments, so that the blog's writer may visit your blog, through which you can get quality backlinks. Even so, it is not advisable to rely on blog commenting alone.

Document Sharing Sites: This is a great way to get quality backlinks to your site. Write and submit ebooks to document sharing sites, and link back to your site from them. The major search engines trust many document sharing sites, which helps those links count as quality links.

Web 2.0 Sites: Exchanging links with Web 2.0 sites is a trustworthy way of building backlinks to your site, and it helps you rank with the help of your target keywords.

Internet marketing in general is a good way to get quality backlinks: it promotes your site and boosts your ranking. Placing your site in directories like DMOZ and Yahoo will build a few quality backlinks for free.

Few steps to watch out for:

Write unique content for your descriptions and titles. This avoids duplicate content; likewise, avoid using the same anchor text in every backlink.

Seek backlinks from high-page-rank websites, not from low-value sites.

Look for websites that mark your backlinks as dofollow; nofollow backlinks have less value than dofollow links. Above all, links must be natural.

Anchor Text Optimization

Anchor text is the visible, clickable text of a hyperlink that takes the reader from one webpage to another. It usually appears as blue, underlined characters. Anchor text can be a group of characters, words, or even an image acting as the clickable element.
For example: in a sentence where Google is the anchor text, a user who clicks on it is directed to Google's homepage.

In HTML, hyperlinks are created using the href attribute. Href stands for 'hypertext reference' and holds the link's destination.
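For instance, the Google link mentioned above would be written as:

```html
<a href="http://www.google.com">Google</a>
```

Here 'Google' is the anchor text, and the href attribute holds the destination the click leads to.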


Anchor text optimization: This is one of the major SEO techniques; it can move a page to a better ranking by using keyword-rich text in a specific hyperlink. In this optimization, the content of the landing page should be closely relevant to the keyword or phrase in the anchor text, so that search engines give the link weight and the page's position improves.

Anchor text optimization can be done in two ways, internally and externally. Internal optimization means optimizing the anchor text of links that point to a particular page from other pages within the same website. External optimization means optimizing the anchor text of inbound links from other sites that link back to your website.

A hyperlink helps users reach relevant content by pointing to another webpage for their search, and anchor text is one of the major signals search engines use to rank a page. Sometimes anchor text is misused: instead of landing on the right page, the user is directed to some irrelevant page that is of no use. This makes the page rank lower, and traffic decreases in turn.

To improve page ranking, create some deep links to inner pages as well as hyperlinks to the homepage. Use relevant keywords and phrases in anchor text so that a reader can understand it easily; use a short description rather than a lengthy one, and use different synonyms for a keyword instead of repeating one. Quality backlinks move a page's rank to a better position in the search engines.

Why Do Google Algorithms Not Re-index Pages?

There are reasons why, and times when, Google's algorithms do not index your content. One reported issue is a URL index rate that keeps going down instead of growing, even after the URLs are re-submitted in Webmaster Tools.

The best way to check how many pages of your site Google is indexing is to compare the number of URLs submitted (by uploading an XML sitemap file in Webmaster Tools) with the number of URLs indexed. If the submitted count is roughly equal to the indexed count, that is a good indication; if the indexed count grows continuously, that is an admirable signal. But if the indexed count goes down, there may be a problem: the issues need to be checked and corrected, and the sitemaps resubmitted.
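As a rough sketch of the "submitted" side of that comparison, the snippet below counts the URLs listed in a sitemap, so the total can be set against the indexed count shown in Webmaster Tools (the two-URL sample sitemap is made up):

```python
import xml.etree.ElementTree as ET

# The standard sitemaps.org namespace used inside sitemap files.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(sitemap_xml):
    """Return how many URLs a sitemap XML string submits."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall(NS + "url/" + NS + "loc"))

# A made-up two-URL sitemap for demonstration.
sample = (
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    "<url><loc>http://www.example.com/</loc></url>"
    "<url><loc>http://www.example.com/about</loc></url>"
    "</urlset>"
)

print(count_sitemap_urls(sample))  # 2
```

If this number is much larger than the indexed count Webmaster Tools reports, that is the signal to start looking for canonicalization or content problems.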

Where is the problem?

The problem is that whenever the search algorithms are improved, Google may decide not to re-index pages that are not very useful to users. Pages that were once in the index may drop out after the next algorithm update because they are pages without real content, or have some other issue. There are also pages with soft 404 errors, error pages that nevertheless return a 200 status code; being error pages, they should not be indexed, and the same goes for empty pages.
Another issue observed by many webmasters is a drop in the number of pages reported for the sitemap file in Webmaster Tools. This drop happens when the sitemap references URLs that are not canonical; once the canonical issue is corrected, the pages will be indexed, provided no other issue arises with them.
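One way to hunt for such soft 404s is to flag pages that return a 200 status but carry error-page content. The sketch below uses an entirely made-up phrase list; a real check would be tuned to the site's own error wording:

```python
def looks_like_soft_404(status_code, body):
    """Heuristic: a 'soft 404' returns 200 but carries error-page content.

    The phrase list here is a made-up example, not a standard.
    """
    if status_code != 200:
        return False  # real error codes are already handled properly
    text = body.strip().lower()
    return text == "" or "page not found" in text or "no longer exists" in text

print(looks_like_soft_404(200, "Sorry, page not found."))   # True
print(looks_like_soft_404(404, "Sorry, page not found."))   # False
print(looks_like_soft_404(200, "Welcome to our catalogue.")) # False
```

Pages the heuristic flags should be fixed to return a genuine 404 (or 410) status so Google stops treating them as indexable content.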

How to fix the URL index count?

To increase the URL index count, fix the canonicalization of the URLs in the sitemap and set a proper status code for page-not-found errors. After these issues are fixed, resubmit the sitemap so that the URLs are indexed in Google.

Tuesday, January 8, 2013

5 Facebook Marketing Resources You Didn't Know About

A few years ago, many of us had doubts about whether Facebook would be able to attract marketers willing to spend a notable amount of time and money on its platform. That is remarkably not the case these days.

According to a report put out this year, the share of marketers who say Facebook is critical or important to their business has risen to 75%, up from 42% just a few years ago. This increase has led them to expect Facebook to offer resources that help companies market more successfully.

If you aim to get more traffic and sales for your website through Facebook in particular, here are five pages and services worth exploring for your business.

Facebook Marketing Page

Facebook has an actively managed page that helps promoters use Facebook more efficiently. Its team often posts information about web conferences held for marketers, answers specific questions posted to the wall, and shares analyses that lead to more Facebook resources.

Facebook Advertising Page

If you are trying your hand at Facebook ads, you will notice that there aren't a ton of resources to help you get set up and operating smoothly. Facebook does provide help documentation that outlines how different things work and answers the top questions, and contact information is available for the questions you can't quite figure out.

Facebook Business Page

This page focuses not just on how you place ads and spend money with Facebook; it was created to hold information that helps you get the most out of Facebook as a business. The page covers topics like building a page and a conversation, and gives tips on how to hold on to your audience and friends.

Facebook Studio

This Facebook service pulls resources together for marketers. Here you will find a gallery of creative Facebook campaigns, outstanding campaigns to explore, and a directory of agencies that excel at Facebook marketing campaigns.

Facebook Studio Edge

This is a new resource for those looking to take Facebook campaign management in house. It is an online course that helps marketers stay updated on resources, research, measurement, and other useful tools for Facebook marketing. Studio Edge covers topics relevant to a number of business audiences, from creative teams to media planners and technologists.

An Updated Guide to Google Webmaster Tools

Google Webmaster Tools is a widely used web interface provided by Google at no cost. Webmasters use it to check indexing and to optimize the visibility of their websites. Let's take a look at every tool in GWT.

Webmaster Tools Home:

When you first log in to Webmaster Tools, you see a dashboard listing all the websites in your account, along with links to view all messages, preferences, 'Author Stats' (Labs), and a few miscellaneous links under 'Other Resources'.

All Messages:

Google rarely communicates with webmasters about their sites through messages, though in 2012 some probably wished it communicated a little less, given the number of "love notes" many SEOs received. You may see a message here if:
  • Google thinks your account might have been hacked
  • Google noticed unnatural links pointing to your site
  • Google thinks some links pointing to your site may be using techniques outside Google's Webmaster Guidelines
Under the "Preferences" tab, you can also set e-mail notifications to 'only important' or 'all messages'.

Labs - Author Stats:

Since authorship is not tied to a single domain, Google reports author stats for individual sites as well as across all the sites you write for. You need a valid author profile (go Google+!) to check the stats. The stats are quite interesting, and also good for verifying which URLs featuring your ugly mug are earning which positions in the SERPs.

Other Resources - Rich Snippets/Structured Data:

If you have not used the Rich Snippets Testing Tool, now known as the "structured data" testing tool, bookmark it now. With it you can test your website's URLs to check whether your author profile is linked correctly.

You can also use the rich snippets testing tool to check if you've setup or verified your:
  • Author Page
  • Name
  • Google+ Page as a Publisher
  • Any structured data found (reviews, products, song titles, etc.) in the form of microdata, microformats, or RDFa

Specific Site Dashboard in Google Webmaster Tools:

Once you have selected a site after logging in, you see the full search tools and details. The site-specific dashboard has a nice widget showing:
  • Crawl Errors
  • URL Errors
  • Site Errors
  • Health status of DNS, Server Connectivity & Robots.txt
  • Overview of Queries
  • Sitemaps
Five major sections appear once you have selected a site: 'Configuration', 'Health', 'Traffic', 'Optimization', and 'Labs'. The most insightful data is found in the 'Health' and 'Traffic' sections; check it against what you can get inside Google Analytics.

The 'Configuration' Section:


Settings

In Settings you can target your website to a certain country, select a preferred domain (www or non-www), and, if you choose, limit Googlebot's crawl rate.


Sitelinks

Sitelinks are automatically chosen by Google and displayed below your main URL for brand-related queries. If you do not want a particular URL shown as a sitelink, you can demote it, and Google will no longer show that URL among the sitelinks.

URL Parameters

If your site has duplicate content problems caused by variables/parameters in your URLs, you can restrict Google from crawling them with the URL parameter tool. Unless you're sure about what you're restricting, don't play with the settings here!

Change of Address

If you want to move your site to a new address, do a 301 redirect first, and then use this tool to make sure Google knows about the new URL.


Users

There are three types of users: site owner, full user, and restricted user.

Site Owner: The site owner has complete control over the site; he can view all the site data and perform all site actions.

Full User: Full users can view all data and take most actions on the site, but they cannot add or remove owners or users.
Restricted User: Restricted users can only view data and take a few actions. Restricted access is good for clients to whom you want to give mostly view-only access.


Associates

If someone is able to act on behalf of your site, that person can be added as an associate. For example, this setting is how members of YouTube's Partner Program (probably not you) link their YouTube channel with your Webmaster Tools account.

Add an Associate:

  1. To add an associate, you must be a site owner.
  2. On the Webmaster Tools home page, click the site you want.
  3. Under Configuration, click Associates.
  4. Click Add a new associate.
  5. In the text box, type the email address of the associate you want to add.
  6. Select the type of association you want.
  7. Click Add.

The 'Health' Section

Crawl Errors:

Crawl Errors provides details about the URLs on your site that Google is unable to crawl successfully or that return an HTTP error code. It lists two types of errors:
Site errors: errors that prevent Googlebot from connecting to your site at all.
URL errors: errors that block Googlebot from accessing specific URLs on your website.

Crawl Stats:

The Crawl Stats report shows Googlebot's activity on your website over the last 90 days. The stats cover all the content types Google downloads, including CSS, JavaScript, Flash, PDF files, and images, and they include AdSense fetches along with fetches for other Google products such as Google Images, Google News, and Google Scholar.

Blocked URLs, Fetch as Googlebot & Submit to Index

Fetch as Googlebot lets you see a page as Google's spider "sees" the URL you submit. This tool is useful for finding hacked content as well as for looking at your site the way Google does; it's a good place to start an SEO audit.

This year brought a new feature called "Submit to Index". Have you ever changed a title tag and wished Google would update its index faster to get those changes live? 'Submit to Index' can do that: you can submit pages for near-real-time re-indexing up to 50 times per month. Very useful for testing on-page changes.

Index Status:

Under the Health section, make sure to hit the 'Advanced' button, where you can see all the interesting index stats Google reports about your site. Keep an eye on the 'Not Selected' number: if it is rising, Google may not be viewing your content favorably, or you might have a duplicate content issue.


Malware:

If Google has detected malware on your website, the information will show here. Pay close attention to the message boxes; Google will also send messages if malware is detected.

The 'Traffic' Section

Search Queries:

This helps you find keywords for which your site appears in the Google results pages but is not clicked. (More query data is available in the Google Analytics interface.) Focus on the CTR% of your search queries: if it is low, your meta descriptions and title tags are not doing their job, so consider adding structured data or a verified author to help click-through rates.

Links to your site:

This provides the top 1,000 domains that link to pages on your site. You can download the links via "Download more sample links" and "Download latest links"; the latter is the more useful of the two. Note that the domain list alone does not tell you which pages those links come from.

Internal Links:

Use this tool to diagnose which of your pages receive the most internal links.

The 'Optimization' Section


Sitemaps:

Here we get a report on each sitemap submitted for the site. Google scans the sitemaps and reports any errors that occurred during its crawl.

Remove URLs:

You can request the removal of URLs that you do not want shown in Google search results. Only authorized members can do this.

HTML Improvements:

It lists the URLs on the site whose title tags are missing, duplicated, or too long, or whose meta descriptions are too short. Click on an issue to see the URLs you have to improve.

Content Keywords:

It lists the keywords Google associates with your site. If no spam keywords appear here, your site is in good shape.

Structured Data:

It shows the structured data Google was able to detect on your site, along with the number of URLs carrying it. It is useful for verifying that all the pages you think are marked up are marked up correctly.

The 'Labs' Section

Custom Search:

If you want to build your own search engine, you can do so with Custom Search. To get results from particular sites, Custom Search offers two options: the standard edition, which shows result pages with ads, and Site Search, which shows them without ads.

Instant Previews:

It shows a preview of your pages as they appear in the Google results pages. Enter any URL to see its preview, or leave the field blank to preview the homepage.

Monday, January 7, 2013

Unnatural Links Notifications & Over-Optimization

Google announced that websites using over-optimization techniques will be penalized. The result is a drop in ranking positions in the Google search results, with or without notification, when link-related over-optimization violations are detected. Google rolled out the first Penguin update on April 24, 2012 to identify web spam; this algorithm change lowered the rankings of sites that violate Google's Webmaster Guidelines. Black-hat web spam should be avoided in order to keep a quality backlink profile. Identify your backlinks and remove the low-quality links before an over-optimization penalty hits.

Unnatural links come from public backlink networks, junk content posts, link exchanges, profile link spam, large numbers of links from blogs/forums, single-post blogs, links from irrelevant sites, large numbers of top-level-domain links, comment link spam, high volumes of directory links, etc.

The following are the link-related over-optimization cases:

Excessive Link Acquisition Check

Acquiring links at an extremely high rate is not a good practice. Re-evaluate your link building strategy and avoid over-optimization. Link acquisition velocity can be checked using Ahrefs or Majestic SEO.

How to Check for Site-wide Links

A high number of site-wide links, such as header, footer and sidebar links, can look like over-optimization, and the site can be penalized, though some occur naturally. Webmaster Tools is one of the quickest ways to check your website for site-wide links. Domains that link to you several times should be flagged as potential site-wide links and checked manually. Third-party services such as Majestic SEO, Open Site Explorer and Ahrefs are also worth using, since they still report links that Google has already removed from its index.

How to Check for Unnatural Anchor Text Distribution

If most of the links to your website use the same anchor text, it looks over-optimized and invites a Google penalty. It is better to develop a natural link profile by varying your anchor text. You can get anchor text data from sources like Majestic SEO and Open Site Explorer.

To filter the data, the following should be removed:

Dead Links: these are sites that linked to you in the past but no longer do. Third-party tools are useful for filtering out dead links.

Deindexed Linking Root Domains: Identify the linking root domains which have been deindexed.

No Follow Links: nofollow links pass no authority, so they should be filtered out of the analysis.

Unnatural Spread of Links Authority

An unnatural spread of link authority is seen on websites that buy links. Unnatural links include link schemes and paid links, and such profiles come from excessive link building. These unnatural links should be removed from the backlink profile, or a reconsideration request submitted if they cannot be deleted. Too many low-authority links lead to over-optimization.

Exact Match Domain Update

What is EMD Update?

The EMD Update is a Google algorithm update that targets “exact match domains”. Its main purpose is to filter out sites that use words matching search terms in their domain names.

How Does the EMD Update Work?

Even though the EMD update aims to filter sites with exact match domains, some sites are not affected by it. The main reason is that those sites maintain quality content. For example, there are plenty of sites with domain names such as:
                  • for “usedcars”
                  • for “movie”
                  • for “recipes”
There are plenty of sites with search terms in their domain names, but the EMD update affects sites with more generic keywords in their domains, since those keywords help them rank high in the SERPs. An example is Online-, which has a long-tail domain name. The EMD update targets a site when it lacks content (termed “parked”), when the content is copied (termed “scraped”), or when the content on the site is spun. This means EMD domains as such are not being targeted; EMD domains with low-quality content are.

Types of EMD’s:

There are some factors that help in identifying quality domains under the EMD update:
  • TLD extension
  • Number of keywords: it is better to minimize the number of keywords in the URL. A good rule of thumb is at most 3 words for a .com and 2 words for a .org.
  • Hyphen or non-hyphen: domains with more than a single hyphen are treated as spam. Even single-hyphen domains look dated and have fewer advantages than a non-keyword domain, so it is better to avoid hyphenated domains.
  • Stop words: domains that include “stop words” qualify only as “partial match domains”, not exact match domains.

Some of the Best EMD Domain Practices:

  1. It is worth spending some money to buy a good domain name.
  2. Skip low-value TLDs such as .info, .mobi, .travel, etc.
  3. Prefer non-hyphenated domains, or at most a single hyphen.
  4. Minimize the number of keywords in the URL.
  5. It is good to build a brand-name keyword domain.
  6. Another good idea is to build a geo-local EMD, as they offer lower barriers to entry.

Advantages and Disadvantages of EMDs:

Advantages:
  1. A startup site can gain traffic relevant to long-tail queries.
  2. It can easily get targeted anchor text and social mentions with keywords.
  3. It can dominate a single niche.
  4. Variations in long-tail keywords help in targeting.
  5. Brand mentions and keyword mentions are the same.
  6. It is effective for generic commercial-intent queries and local search.
  7. It is effective for targeting a single niche with a well-built micro site.
  8. It is a good approach for businesses with limited keyword sets.


Disadvantages:

  • One of the main drawbacks is that it limits future brand expansion and creates “brand confusion”.
  • “Credit” is not the same for brand mentions, which register as “generic”.
  • It is harder to claim social media profiles and associate them with brand mentions.
  • There is a high chance of over-optimization.

Google Panda and Penguin Updates On Website

The Google Panda algorithm was rolled out on February 24th, 2011 to penalize low-quality sites while giving better rankings to high-quality sites and placing them at the top of the search results. Website owners disliked that, and even after making modifications to their sites, Google did not re-index them. Google also released a newer algorithm, known as Google Penguin, on April 24th, 2012. The main aim of the Penguin update is to lower the ranking of websites that violate Google's webmaster quality guidelines and to display high-quality sites at the top of search results.

Steps to Prevent Website From Panda Update:

  • Duplicate Content: if your site contains any duplicate content, it has to be rewritten. You can use Copyscape to check for duplicate content.
  • Keyword Stuffing: include target keywords in the title tag and content in a way that doesn't look spammy. Pages with keyword stuffing can harm your website's ranking.
  • Unhelpful Ads: avoid irrelevant ads and make sure your site carries only limited, relevant ads.

Steps to Prevent Website from Penguin Update:

  1. Cloaking: an SEO technique that displays different content to users and to search engines. Cloaking must be avoided, as it is considered a spamdexing technique.
  2. Link Schemes: never use link schemes; such links can affect your search results, as they are a violation of Google’s guidelines.
  3. To become Penguin-friendly, create social media news releases, related blog posts and forum discussions.
  4. Penguin targets sites that use black hat SEO tactics and web spam.
  5. Anchor text plays a prominent role in website ranking.
  6. Create useful content with appropriate keywords and follow Google’s quality guidelines.
If your website is affected by Panda, it is not easy to recover: according to an SEO poll, only 13% of sites have recovered from Google Panda, while it is easier to get out of a Penguin penalty. Penguin 3 is the latest Penguin update, released on October 5th, 2012 and designed to target pages with many ads. Panda 20 is the latest Panda update, released on September 22nd, 2012.

Panda Update Vs Penguin Update

The main difference between Google Panda and Penguin is that Panda targets thin content and high bounce rates, whereas Penguin targets low-quality links, keyword stuffing and over-optimized anchor text. In short, the Panda update focuses on low-quality sites, while Penguin pushes websites to follow the Google webmaster guidelines and targets over-optimization.

Friday, January 4, 2013

Search Engines Ranking Factors 2013

Many people wonder which factors are relevant to ranking well for keywords in search engines. Ranking factors differ from one search engine to another; every search engine has its own.

Some Important Ranking Factors:

1. Social Media Signals Show Extremely High Correlation: likes, shares and tweets from social media sites like Facebook, Twitter and Google+ are considered good ranking signals in search engines.

2. Back Links: though the number of backlinks is a powerful ranking factor, getting high-quality backlinks matters more than quantity.

3. Brands Leverage Classic SEO Signals: web pages with strong brands need not be as concerned with areas like title tags, headings, descriptions and alt tags.

4. Targeted Keywords in Domain: rumors to the contrary notwithstanding, a domain containing the targeted keyword is more likely to appear in the top rankings.

5. Too Much Advertising is Detrimental: pages with too many advertisements require much greater effort to rank well. Do not run AdSense or ad blocks for things like porn sites, guns or illegal products on your website.

Google Search Ranking Factors:

In its early days, Google had an algorithm that was easy to understand. But Google now updates the algorithm every month, and it becomes more complicated with every update. Understanding the Google updates can help explain changes in rankings and organic website traffic; with each update, Google adjusts the weight of its basic factors. They are shown in the image below.

Google Ranking Factors

The 7 main basic factors of the Google search engine are:
  1. Backlinks (quality back links)
  2. Content (unique content)
  3. Authority
  4. Trust (visitor trust)
  5. Reviews (visitor reviews)
  7. Facebook Shares

Common Search Engine Factors

Nowadays, search engine factors are changing dramatically. Here are some basic factors common to search engines:
1. On-Site SEO: this includes targeted keyword selection, website architecture, internal linking and content optimization. Content optimization should be revisited whenever a search engine changes its factors in an algorithm update.

2. Off-Site Link Building: every search engine crawler can find out which sites link to your website, and this is how Google determines a web page's rank. A website with high page rank tends to have good-quality links. It is better to get 100 links from 100 different websites than 100 identical links from one site. Acquiring quality links on a regular basis is always advisable.

3. Content Quality: Google recently updated its algorithm with "Panda". With these updates, Google has penalized many websites for providing low-quality content. The content on your website must be of high quality, as content is the king of the website.

4. CTR: the Click-Through Rate is the percentage of visitors who click on a result or ad, i.e. clicks divided by impressions. If visitors then spend more time on your website, it can result in improved rankings.
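As a quick sketch of that arithmetic (the numbers below are invented for illustration):

```python
def click_through_rate(clicks, impressions):
    """CTR as a percentage: clicks divided by impressions, times 100."""
    if impressions == 0:
        return 0.0
    return 100.0 * clicks / impressions

# e.g. 30 clicks out of 1,500 impressions
print(click_through_rate(30, 1500))  # 2.0
```

So a listing shown 1,500 times and clicked 30 times has a CTR of 2%.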

5 Penguin Friendly Link Building Tips

Google launched an algorithm update called Penguin in April 2012, after which many websites' rankings fell. With the Penguin updates, spam sites were hit. If any black hat SEO techniques were used in the past, then even though your website seems safe and ranks at the top of the SERPs now, the results may gradually decrease.
Here are some link building tips, based on the Penguin update, that can improve your website's rankings.


1. Get More Fans from Social Media

One of the best ways to increase traffic is through social media. When new content is posted on your site or blog, share it on social media sites like Twitter, Facebook and Google+; as likes and followers increase, so does your traffic. Re-tweets and likes for your content also help your ranking in search engines.

2. Link To Inner Pages

A basic SEO mistake everyone makes is linking only to the homepage. The home page is the most important page of a site, yet the majority of backlinks should not be directed at it. The inner pages contain the real content, and most natural links point to inner pages.
Internal links are important for building the site structure around the right keywords and for linking to the right pages; this makes it easy for Googlebot and visitors to navigate the site.

3. Diversify Your Anchor Texts

Due to Google algorithm updates, over-optimized anchor text has become a search engine ranking problem. Anchor text is the clickable word or phrase in a hyperlink. When many links to a page all use the same keyword as anchor text, it causes a ranking problem.
Diversifying anchor text means varying the text that links to a particular page or website. Most companies link to their website using their company name as anchor text.
Prepare a list of the anchor texts linking to your website, which can be downloaded using Majestic SEO. When a single anchor text keyword accounts for more than 60 percent of your links, you should diversify with more general keyword phrases.
For example, you can use these keywords as anchor text:
  • Click here
  • Website
  • My company
  • Our company
  • The website
  • Learn more
  • Read more
  • Click details
  • Buy now
  • Contact us
  • More website
  • Official site
  • Visit our site
These are keywords that can be used to diversify your links.
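The 60-percent rule above can be checked with a short script once you have exported your anchor text list; a minimal sketch (the sample links are invented):

```python
from collections import Counter

def anchor_text_shares(anchors):
    """Return each anchor text's share of all links, as a percentage."""
    counts = Counter(anchors)
    total = len(anchors)
    return {text: 100.0 * n / total for text, n in counts.items()}

def over_optimized(anchors, threshold=60.0):
    """List the anchor texts whose share exceeds the threshold (default 60%)."""
    return [t for t, pct in anchor_text_shares(anchors).items() if pct > threshold]

# Hypothetical export: 7 of 10 links use the same keyword anchor
links = ["cheap widgets"] * 7 + ["click here", "our company", "read more"]
print(over_optimized(links))  # ['cheap widgets']
```

Any anchor text the script flags is a candidate for diversification with the more general phrases listed above.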

4. Focus on Quality not Quantity

The most important thing is to have quality links rather than a large quantity of links. Google penalizes websites with backlinks from unworthy sites. Stop mass link building techniques and accept the slow but steady results of careful link building.

5. Make Content Link Worthy

Producing link-worthy content is fundamental for a website: visitors should like the content, and it should be informative. If visitors don't like the content, they may leave your site, which raises the bounce rate and lowers your Google rankings.
When the content is informative and trustworthy, visitors stay longer on the site, and its rankings improve; this is a very good SEO technique.