Wednesday, July 24, 2013


   Your website's ranking is based in part on analysis of the sites that link to you. The quantity, quality, and relevance of those links all influence your ranking, and the linking sites provide context about your subject matter. Manipulating links to influence search results can harm the quality of your rankings and is considered a violation of Google's webmaster guidelines.

   The following are examples of link schemes that can negatively affect your website's ranking:

1. Buying or selling links that pass PageRank:

This includes paying for a directory listing, exchanging money or goods for links, or paying someone to write a blog post that links to you.

2. Excessive link exchanges (reciprocal links):

This is the "link to me and I'll link to you" scheme, typically carried out through links or resources pages.

3. Linking to web spammers or unrelated sites:

Do not link to spammy sites, such as those with keyword stuffing, malware, or hidden links.

4. Building partner pages solely for cross-linking.

   Some examples of unnatural links that violate Google's Webmaster Guidelines are:
  • Text ads that pass page rank.
  • Links inserted into article content with little relevance to the surrounding text.
  • Low quality links from directory sites and bookmarks.
  • Links embedded in widgets distributed across several websites; for example, a visitor counter widget reading "Visitors to this page: 1472" with an embedded "Car insurance" link.
  • Distributed footer links of various sites
  • Forum comments with optimized links in the signature; for example, a comment reading "Thanks, that's great info!" signed with links like "paul's pizza san diego pizza best pizza san diego".

A few other examples are:
  • Links from irrelevant sites
  • Links from Blogs
  • Links from article directories
   Your site can be hit by an unnatural-link penalty if you use techniques that violate Google's quality guidelines.

How to Prevent PageRank from Passing:

You can prevent PageRank from passing in a couple of ways:
a. Add a rel="nofollow" attribute to the <a> tag of advertising links.

b. Redirect such links through an intermediate page that is blocked from search engines via robots.txt.
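Both techniques above are easy to verify programmatically. As a minimal sketch (the HTML fragment and URLs are invented for illustration), this Python snippet uses the standard library's html.parser to list which links on a page pass PageRank, i.e. lack rel="nofollow":

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collects <a href=...> tags and whether each one passes PageRank."""
    def __init__(self):
        super().__init__()
        self.links = []  # list of (href, passes_pagerank)

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel may hold several space-separated tokens, e.g. "nofollow sponsored"
        rel = (attrs.get("rel") or "").lower().split()
        self.links.append((href, "nofollow" not in rel))

# Hypothetical page fragment for illustration:
html = '''
<p><a href="https://partner.example/deal" rel="nofollow">Ad link</a>
<a href="https://friend.example/post">Editorial link</a></p>
'''
auditor = LinkAuditor()
auditor.feed(html)
for href, passes in auditor.links:
    print(href, "passes PageRank" if passes else "nofollow")
```

Any advertising link reported as passing PageRank would be a candidate for a rel="nofollow" attribute or a robots.txt-blocked redirect.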

     Create unique content and earn relevant links from other sites, and popularity will follow. If your content is useful, readers are more likely to link to it, so ask whether the content truly benefits your visitors. The quantity of links matters less than their quality and relevance. Identify unnatural links and remove them carefully, taking care not to delete the natural, beneficial links that support your rankings.
   The preferred method is to build natural links without participating in any link schemes. Remove bad, spammy, and low-quality links; the goal is a healthy, natural link profile.

Google Penguin 2.0 Insights and Recovery Tips

Google Penguin 1.0, rolled out in April 2012, was one of Google's biggest algorithm updates. It hit a large number of sites with questionable link profiles, low-quality backlinks, keyword-rich anchor text, or over-optimization for a single term. Penguin 2.0 was released on May 22nd, so it is important to know how it affects sites and what steps to take if a site is hit by this algorithm.

Penguin History:

         Google's algorithms are designed and rolled out to give users better, more relevant search results. The Panda algorithm was designed to target sites with low-quality content, while the Penguin algorithm targets sites with unnatural, manipulative link profiles. It aims at sites whose links are:
  • Coming from low-quality sites
  • Irrelevant to the site's topic
  • Paid
  • Stuffed with keywords
  • Built on overly optimized anchor text

An Inside Look at Penguin 2.0:

There has been a great deal of speculation about whether Penguin 2.0 targets the same things as its earlier version. According to Matt Cutts's announcement, Penguin penalizes sites that use black-hat SEO techniques, while sites that give a good user experience rank well. This means sites that buy links, spam low-quality directories with links, or pay for mass blog comments will be hit, while sites that create quality content, build ethical links, and avoid shortcuts promising immediate results should improve in overall search rankings.
Penguin 2.0 goes deeper into a site to detect spam, evaluating inner pages and not just the home page. Penguin 1.0 was understood to evaluate only the home page's inbound link profile; now that internal pages are also analyzed, there is no escaping a Google penalty for a site using black-hat SEO techniques.
Google estimated that Penguin 2.0 influenced around 2.1 percent of English queries. Affected sites included those where numerous results from the same site dominated the first page, and search queries associated with spammy results also dropped in the rankings.

Ways to get Solid Rankings in Google search results:

If a site is affected by the Penguin 2.0 update, you need to understand and clean up its link profile. Follow these steps:

Step 1: Know what you are looking for

Conducting a focused analysis and recovery strategy is challenging, so before starting a link-profile audit, keep these points in mind:
  • Quality of the linking site: PageRank and Domain Authority are useful measures, along with the quality of the design, whether the content is authoritative and well written, and whether the site is active in blog comments and social media.
  • Relevance of the links: sites that link to yours should be related to your niche.
  • How quickly links were acquired: a sudden spike in the number of links risks a Google penalty if they were built unnaturally.

Step 2: Take a close look at the link profile

With a clear understanding of what to look for, you can dive in and take a closer look at the link profile. Conduct a link audit, taking the time to analyze which links should be kept and which should be removed.

Step 3: Pay special attention to anchor text

Analyze anchor text with special care. Keyword-rich anchor text can trigger a Penguin penalty, and the same anchor text should not appear on too many links.
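One quick check is to tally how often each anchor text appears in your backlink list. This Python sketch (the anchor list and the 30% warning threshold are invented for illustration) flags anchors that dominate the profile:

```python
from collections import Counter

# Hypothetical anchor texts pulled from a backlink report
anchors = [
    "cheap pizza san diego", "cheap pizza san diego", "cheap pizza san diego",
    "Paul's Pizza", "pizza delivery", "www.example.com", "click here",
]

counts = Counter(anchors)
total = len(anchors)
for anchor, n in counts.most_common():
    share = n / total
    # 30% is an arbitrary illustrative threshold, not a Google number
    flag = "  <-- suspiciously repetitive" if share > 0.3 else ""
    print(f"{anchor!r}: {n} ({share:.0%}){flag}")
```

A natural profile tends to mix branded anchors, bare URLs, and generic phrases; a single exact-match commercial phrase dominating the list is the pattern Penguin targets.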

Step 4: Use the disavow tool for offending links

Google's Disavow tool is the best option for dealing with questionable links you cannot get removed. After cleaning up what you can, submit the remaining spammy links through Webmaster Tools in a disavow file; Google applies it as the site is re-crawled.
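The disavow file itself is plain text, one entry per line: a `#` line is a comment, a `domain:` entry disavows every link from that domain, and a bare URL disavows a single page. A small sketch that assembles one (the domains and URL are placeholders):

```python
# Entries follow Google's disavow file format:
#   "# ..."            comment
#   "domain:site.com"  disavow every link from the domain
#   "http://..."       disavow a single page
spam_domains = ["spamdir1.example", "spamdir2.example"]  # placeholders
spam_pages = ["http://forum.example/thread?p=99"]         # placeholder

lines = ["# Disavow request - contacted owners 3x, no response"]
lines += [f"domain:{d}" for d in spam_domains]
lines += spam_pages

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print(open("disavow.txt").read())
```

The finished file is uploaded through the Disavow links page in Webmaster Tools, not submitted by any API call.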

Step 5: Concentrate on building Quality links

The last step is to build high-quality links and maintain a quality link profile: earn links from relevant, high-quality sites and feature diverse anchor text.

By following the steps above, you can rebuild the site's link profile and regain rankings in Google search results.

Google Confirms Panda Update Is Rolling Out: This One Is More “Finely Targeted”

The latest Panda update is rolling out, one that seems "softer" than the last updates; a large number of webmasters who were hit hard by Panda are now reporting recoveries.

Google has confirmed a Panda update is rolling out and this particular update is “more finely targeted.”

As you may recall, Google told us that new Panda refreshes would roll out about once a month over a roughly ten-day period. Matt Cutts did say the monthly Panda refresh had been delayed because Google wanted to release signals that would soften the algorithm a bit.

Google confirmed that a Panda update is being released and said:

In the last few days we've been pushing out a new Panda update that incorporates new signals so it can be more finely targeted.

This is despite Google informing us they are unlikely to confirm future Panda updates.

A large number of SEOs and webmasters are claiming recoveries. We certainly hope your site has recovered from the algorithm hit.

We are not entirely sure what number this Panda update is up to, but by my count I would label it Panda Update 26.

Here are all the releases so far for Panda:

Panda Update 1, Feb. 24, 2011 (11.8% of queries; formal declaration; English in US only)
Panda Update 2, April 11, 2011 (2% of queries; formal declaration; rolled out in English internationally)
Panda Update 3, May 10, 2011 (no change given; confirmed, not formal declaration)
Panda Update 4, June 16, 2011 (no change given; confirmed, not formal declaration)
Panda Update 5, July 23, 2011 (no change given; confirmed, not formal declaration)
Panda Update 6, Aug. 12, 2011 (6-9% of queries in many non-English languages; formal declaration)
Panda Update 7, Sept. 28, 2011 (no change given; confirmed, not formal declaration)
Panda Update 8, Oct. 19, 2011 (about 2% of queries; belatedly confirmed)
Panda Update 9, Nov. 18, 2011 (less than 1% of queries; formal declaration)
Panda Update 10, Jan. 18, 2012 (no change given; confirmed, not formal declaration)
Panda Update 11, Feb. 27, 2012 (no change given; formal declaration)
Panda Update 12, March 23, 2012 (about 1.6% of queries impacted; formal declaration)
Panda Update 13, April 19, 2012 (no change given; belatedly revealed)
Panda Update 14, April 27, 2012 (no change given; confirmed; first update within days of another)
Panda Update 15, June 9, 2012 (1% of queries; belatedly formal declaration)
Panda Update 16, June 25, 2012 (about 1% of queries; formal declaration)
Panda Update 17, July 24, 2012 (about 1% of queries; formal declaration)
Panda Update 18, Aug. 20, 2012 (about 1% of queries; belatedly formal declaration)
Panda Update 19, Sept. 18, 2012 (less than 0.7% of queries; formal declaration)
Panda Update 20, Sept. 27, 2012 (2.4% of English queries impacted; belatedly formal declaration)
Panda Update 21, Nov. 5, 2012 (1.1% of English-language queries in US; 0.4% worldwide; confirmed, not formal declaration)
Panda Update 22, Nov. 21, 2012 (0.8% of English queries were affected; confirmed, not formal declaration)
Panda Update 23, Dec. 21, 2012 (1.3% of English queries were affected; confirmed, formal declaration)
Panda Update 24, Jan. 22, 2013 (1.2% of English queries were affected; confirmed, formal declaration)
Panda Update 25, March 15, 2013 (confirmed as coming; not confirmed as having happened)
Panda Update 26, July 18, 2013 (confirmed)

Enhanced Campaigns: How to Optimize AdWords Campaigns for Desktop And Mobile

   Enhanced campaigns let you easily reach people and target multiple devices, such as desktop and mobile. With enhanced campaigns you can target every device type and location within the same campaign. Google AdWords released enhanced campaigns in February, simplifying the work of managing multiple devices. Let us see how to create mobile ad campaigns.

Understanding User Context:

   Search marketing lets you connect with an audience searching for keywords relevant to your business. Two people may be looking for different things with the same keyword, depending on time, location, and device.
   Let us consider an example as shown below:
   Two users, A and B, search the same keyword, "pizza", but at different times, in different locations, and on different devices. User A is at home, searching for "pizza" on his laptop at lunchtime; he is probably looking for places that deliver. User B searches for "pizza" on her phone, downtown, around 7:30 pm; she most likely wants directions to a pizza restaurant.

User context, then, depends on the time, location, and device of the search. Understanding it lets you increase your AdWords profits in the following ways:

You Can Create Better Ads:

   Understand the user's context and make your ad copy useful to the searcher. The ad for user B should include directions to the store, since she is on a mobile device, while the ad for user A should highlight delivery options.

You Can Bid Smarter for Clicks:

   A B2B company might value desktop search traffic on weekdays, whereas a local business might value mobile traffic on weekends and near its location.

How to Optimize Your Adwords Campaigns for Desktop and Mobile:

Bidding Options in Enhanced Campaigns:

   Using enhanced campaigns, you can adjust your bids depending on the time, location, and device where the search begins.

You can find the location, ad schedule and device tabs in the Campaign Settings tab.

Location Bidding Options:

   Suppose your business is based in the U.S. and sells products to both the U.S. and Canada. If clicks from Canada convert at a lower rate than clicks from the U.S., you can bid 30% less on clicks from Canada. You can bid more or less on clicks from any location; adjustments range from +300% to -100%.

   If you get more conversions from nearby searchers than from distant ones, bid more for nearby searches and less for distant ones, as they are less valuable to your business.

Time Based Bidding Options:

   Enhanced campaigns make it easy to bid more or less based on the time of day and day of the week. For example, you can bid 30% more on weekday traffic and 50% less on weekends.

You can also adjust the bid based on time which would be valuable to your business.

Mobile Device Based Bidding Options:

   You can adjust bids up or down for mobile traffic. If your business accepts orders through mobile, mobile searches may be especially valuable. In the example below, the mobile bid has been adjusted down by 10%.
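AdWords applies these location, schedule, and device adjustments multiplicatively to the base bid. A sketch of the arithmetic using the example figures above (-30% for Canada, +30% for weekdays, -10% for mobile; the $1.00 base CPC is hypothetical):

```python
def effective_bid(base_cpc, adjustments):
    """Apply AdWords-style percentage bid adjustments multiplicatively.

    adjustments: percentages, e.g. -30 means "bid 30% less".
    """
    bid = base_cpc
    for pct in adjustments:
        bid *= 1 + pct / 100.0
    return round(bid, 2)

base = 1.00  # hypothetical $1.00 max CPC
# A weekday mobile search from Canada, using the example adjustments:
print(effective_bid(base, [-30, +30, -10]))  # 1.00 * 0.7 * 1.3 * 0.9 = 0.82
```

Note that a -100% adjustment zeroes the bid entirely, which is how mobile traffic could be excluded.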

 Smarter, User-Context Aware Ads:

   Enhanced campaigns can display different user-context-aware ads based on where the search begins. If you want an ad to show on mobile devices, activate the "Mobile Preferred Ad" checkbox as shown below:

   Although these features were previously available to advertisers, they were rarely used because they required creating separate campaigns. With enhanced campaigns, you can manage your ads and bids within a single existing campaign instead of creating new ones.

7 Critical Considerations for a Web Redesign

Redesigning a website can be inspiring, but remember that the focus should not rest solely on the visual design. SEO, content, and functionality need to be taken into account alongside the visuals. A redesign succeeds only if it converts visitors into customers through traffic, conversions, and functionality.

Following are the considerations that are to be implemented in site re-design:

1) Before you start, do research

For a redesigned site to yield good results, we need to know whom we are targeting. Market research, keyword research, and community mapping should inform the design, functionality, and SEO, and should be built into the plan from the beginning so they carry through every aspect of the redesign. Research what has worked well for the site now and in the past; this grounds the new site in proven techniques and lets you estimate its success after launch.

2) Website Structure:

A redesign is an opportunity to reorganize the way the site is structured, not just to give it a fresh look. Prioritize analyzing the effectiveness of the current site, making sure the information architecture is set up for optimal visibility and conversions:
  • Which pages convert best?
  • What is the best and most common route through the site?
  • Are there pages with a high bounce rate?

These answers help improve the site's architecture. Other factors to consider are site goals, personalization, site complexity, time frame, and budget; each has its own trade-offs.

3) Canonicals and 301 Redirects

From the beginning, list all pages, incoming links, and high-ranking pages, along with subdomains. Because the URL structure changes, it is important to redirect traffic to the new pages or URLs; this also carries SEO rankings over. Tools can show where incoming links come from and where they point. Once the backlink list is ready, map every old URL to its new URL with 301 redirects. At the same time, implement canonical tags to avoid duplication.
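Once the old-to-new URL mapping is drawn up, the redirect rules can be generated mechanically. A sketch that emits Apache-style `Redirect 301` lines from a mapping (the paths are invented, and the output assumes an Apache .htaccess; nginx or IIS syntax would differ):

```python
# Hypothetical mapping of old URLs to their redesigned counterparts
redirect_map = {
    "/old-products.html": "/products/",
    "/about-us.php": "/about/",
    "/blog/2012/post-slug": "/blog/post-slug",
}

def apache_rules(mapping):
    """Render one 'Redirect 301 old new' line per entry (.htaccess style)."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

for rule in apache_rules(redirect_map):
    print(rule)
```

Keeping the mapping in one place like this also makes it easy to verify that every old URL with backlinks has a destination before launch.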

4) Navigation:

How easily humans and search-engine spiders can navigate the site has a significant effect on the visibility and success of a redesigned website. Consider the site structure from two angles:
  • How will people find your site?
   URL structure plays a vital role in discoverability. Is the URL free of unnecessary characters? Is it short? Are target keywords placed in it? The redesigned URL structure and sitemap should make clear to search engines what each page is about, and each campaign should feature its most important terms.
  • Once visitors reach the site, how will they navigate?
   Map the visitor flow: which pages do they enter on, what should they do once they reach the primary pages, how should they be directed to complete that action, and can the path be shortened while increasing conversions? Considering and improving these points will drastically improve the site's performance.

5) Where to fit the Content on the pages?

A site's success or failure rests on the quality, visibility, and relevance of its content, so content is an important consideration in a redesign. Basic questions about what type of content to publish include:
  • Is there a plan to place a blog on the site?
  • Will the blog content be mostly short and visible, or long, informative articles?
Also consider whether content will live outside the blog:
  • Are there plans to publish white papers, eBooks, or video tutorials?
  • If so, how will they be delivered: should a return email address be required, and will they be public or members-only?

6) Technical SEO:

In technical SEO, three key areas need consideration during a redesign:
 1. Page load times
 2. Compliance
 3. Coding

7) Testing:

Testing the site is crucial and certainly pays off.

So it is very important to consider all of the points above and implement them when redesigning a site.

Top Retargeting Tips Guide

It seems every marketer is looking to run retargeting campaigns these days, and why not? Ads tend to work better when they're directed at the right people. However, some people get considerably more out of retargeting than others. Here are a few tips to get the most out of your efforts.


Set the stage: Retargeting campaigns exist to push consumers through the sales funnel, but you need clear measurement and a solid understanding of retargeting to know whether your campaign is working. Make sure you have the right metrics in place for the top, middle, and bottom of the funnel.

Build on your past successes: Once you know what works for your website, do more of it. Identify the best keywords from your marketing campaigns. Once a search retargeting campaign is performing well, determine which keywords are driving the results and then target users carrying out similar searches.

Get straight to the point: No retargeting campaign will work if the creative isn't right. Make sure your creative contains a firm call to action and takes the user to a landing page that makes the desired action as simple and straightforward as possible.

Stretch your dollar: Retargeting can be a bargain. If you don't believe it, compare the cost of top keywords on AdWords with the cost of a retargeting campaign on the same keywords; the prices differ.


Relevancy is key: Your retargeting campaign is only as strong as the data behind it. Use site retargeting to gather as much data as possible: the referring sites and keywords that bring users to your website, and where users go after visiting your site.

Be methodical: Don't neglect to place conversion and exclusion pixels on your confirmation pages. After all, if somebody has just purchased your product, it doesn't make much sense to keep targeting them with display ads. Be equally obsessive about your testing: just because two ads look similar doesn't mean they'll perform alike.

Don't ignore the Facebook Exchange (FBX) opportunity: If you're running a retargeting campaign, it only makes sense to test it on Facebook Exchange. Early results show FBX campaigns performing brilliantly. Remember that Facebook Exchange lets you use both first-party and third-party data, and while prices are still on the low side, there isn't much to lose.

Google’s Disavow Tool: What You Need to Know, and Common Myths

With the impact of the Google Penguin 2.0 update, webmasters are paying more attention to their websites' link profiles. If your main objective is to remove spammy links from your site's history, there is good news: several mechanisms can help you do it.

What is Google Disavow?

The Disavow tool launched in October 2012 to help clean unsavory websites out of your link profile. You give Google a list of bad websites that violate its guidelines, and the tool tells Google you want those shady sites ignored when your link profile is evaluated.

What To Do Before You Resort to Disavow

Used incorrectly, this tool can harm your website badly, so it must be used with caution. Before you resort to the disavow tool, ask site owners to remove your links from their websites. A systematic approach will serve you well: first, record all the relevant information, such as:
1. Site Name
2. Link URL
3. Anchor Text
4. The page containing the link to your site
5. Contact Information (Email Id, Contact Us form)

Before adding a website to your disavow file, try sending your removal request at least three times, about a week apart. Politely ask the site owners to remove your links, using the information above.
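This record keeping lends itself to a simple log. A sketch (all names and addresses are placeholders) that tracks removal requests and flags links ready for the disavow file after three unanswered attempts:

```python
# One record per unwanted link, per the checklist above (placeholder data)
outreach_log = [
    {"site": "spamdir.example", "link_url": "http://spamdir.example/links",
     "anchor": "best pizza", "contact": "admin@spamdir.example",
     "attempts": 3, "responded": False},
    {"site": "blogroll.example", "link_url": "http://blogroll.example/friends",
     "anchor": "Paul's Pizza", "contact": "via contact form",
     "attempts": 1, "responded": True},
]

def ready_to_disavow(log):
    """Links with 3+ unanswered removal requests go to the disavow file."""
    return [r["site"] for r in log if r["attempts"] >= 3 and not r["responded"]]

print(ready_to_disavow(outreach_log))
```

The same log doubles as evidence of good-faith cleanup effort when you later file a reconsideration request.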

After a third attempt with no response, make a note of it.
If your site has been hit with a manual penalty or another problem and you are submitting a reconsideration request to Google, include this information with it so Google can see that you have taken steps to clean up your link profile. If a webmaster asks you to pay for link removal, simply ignore the message and make a note of it; later, attach their message to your disavow request when you submit.

Common Myths About Disavow Tool

1. The Google disavow tool rolled out in October 2012 with the aim of cleaning unsavory websites out of link profiles. Google says the tool is an advanced feature and must be used with caution: used incorrectly, it can badly harm your website's performance in search results.
                     It is therefore important to follow the procedure Google has defined. If you are not familiar with this tool, hire a professional who is, to check the results of your audit. At the initial stage, formulate a strategy for which links to keep and which to follow up on.

2. Everyone makes mistakes; it is quite common. If you get a spam warning message from GWT, don't worry: Google gives you a chance to rectify the mistake. But make sure you put serious effort into reaching out to the webmasters of the unsavory websites in your link profile before using the disavow tool.

                   Moreover, Google treats disavowed links as suggestions and acts on them when a manual review is triggered; this can take weeks or months.

3. If your site has been hit by a manual penalty, there are several factors to consider. Matt Cutts has stated that before you begin disavowing links, it is important to review your link profile; after that, take proper steps to clean up the shady links.
Note that manual penalties are quite different from algorithmic updates such as Panda and Penguin. Your site may have dropped because of black-hat techniques such as keyword stuffing or an abundance of bad links. Submit a reconsideration request to Google to get back into its search results. The disavow tool can help you get back on track, but treat it as one part of the process, to be used after other remedies have been tried.

                   Penguin 2.0 was released to improve the search experience. To keep your site safe from Google updates and penalties, conduct site audits and security audits regularly.

Tuesday, July 9, 2013

Google Webmaster Tools Cleans Up With Dashboards, New Navigation & More

Webmasters rely on Google Webmaster Tools (GWT) to diagnose and solve issues with their websites; it provides detailed information about the pages of the sites in your account.
Google has announced a cleanup of GWT to make accounts easier and quicker to navigate. A gear icon has been added to the left menu for quick access, along with a new search appearance pop-up feature. GWT now helps webmasters solve their problems in three main areas.

The three changes include:

1. Updated Dashboard
2. New Left hand Navigation
3. Home Compact View

Below is a snapshot of the revised dashboard.

New Navigation:

The new GWT navigation is more representative of the content, redesigned to align with the stages of Google's flow.

I. Crawl section: GWT reports crawl errors such as broken links (404 pages), server problems, access-denied errors, crawl stats, URL parameters, and the Fetch as Google feature.

II. Google Index section: Google tracks how many pages are indexed. Here you can see the index status (overall indexed counts) and content keywords.

III. Search Traffic section: This section displays how your pages perform in search results: search queries, links to your site, and incoming links from internal pages.

IV. Search Appearance section: This section covers the markup on your web pages that helps Google during indexing. It contains the structured data dashboard, sitelinks, HTML improvements, and Data Highlighter.

Search Appearance Overlay:

The image below shows the results page and the new UI feature.

Google has added a pop-up explaining the search results page for webmasters who don't have a clear idea of how Google presents results. Click the question mark button next to the Search Appearance navigation menu on the left to access the new feature.

Bing Gives Webmasters More Control Over Deep Links

Webmasters who have been waiting for more flexibility over which deep links are displayed in Bing's search results are in luck: Bing has formally announced changes to how deep links are managed.

From Bing Webmaster Tools, you can now remove particular links that the Bing algorithm has generated and selected to appear under the main search result.

Previously, Bing might select apparently random news stories to appear as deep links for an online newspaper, rather than a much more suitable section of the site, such as sports or obituaries.

Site owners now just need to log in to their Webmaster Tools account to manage the deep links displayed in Bing's search results, and can block any URL they don't want shown.

You can also use the tool to block deep links from displaying in specific countries or regions. This is particularly useful for sites that serve audiences in other countries but remain very country- or region-centric for real visitors.

Blocked URLs automatically expire after 3 months, so you will need to log in to your account to extend a block if needed. Webmaster Tools shows a reminder when URL blocks are about to expire, and existing blocked deep links will move automatically to the new blocking tool.

With the new blocking tool, Bing helps webmasters manage the issue of deep links.

A Phantom & Penguin One-Two Punch Updates From Google

Phantom and Penguin were two major Google updates in May that hit webmasters, SEO professionals, and business owners. Phantom rolled out on May 8th and hit hard, targeting content rather than links. Penguin 2.0 followed on May 22nd, going deeper than previous Penguin updates to target unnatural links.

 Phanteguin is the most powerful one-two punch from Google:

It is very tough for business owners whose sites are hit by a Panda, Penguin, or Phantom update. The worst situations are for online businesses that do not understand exactly what happened, what to do, or how to recover, all while losing business every hour.
But it can get worse:
Getting hit by two algorithms one after the other is nastier than being hit by just one. The same thing happened last year with back-to-back Panda and Penguin updates, known as Pandeguin. The current one-two punch is known as Phanteguin, and its effect on a site can be severe.

Digging Deeper into Google Penguin and Phantom

Penguin and Phantom each have their own characteristics and targets. Penguin 2.0 goes deeper, taking internal pages into account, and is finely tuned to knock down all unnatural links. Penguin aims to penalize websites with unnatural links that use exact-match anchor text from low-quality sites.

   Penguin 2.0 covers unnatural links from blogs, spam directories, spam comments, public and private link networks, and so on. Phantom, by contrast, focuses on content rather than links and is much more similar to Panda than to Penguin. Just a day after the Phantom update, sites hit by it showed a severe drop in organic traffic, falling 25 to 45 percent. Below is one example of the Phantom update.

Insights of Phantom:

By May 9th, the traffic changes were striking enough to confirm that a significant algorithm update had occurred.

As analysis of the sites affected by Phantom continued, more examples like the one above emerged. Phantom concentrated on content, not links. It took into account a range of content issues, from thin content and affiliate content to scraped and low-quality content. It also considered heavy cross-linking between company-owned domains using exact-match anchor text.

Panda Greased the Skids for Phantom:

Most of the sites hit by Phantom had previously been hit by a Panda update as well, which suggests these sites had already struggled with content-quality issues before Phantom struck them again.

Identifying Phanteguin:

The effect of an algorithm update on a site can be found using Google Webmaster Tools and Google Analytics, since both tools help identify traffic drops on particular days. To check whether a site was affected by the Phantom or Penguin update, look for a drop in traffic on May 8th or May 22nd respectively. If a site was affected by both updates, it was hit by Phanteguin.

Using filters in Analytics, we can find whether the drop occurred only in mobile search, image search, or web search. It is important to note the date on which traffic dropped so we know which update affected the site.
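The date check above can be sketched in a few lines of Python. The session counts below are invented for illustration; real numbers would come from an Analytics export:

```python
from datetime import date

# Hypothetical daily organic sessions exported from Analytics.
daily_sessions = {
    date(2013, 5, 6): 1180,
    date(2013, 5, 7): 1215,
    date(2013, 5, 8): 1190,
    date(2013, 5, 9): 710,   # sharp drop in the Phantom window
    date(2013, 5, 10): 695,
}

def drop_dates(sessions, threshold=0.20):
    """Return dates where traffic fell more than `threshold` vs. the prior day."""
    days = sorted(sessions)
    flagged = []
    for prev, cur in zip(days, days[1:]):
        change = (sessions[cur] - sessions[prev]) / sessions[prev]
        if change <= -threshold:
            flagged.append(cur)
    return flagged

print(drop_dates(daily_sessions))  # the May 9th drop stands out
```

Matching a flagged date against the known rollout dates (May 8th for Phantom, May 22nd for Penguin 2.0) tells you which update to investigate.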

Checking the Effect of Phantom Using Location Filters:

Another important step is to check the traffic drop using location filters, because if all countries are lumped together the drop may not be visible. Once a drop is found, compare the time period after the drop with the time period before it. Export data from Google Webmaster Tools as soon as a site is hit by an algorithm update so it can be analyzed properly later, because Webmaster Tools does not show older history, only the last 30 days of data.

Analyzing with Google Analytics:

Google Analytics helps further through advanced segments, which are an added advantage. Start by isolating Google organic search traffic and then view the trend graph. To analyze the Phantom update, check for drops in traffic from specific keywords and landing pages; this helps identify possible content issues and the pages worth analyzing further. Advanced segments let us slice and dice traffic and gain a solid view of traffic drops over time.
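A rough sketch of that landing-page comparison: the snippet below flags pages whose organic sessions fell sharply across the update date. The CSV columns and figures are assumptions, standing in for a real Analytics export:

```python
import csv
import io

# Stand-in for an Analytics landing-page export (columns are assumed).
export = """landing_page,period,sessions
/guides/seo-basics,before,4200
/guides/seo-basics,after,1300
/products/widget,before,900
/products/widget,after,870
"""

sessions = {}
for row in csv.DictReader(io.StringIO(export)):
    sessions.setdefault(row["landing_page"], {})[row["period"]] = int(row["sessions"])

# Flag pages that lost more than a quarter of their organic sessions.
for page, s in sessions.items():
    change = (s["after"] - s["before"]) / s["before"]
    if change <= -0.25:
        print(f"{page}: {change:.0%} -- review this page's content")
```

Pages surfaced this way are the first candidates for the content review described below.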

Analyzing Phanteguin is not an easy task:

If a site has been hit by both updates, it needs to be analyzed from both angles: Phantom through a content-quality lens and Penguin through an unnatural-links lens. It is better to analyze the Penguin side first. To analyze a Penguin hit, review the inbound links, flag the unnatural ones, organize them, and, as the last step, work to remove them.
To analyze a Phantom hit, objectively review the entire content of the site and identify the risks, then decide how to refine or gut the weak content.

What to do if you’ve been hit by Phanteguin:

If a site is hit by Phanteguin, here are some tips to help move in the right direction:
  • First, identify which algorithm update(s) hit the site. This is essential because the two updates target different factors.
  • For Penguin, concentrate on analyzing the link profile and flagging spammy links. Then remove those links as best you can; any that cannot be removed can be added to a disavow file.
  • For Phantom, heavily analyze the content, focusing on the pages that dropped significantly after the update.
  • Once the analysis is complete and action has been taken, wait for the next algorithm refresh (Penguin, Phantom, or Panda). Panda now rolls out roughly once per month over about 10 days, and the next Penguin refresh is expected soon.
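For reference, a disavow file is a plain text file uploaded through Google Webmaster Tools, with one domain or URL per line and `#` lines as comments. The domains and URL below are placeholders:

```
# Outreach emails sent twice; site owner never responded
domain:spammy-directory.example
# Paid link we could not get removed
http://blog-network.example/post-with-paid-link.html
```

Use `domain:` entries when an entire site is toxic, and bare URLs for individual pages.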

10 Essential SEO Tips for E-Commerce Sites

   The old methods used for optimizing e-commerce websites need to change. You can make your e-commerce site perform well by following the tips below.

Below are 10 essential SEO tips for e-commerce websites:

1. Work Out a Really Exhaustive Keyword Research Report:

   Select keyword terms and phrases and test them with keyword tools, based on your type of business, your services, and the demographics you target. You can use the Google Keyword Tool for this process. Also generate a list of the keywords for which your competitors rank highly and include it in the report. Knowing your competitors' keywords makes it much easier to choose your own.

2. Check Your Site's Structure, Make Sure it is Easy to Use:

   Check your site's structure and make sure it is user-friendly. It must be easy for visitors to navigate so they can find the page they are looking for.

3. Basic On-Site SEO Needs to be Handled Well:

   On-page factors matter greatly for ranking well in search results. Optimize your site by handling on-page elements such as title tags, meta descriptions, URL structure, rich snippets, and alt tags correctly.

4. Create a Blog On Your Site and Update it Frequently:

   Some sellers avoid creating a blog on their e-commerce site, thinking it is not useful. That thinking is simply incorrect. Create a blog on your site and update it frequently so that Googlebot keeps finding fresh content that needs indexing.

5. Check Your Site for Proper Social Media Integration for Product Sharing:

   Product sharing plays an important role on social media sites like Facebook, Twitter, Google Plus, and Pinterest. Social sharing can build authority for your e-commerce site.

6. Respond to Your Reviews For Customer Satisfaction and Reputation Management:

   Reputation is one of the factors customers weigh before buying from your site. Respond to reviews in a positive manner, since negative reviews can affect your site's reputation. Respond to every review for your customers' satisfaction.

7. Design Your Site Optimally for Multiple Devices:

   Optimizing the mobile version of a site is now standard practice. People use many devices: smartphones, laptops, tablets, desktops, and so on. Design your website so it is easily navigable on any device, and work on the SEO for each of those device types.

8. Get Attractive Content Marketing:

   Through content marketing articles you can offer something beyond simply selling. For an e-commerce site, draw on your knowledge of your product niche. For example, if you sell apparel, you can show new ways to style a fabric.

9. Get Authority Links Only:

   Authority links play an important role in SEO. Whether you run an informational site or an e-commerce site, make sure you attract only authoritative links to your site, never links from spam sites.

10. Create Relevant Social Media Channels for Better Visibility:

   Create social media channels on Facebook, Twitter, and similar networks that talk about your site and the products you offer. This increases your site's visibility and target audience, and with them your conversions.

How To Recover Your Site From a Google Penguin Penalty

Penguin 2.0 rolled out in May, and many people are anxious about what Penguin 2.0 filters and what drastic changes it may cause in their analytics. Essentially, Penguin filtered the spam links of thousands of sites, and that filtered data has since been studied by the SEO community.
After analyzing the spam links of thousands of sites, it appears Penguin targeted sites whose anchor text is dominated by money keywords, while sites with mostly brand anchor text stayed safe. Looking over the backlink profiles of affected sites, you can observe heavy usage of money keywords in the anchor text.

Another crucial point is to keep an eye on all your ratios and check whether they look natural or unnatural. If most of them look unnatural compared with other sites in your niche, an automatic penalty may result. Some sites with money keywords in their anchor text are not penalized, but others are.
It comes down to how the ratios in your backlink profile compare with those of competitor sites targeted by Penguin 2.0. Google's algorithm computes these ratios and averages; the data is parsed through Penguin 2.0, which sorts out the sites with too many red flags, and those sites are considered unnatural and get penalized.
Google checks ratios such as unique C-class links per domain, link velocity trends, number of ranking keywords, number of links from N/A and PR0 domains, PageRank of backlinks, click-through ratio, anchor density, indexed pages, traffic, and title rank.

So how do you protect your site from a Penguin penalty?

  • Link Detox is a tool that can change your SEO process. With it you can audit your backlink profile and run link analysis. By checking backlinks against the Google ratios, you can easily identify which are toxic and which are healthy. If you find toxic links, create an outreach campaign requesting their removal.
  • Disavow the remaining toxic links through your Google Webmaster Tools account and check the result within two weeks.
  • A Penguin penalty is algorithmic, not a manual action, so a reconsideration request will not lift it; clean up your backlink profile and wait for the next refresh instead.
  • Maintain strong social media marketing, which helps create social signals.
  • Keep a careful watch on all your site's ratios and keep building high-quality backlinks.
  • Through content marketing with high-quality articles you can keep your content fresh and earn a better response from your traffic.

Monday, July 1, 2013

Most Valuable Ecommerce Customers Come from Organic Search

The most valuable customers come from organic search: their customer lifetime value is more than 54 percent higher than that of customers acquired through PPC and email.

Customer lifetime value (CLV) is a way to count the entire financial gain a company can earn from its relationship with a customer. In this report, for example, CLV is calculated from the amount customers spend within two years of their first purchase.
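As a sketch of that definition, the snippet below computes average two-year CLV per acquisition channel. The customers, channels, and amounts are invented for illustration:

```python
# Each tuple: (customer_id, acquisition_channel, amount spent within 2 years
# of first purchase). Invented data for illustration only.
purchases = [
    ("c1", "organic", 120.0), ("c1", "organic", 80.0),
    ("c2", "organic", 95.0),
    ("c3", "ppc", 60.0), ("c3", "ppc", 55.0),
    ("c4", "email", 90.0),
]

def clv_by_channel(purchases):
    # Sum spend per customer first, then average per acquisition channel.
    per_customer = {}
    for cust, channel, amount in purchases:
        ch, total = per_customer.get(cust, (channel, 0.0))
        per_customer[cust] = (ch, total + amount)
    totals, counts = {}, {}
    for channel, spend in per_customer.values():
        totals[channel] = totals.get(channel, 0.0) + spend
        counts[channel] = counts.get(channel, 0) + 1
    return {ch: totals[ch] / counts[ch] for ch in totals}

print(clv_by_channel(purchases))
```

The per-channel averages are what the report compares when it ranks acquisition channels by customer value.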

“The knowledgeable marketers in the next generation of ecommerce will look beyond just where customers are coming from,” Custora said in its report. “Now people are looking at the value of new customers acquired across channels, platforms, and geographies.”

Social media is usually credited as a common contributor along the path to conversion, and at times it is the last click or the only click in a conversion. However, according to the study, customers arriving from social networks don't stack up against the CLV of customers arriving from other channels.

Ecommerce customers acquired through Twitter have a CLV about 23 percent below average, according to the report.

It's no wonder email marketing sits at the top as an acquisition channel. We have seen this before in a study showing email beating search and social media as the biggest driver of conversions for ecommerce.

In fact, according to the report, customer acquisition via email has grown over the years in ecommerce.

The image above shows the importance of organic search and email to customer value.

The data was collected from 86 U.S. retailers across 14 industries, covering millions of customers. Acquisition channels were determined via Google Analytics.

Importance of 404 Error Pages

Whenever web pages are moved elsewhere or a particular page is removed from the database, visitors may face an error known as a "404 error".
When a visitor's search ends in this kind of error, it is an unsatisfying experience. So in this situation, site owners should use a few tricks to keep visitors satisfied.

Here are a few tips to hold visitors on your website when 404 errors occur.

1. Understand the importance of the page:

When visitors consider a page important and run into this kind of error, they will obviously land on other websites instead, and your site loses that traffic. To hold on to visiting traffic, fix 404 error pages as quickly as possible, or at least provide hyperlinks to other relevant pages of your website.

2. Accept the errors: 

Own the errors on your website. Don't be dismissive: it is your site's problem, not the visitor's or anyone else's. Check your site for errors, take responsibility, and make visitors feel satisfied with your web pages. This matters a lot for your site's traffic and standing.

3. Keep relevant page layout:

Your site may have many pages, but if even one page throws an error, you are the one who loses. The error page's layout should be consistent with the rest of the site: the same logo, the same colors, the same overall design. Any inconsistency can cause a drastic change in your site's analytics.

4. Mention sufficient details: 

To avoid losing traffic, include enough detail on the error page so visitors can navigate to another page on your site instead of leaving to search elsewhere. Useful elements on the error page, such as a search bar, a menu bar, and hyperlinks in the header or footer, benefit both you and your visitors.

5. Add emoticons:

When a page on your site returns a 404 error, it is your responsibility to make visitors feel comfortable. You can retain traffic by adding some appealing emoticons to the error page. Good page design increases the conversion rate, so make your error pages engaging.
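On an Apache server, for example, pointing the 404 status at a custom page takes a single directive (the path here is an example; the page itself should carry the site's layout, search bar, and navigation links described in the tips above):

```apache
# Serve a branded "page not found" page instead of the bare server default
ErrorDocument 404 /errors/not-found.html
```

Make sure the page still returns a real 404 status code, so search engines drop the dead URL instead of indexing the error page.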

Google announces launch of Dynamic Remarketing for Retailers

Google has rolled out its own version of retargeting for retailers using Google Merchant feeds. Retargeting campaigns have become a trend among PPC marketers, so Google decided to release a "Dynamic Remarketing" feature to its AdWords retail customers.
Dynamic remarketing greatly improves the experience for both marketers and consumers because it automatically builds customized ads that connect with customers across the web. It is also very useful for advertisers, who are adopting new ways to reach customers online with compelling ads that bring conversions.
To begin with this new feature you need a Google Merchant account. Google plans to expand the feature to more sectors later this year. With it you can create ads quickly from dozens of templates, select a design for your brand, and let the feature automatically create dynamic text and display ads for you.

Google gives you four options:

1. Past Buyers: this ad displays popular products as well as items the user purchased previously, similar to Amazon's ads.
2. General Visitors: this ad is shown to visitors who browsed the website casually without purchasing any product.
3. Product Viewers: this ad is shown to people who viewed a particular item but didn't put it in the shopping cart. They see the items other visitors viewed, along with recommended items.
4. Shopping Cart Abandoners: this ad displays the items left in the shopping cart, along with recommended products.
When users visit the website, the remarketing tag adds them to one of the remarketing lists and records a product ID. AdWords uses this ID to pull the product image, name, and price from the Merchant account and place them in the ad.
Google pulls your product information (image, name, and price) and shows ads weighted toward the items that sell best; it also shows items the shopper viewed recently. You need to include the dynamic remarketing tag after building your dynamic remarketing campaign.
Recently, Google introduced remarketing lists for search ads, termed RLSA. It lets you modify your campaign bids, target keywords, and search ads based on a user's previous activity on your website. With RLSA you can expand your brand with enriched keyword lists and adjust bids to increase your website's visibility to high-value customers.

5 Unknown Keyword Research Tips To Boost Your Online Traffic

   From an SEO point of view, keyword research plays a prominent role. You need to attract the right kind of visitors, the ones you are actually targeting, and keyword research helps you increase your website's traffic.
The following are 5 keyword research tips to boost your online traffic:

1. Swipe Your Competitor's Keyword Research:

   With the help of a keyword tool you can run keyword research against your established competitors.

Use Google Keyword Tool:

   Choose the top 5 competitors ranking in the top positions for your targeted keywords. Then place those 5 competitor URLs, the ones ranking for the keyword phrase you want to target, into the Google Keyword Tool's website box.

   Choose the language and location you want to target, then select exact or phrase as the keyword match type. Phrase matches are helpful for uncovering hidden long-tail keywords. With this method you can instantly view the search volumes for the keywords you chose, select the ones relevant to you, and group them together.

2. Find Hidden Long Tail Keywords in the Phrase-Match/Exact-Match Difference:

   You can easily find the search volumes of the exact-match and phrase-match types for a specific keyword. If the difference between the two search volumes is large, you may have a winning keyword. The Keyword Tool also shows related long-tail keywords, and Google Instant suggestions can surface long-tail keywords you are missing.

SEO tools (in Google Global Search)
"SEO tools" (Phrase match) = 90,500
[SEO tools] (Exact match) = 33,100
"SEO tools" phrase match/exact match difference = 57,400

SEO services (in Google Global Search)
"SEO services" (Phrase match) = 246,000
[SEO services] (Exact match) = 49,500
"SEO services" phrase match/exact match difference = 196,500

Here the term "SEO services" has the bigger difference in search volumes: 196,500 searches hidden in long-tail variations.

The hidden long tail keywords can include:
"SEO services Canada"
"SEO services Toronto" and so on.

3. Blog on Upcoming Events and Product Launches:

   You need to know about your competitors' product releases and industry updates. You can generate more traffic by optimizing for keywords that will be searched in the future.

Examples are:

Date Based Searches
Black Friday 2013
Boxing Day 2013
Product launches
[Product name] launch
[Product name] review
[Product name] information
These keywords will be searched heavily in the future and can rank more easily in search engines.

4. Add Geo-targeted Search Terms to your Keywords for Local Search:

   You can add geo-location terms when promoting a local business and rank faster than with generic keywords.
   For example, optimizing the keyword "SEO services" on its own can take months or years, so it is better to add geo-locations to the keyword.

"SEO services Canada"
"SEO services Toronto"
Though these keywords have lower search volumes, they can be targeted much more precisely.

5. Mine Your Existing Google Analytics Data for Long Tail Keywords that you might have Missed:

   You can list long-tail keywords you might have missed by using Google Analytics' regular filters.

You can filter for keyword phrases of 3 to 7 or more words. This is helpful for finding content ideas.
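That word-count filter boils down to a regular expression. A sketch mirroring the classic long-tail filter, keeping only queries of three or more words (the sample queries are made up):

```python
import re

# Sample search queries, invented for illustration.
queries = [
    "seo",
    "seo tools",
    "best seo tools for small business",
    "how to do keyword research in 2013",
]

# At least two internal whitespace runs means three or more words.
long_tail = re.compile(r"^\s*\S+(\s+\S+){2,}\s*$")

hits = [q for q in queries if long_tail.match(q)]
print(hits)
```

The same pattern can be pasted into an Analytics keyword filter to surface long-tail queries already sending you traffic.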

Tips to Make Keywords Fit Marketing Messaging

Keywords are not just for traffic; they also serve as descriptors for the business, self-identifiers that are not restricted to search. Keywords play an important role in making a brand or product memorable. Choose them to reflect the work the business actually does, not merely how often they are searched.

Keywords can be outlined in a few steps, as described below.

Who Are You?

When we start developing a site or restructuring its content, we usually dive straight into keyword research instead of first taking a look at the business. But before we ask customers to know about us, it is important to know how the company is positioned. Questions need to be asked along these lines:
  • If your company is a mobile, how would you describe it?
  • How would you describe company culture?
  • What are the core values of a company?
  • What is your company’s mission and vision?
  • Who do you want to buy from your company?
  • Who are the decision makers and decision influencers?

Thorough Market Research:

Thorough market research means doing keyword research with users instead of relying only on search engines or tools. Simply surveying people in your local area in a casual way can reveal a lot about how users actually search. If you really don't know what keywords your users would use to search for your business, follow up with questions like:
  • How do you find a company for ABC?
  • What would you type into Google to find these companies?
  • If you were looking for advice on ABC, what would you do?
  • What's the most important thing you look for on an ABC company's home page?

Informational versus Promotional Searches:

We need to compare the user research results, that is, how people actually search, with the keyword tool results. This gives us a check on our work; traffic alone should not be the end point of the process. Second, we need to think about the intent behind the query users type, because they are always searching for an answer to a question, and that answer comes with a kind of action. The two biggest intents are:
  • Informational: the user is still researching and just wants more information about a topic.
  • Commercial: the user is looking for a business and is ready to buy.
Serving a commercial page for an informational keyword means we are failing to reach people at the right point in the buying process.

Finally, make use of Google:

Lastly, we need to ensure that the companies using the same keywords are potential business competitors, not just search competitors. So search the rankings for your keywords and check whether those companies and yours share the same:
  • Target audience
  • Services or products
  • Price points
  • Messaging and positioning
Therefore, make sure your keywords fit the way you do business.