Thursday, August 1, 2013

10-Day Tips to Stay Safe from the Panda Effect

Panda updates are now rolled out every month, and each refresh takes about ten days to complete. This is a good time to shield our site from the Panda effect, and the rollout gives us a chance to work through the algorithm one day at a time. A few tips, tricks and warnings will help in analyzing and reducing the risks connected with Panda algorithm updates.


Here is how to spend each of the ten days of a Panda rollout.

Day 1: Deal with Content - Content is King:

We know that there are millions of pages in a search engine's database. Search engine spiders have to crawl, index, classify and then rank them with their particular algorithms, on the basis of keywords that should match up with user queries.

As the saying goes, "Content is King," and the Panda update follows it very particularly. It filters content by value through its own algorithm, which tries to reflect the judgment of both humans and search engines.

Day 2: Deal with the Data in Deep Consideration - Feed the Beast:

Despite all the news, views and distractions around the data and its terms of service, the very first thing is to decide whether our site is "Panda worthy" or a "Panda pounce." That can be done by checking the authority and influence of our data in Google Analytics. The first two sections to review are:

1. Analyze Traffic Data:

At this point we need to identify which areas, pages or sections of our site are visited rarely or not at all. Split the traffic data into two views:
  • From a traffic point of view, which landing pages fall in the bottom 10 percent, i.e. pages with fewer than 2 entry visits?
  • Which pages of the site overall fall in the bottom 10 percent, i.e. general pages with fewer than 2 visits?
 
In Google Analytics, the path to check these pages is Traffic Sources -> Search -> Organic.
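
Here is a minimal sketch of this check in Python, assuming the report has been exported from Analytics to a CSV file; the file name organic_landing_pages.csv and the column headers "Landing Page" and "Entrances" are assumptions to be adjusted to your own export.

    import csv

    # Minimal sketch: find low-traffic landing pages in a Google Analytics
    # CSV export. File name and column headers are assumptions.
    with open("organic_landing_pages.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    def entrances(row):
        return int(row["Entrances"].replace(",", ""))

    rows.sort(key=entrances)
    bottom_ten_percent = rows[: max(1, len(rows) // 10)]
    under_two_entries = [r for r in rows if entrances(r) < 2]

    print("Bottom 10 percent of landing pages by entrances:")
    for r in bottom_ten_percent:
        print("   ", r["Landing Page"], entrances(r))
    print("Landing pages with fewer than 2 entry visits:", len(under_two_entries))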

2. Analyze Landing Page Data:

At this point, we need to analyze which areas, pages or sections of our site do not drive visits at all. Break this into a few checks (a sketch of these checks follows the list):
  • First, find out how many pages of our site are indexed, either with a site:domain-name.com query or with the help of Webmaster Tools.
  • Identify the gap between the number of pages that are actually indexed and the number of organic landing pages. A large gap often points to a rank-worthiness issue in specific areas, page templates or page types.
  • Check bounce rates and see whether any specific landing pages have a high bounce rate.
  • Then check the bottom 10 percent of pages: those with the highest bounce rate and the lowest average time on page.
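
Here is a minimal sketch of these checks, assuming an organic landing page report exported to CSV with "Landing Page", "Bounce Rate" (for example "78.25%") and "Avg. Time on Page" columns, plus an indexed-page count read off a site: query or Webmaster Tools; all of these names and numbers are assumptions.

    import csv

    # Minimal sketch: compare indexed pages against organic landing pages and
    # flag the worst bounce rates. File name and column headers are assumptions.
    INDEXED_PAGES = 1200  # assumption: replace with your own site: count

    with open("organic_landing_pages.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    print("Indexed pages:", INDEXED_PAGES)
    print("Organic landing pages in the export:", len(rows))
    print("Gap (indexed but not landing):", INDEXED_PAGES - len(rows))

    def bounce_rate(row):
        return float(row["Bounce Rate"].rstrip("%"))

    rows.sort(key=bounce_rate, reverse=True)
    print("Bottom 10 percent of landing pages by bounce rate:")
    for r in rows[: max(1, len(rows) // 10)]:
        print("   ", r["Landing Page"], r["Bounce Rate"], r["Avg. Time on Page"])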

Day 3 deals with user engagement data - Give That User a Cigar:

The third day covers two sections of Google Analytics that deal specifically with the data users engage with.

 

3. Analyze Engagement Data:

At this point we need to check whether users move through multiple pages of our site, and along the intended paths. Are they satisfied with these pages, and how much time are they spending on them? So we need to check the following sections of Analytics:

Primary paths visitors take through the site:


Time visitors spend on each page:
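
A minimal sketch of this engagement check, assuming a content report exported to CSV with "Page", "Avg. Time on Page" (in seconds) and "% Exit" columns; the column names and the thresholds of 15 seconds and 80 percent are assumptions.

    import csv

    # Minimal sketch: flag pages where visitors leave quickly and rarely
    # continue to a second page. Column names and thresholds are assumptions.
    with open("content_pages.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    def seconds_on_page(row):
        return float(row["Avg. Time on Page"])

    def exit_rate(row):
        return float(row["% Exit"].rstrip("%"))

    low_engagement = [r for r in rows
                      if seconds_on_page(r) < 15 and exit_rate(r) > 80]

    print("Pages with under 15 seconds on page and over 80 percent exits:")
    for r in sorted(low_engagement, key=seconds_on_page):
        print("   ", r["Page"], seconds_on_page(r), exit_rate(r))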

4. Analyze Social Data:

Opinions will always differ about social media and its effect on SEO. Still, if Google is giving us this data, it means:
  • Google considers that data important.
  • We should use it as a KPI once we understand that it is relevant to our goals.
Here are some important sections in Analytics that give us an idea of our site's social data:

Content that is visible off-site:


The Social Plug-ins report, which shows social sources:
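
A minimal sketch that turns such an export into a simple KPI, assuming a CSV with "Social Entity" (the page URL) and "Social Actions" (a count) columns; both column names are assumptions to match against your own export.

    import csv
    from collections import Counter

    # Minimal sketch: total up social actions per page as a rough KPI.
    # Column names are assumptions.
    actions_per_page = Counter()
    with open("social_plugins.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            actions_per_page[row["Social Entity"]] += int(row["Social Actions"])

    print("Most-shared pages:")
    for page, actions in actions_per_page.most_common(10):
        print("   ", page, actions)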

       

Day 4 deals with avoiding surprises - Catalog Your Content:

This is the point where we need to track down old blog posts, long-forgotten product pages that are no longer live, and image galleries old enough to deserve a spot in the Smithsonian; digging these up tends to surprise the tech and marketing folks along with the C-level folks.
                     
After the data is collected, catalog overlapping content and content themes, and then prepare a review of every page based on the following questions:
  • Is the content on this page interesting?
  • Does it sit close to our topic expertise?
  • Is the content on this page old, tired or less relevant?
  • Does this content add any value for our site users?
By analyzing this, we can provide better content to our users and fare better during Panda reviews.
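
One way to start the catalog is to pull every URL and its last-modified date from the XML sitemap and then work through the questions above row by row. A minimal sketch, where the sitemap URL is a placeholder to point at your own site:

    import urllib.request
    import xml.etree.ElementTree as ET

    # Minimal sketch: list every URL and last-modified date in the sitemap
    # as the starting point for a content catalog. The URL is a placeholder.
    SITEMAP_URL = "http://www.example.com/sitemap.xml"
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    with urllib.request.urlopen(SITEMAP_URL) as response:
        tree = ET.parse(response)

    for url in tree.findall("sm:url", NS):
        loc = url.findtext("sm:loc", default="", namespaces=NS)
        lastmod = url.findtext("sm:lastmod", default="unknown", namespaces=NS)
        print(loc, lastmod)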

Day 5 has to be spent digging into duplicate content:

At this point we need to check for content duplicated from other sites as well as content duplicated across pages of our own site. Start the check with title tags and meta descriptions, and then move on to the content of the page itself. To check a page for duplicate content, select one or two unique sentences from it and search for them in quotes on Google, restricting the search to our own site with the following operator:
              site:www.xyz.com "quoted text"


Even though not every page will be cached, this query highlights duplicates within Google's index of our site and helps the duplicate search along. Example of a duplicate content search:
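
Title tags and meta descriptions can be checked in bulk as well. A minimal sketch, assuming a crawl of the site has been exported to a CSV with "Address" and "Title" columns (both names are assumptions); the same approach works for meta descriptions.

    import csv
    from collections import defaultdict

    # Minimal sketch: group pages by title tag and report any title used on
    # more than one page. Column names are assumptions.
    pages_by_title = defaultdict(list)
    with open("crawl_export.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            pages_by_title[row["Title"].strip().lower()].append(row["Address"])

    for title, pages in pages_by_title.items():
        if len(pages) > 1:
            print("Duplicate title:", title)
            for page in pages:
                print("   ", page)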


        

Day 6 has to be spent analyzing Thin Content:

Thin content does not add any value to a page, and spun content leads to the same thin content issues. Thin content brings lower user engagement, high bounce rates, a lack of sharing, and the loss of the other attributes that help content add value to a page. Panda targets pages with very little content, even single pages, and the result affects the visibility of the whole site.

                        
Sections of a site that typically contribute thin content are geo-based store pages, near-identical products offered in different colors or sizes, and job descriptions that change only slightly, perhaps by region or specialization.
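
A rough word count is often enough to shortlist thin pages for review. A minimal sketch, where the URL list, the 250-word threshold and the crude tag-stripping are all simplifying assumptions; a real audit would run over a full crawl with a proper HTML parser.

    import re
    import urllib.request

    # Minimal sketch: flag pages with a low visible word count.
    # URLs and threshold are assumptions.
    URLS = [
        "http://www.example.com/page-one/",
        "http://www.example.com/page-two/",
    ]
    THRESHOLD = 250

    for url in URLS:
        with urllib.request.urlopen(url) as response:
            html = response.read().decode("utf-8", errors="ignore")
        text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html,
                      flags=re.S | re.I)
        text = re.sub(r"<[^>]+>", " ", text)  # strip remaining tags
        word_count = len(text.split())
        if word_count < THRESHOLD:
            print("Thin page:", url, word_count, "words")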

Day 7 deals with Old or Tired content:

If there are pages whose content is out of date, plainly wrong, or a throwback to the early days of the internet without offering any real historical value, this is the right time to remove that content.

                
Where old content is still in demand but cannot be found easily, plan to rotate it onto pages that are more visible. That can be done through internal linking modules, easily accessible archive pages and social promotion. To check whether the old pages still draw traffic, rotate archive links through the home page content.
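
A minimal sketch of picking archive links to rotate, assuming a CSV that joins each page with its pageviews and publish date (the file name, column names and the pre-2012 cut-off are assumptions):

    import csv
    import random
    from datetime import datetime

    # Minimal sketch: pick a few older pages that still earn visits and
    # rotate them into a "from the archives" module. Columns are assumptions.
    with open("pages_with_dates.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    old_but_alive = [
        r for r in rows
        if datetime.strptime(r["Publish Date"], "%Y-%m-%d").year < 2012
        and int(r["Pageviews"]) > 0
    ]

    print("Archive links to rotate onto the home page this week:")
    for r in random.sample(old_but_alive, min(5, len(old_but_alive))):
        print("   ", r["Page"], "-", r["Pageviews"], "pageviews")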

Day 8 deals with analyzing black hat techniques - The Devil Made Us Do It:

As we know, spam techniques will get a site penalized by Panda. Now is the time to stop depending on these techniques, and it is also better to remove links from spam sites.


                     
Link networks are being taken down as well, with countless sites going offline every minute.
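
Removal requests come first; where they fail, the offending domains can be collected into Google's disavow file format as a fallback. A minimal sketch, assuming a backlink export with a "Linking Domain" column and a hand-built list of spam domains (the file names, column name and domain list are assumptions):

    import csv

    # Minimal sketch: match a backlink export against known spam domains and
    # write the matches in disavow-file format. Names and list are assumptions.
    SPAM_DOMAINS = {"spam-directory.example", "cheap-links.example"}

    flagged = set()
    with open("backlinks.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = row["Linking Domain"].lower()
            if domain in SPAM_DOMAINS:
                flagged.add(domain)

    with open("disavow.txt", "w", encoding="utf-8") as out:
        out.write("# Spam domains we could not get removed manually\n")
        for domain in sorted(flagged):
            out.write("domain:" + domain + "\n")

    print("Wrote", len(flagged), "domains to disavow.txt")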

Day 9 is the time to take a second review - Looking Good!

Having cleaned up the entire site, it is now time for a second review, and also to check Webmaster Tools to make sure there are no remaining errors, issues or warnings.


Then review Analytics for site visibility and key metrics, and record a baseline of where the site stands today.



It is also the best time to set short-term goals for our position in search results and to build a Panda-friendly content strategy.
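
To make that baseline concrete, it helps to record today's key numbers in a dated file so the next Panda refresh can be compared against them. A minimal sketch, where the metric names and the zero placeholder values are assumptions to be filled in from Analytics and Webmaster Tools:

    import csv
    import os
    from datetime import date

    # Minimal sketch: append today's baseline metrics to a running CSV.
    # The metric names and placeholder values are assumptions.
    baseline = {
        "date": date.today().isoformat(),
        "organic_visits_last_30_days": 0,  # fill in from Analytics
        "indexed_pages": 0,                # fill in from site: / Webmaster Tools
        "average_bounce_rate": 0.0,        # fill in from Analytics
    }

    write_header = not os.path.exists("panda_baseline.csv")
    with open("panda_baseline.csv", "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(baseline))
        if write_header:
            writer.writeheader()
        writer.writerow(baseline)

    print("Baseline recorded for", baseline["date"])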

Day 10 is the time to relax, as the site is Panda-proof:

Now that we have finished Panda-proofing the site and setting short-term goals, we can step away from the site for a few days.


                   

Ongoing: Don’t fall asleep at the Wheel:

For now we are covered against past and future Panda updates, but there will always be other algorithm updates, so we need to stay alert at all times.

                 
