Friday, July 23, 2010

Google Video Best Practices: How to Deal with the Most Common Problems Created When Crawling and Indexing Video Content

We'd like to highlight three best practices that deal with some of the most common problems created when crawling and indexing video content. These three practices include ensuring your video URLs are crawlable, stating which countries your videos may be played in, and clearly signaling to search engines when your videos have been removed.

1: Verify your video URLs are crawlable: check your robots.txt

Publishers sometimes include video URLs in their Sitemap that are excluded by robots.txt. Take care that your robots.txt file isn't blocking any of the URLs in your Sitemap, including URLs for the:
  • Play page
  • Content and player
  • Thumbnail
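A quick way to verify this is to check each Sitemap URL against your robots.txt rules before submitting. The sketch below uses Python's standard-library robots.txt parser; the rules and URLs are hypothetical examples, not from any real site.

```python
from urllib import robotparser

# Hypothetical robots.txt rules -- substitute your site's real file
ROBOTS_TXT = """\
User-agent: *
Disallow: /player/
Disallow: /thumbs/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The kinds of URLs a Video Sitemap lists: play page, content/player, thumbnail
sitemap_urls = [
    "http://www.example.com/videos/watch.html",    # play page
    "http://www.example.com/player/video123.flv",  # content and player
    "http://www.example.com/thumbs/video123.jpg",  # thumbnail
]

for url in sitemap_urls:
    ok = parser.can_fetch("Googlebot", url)
    print(url, "->", "crawlable" if ok else "BLOCKED by robots.txt")
```

With these example rules, the player and thumbnail URLs would be reported as blocked, which is exactly the mismatch to fix before submitting the Sitemap.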
2: Tell us what countries the video may be played in
  • Is your video only available in some locales? The optional "restriction" attribute has recently been added.
  • You can tell search engines where your videos may and may not be played. With this tag, you can either list all the countries where a video can be played, or list only the countries where it can't be played. If your videos can be played everywhere, you don't need to include it.
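As a sketch, a country restriction inside a Video Sitemap entry looks like the fragment below. The `<video:restriction>` element and ISO country codes follow the Video Sitemap format; the video URL and the particular country list are made-up examples.

```xml
<video:video>
  <video:content_loc>http://www.example.com/player/video123.flv</video:content_loc>
  <!-- Allow playback only in these countries; use relationship="deny"
       instead to list the countries where playback is NOT allowed -->
  <video:restriction relationship="allow">US CA GB</video:restriction>
</video:video>
```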
3: Indicate clearly when videos are removed -- protect the user experience

Sometimes publishers take videos down but don't signal to search engines that they've done so. This can result in the search engine's index not accurately reflecting the content of the web. When users then click on a search result, they're taken either to a page stating that the video doesn't exist, or to a different video. Viewers find this experience frustrating. Although we have mechanisms to detect when search results no longer exist, we strongly recommend following community standards.

To signal that a video has been removed:
  1. Return a 404 (Not found) HTTP response code. You can still return a helpful page to display to your users; follow these practices to create effective 404 pages.
  2. Indicate an expiration date for every video listed in a Video Sitemap (using the expiration element) or mRSS feed (using the corresponding expiration tag) submitted to Google.
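A sketch of step 2 in a Video Sitemap entry is shown below. The `<video:expiration_date>` element is the Video Sitemap format's expiration element; the URL and date are made-up examples.

```xml
<video:video>
  <video:content_loc>http://www.example.com/player/video123.flv</video:content_loc>
  <!-- After this date the video should be treated as removed -->
  <video:expiration_date>2010-12-31T23:59:59+00:00</video:expiration_date>
</video:video>
```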

Monday, July 12, 2010

Site Speed in Web Search Ranking, Free Tools to Evaluate the Speed of a Website, How To Increase Your Site Speed

Site speed is how quickly a site responds when a visitor requests a URL in the browser. Site speed has become an important concern for Google, and to reflect that importance Google has added a new signal to its search ranking algorithms.

Site speed matters for a website's credibility: visitors and site owners alike are happier when a site is fast, and internet users prefer to spend time on fast sites rather than slow ones. Google's internal studies clearly show that visitors lose interest and spend less time on a website that responds slowly. Improving site speed not only increases visitors' interest in a site but also reduces operating costs. Because of this importance, Google now takes site speed into account as a factor in web search ranking, using a variety of sources to determine the speed of a website relative to other sites.

Free Tools to Evaluate the Speed of a Website:

Page Speed: An open-source Firefox/Firebug add-on that webmasters and developers can use to evaluate the performance of a webpage and get suggestions on how to improve it. It performs several tests on a site's server configuration and front-end code. Running Page Speed on a webpage produces a score for each page along with useful suggestions on how to improve its performance.

YSlow: A free tool from Yahoo! for improving webpage speed. YSlow analyzes a webpage against a set of rules and suggests various ways to improve its performance. YSlow runs as a Firebug add-on, and the YSlow user guide is included in the download.

Webpagetest: Lets you test the URL of a webpage. The test can be run from a specified location, and it provides a waterfall view of the page's load performance as well as a comparison against an optimization checklist.

Google Webmaster Tools: Under Labs, Site Performance shows the page speed of a website as experienced by users around the world. Many other tools are also available to evaluate the speed of a website.
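Alongside these tools, you can get a rough first impression of server response time with a few lines of code. This is a minimal sketch using Python's standard library; it measures only the raw fetch time of a single resource, and the function name is my own.

```python
import time
import urllib.request

def measure_response_time(url, attempts=3):
    """Average time in seconds to fetch a URL over a few attempts."""
    timings = []
    for _ in range(attempts):
        start = time.time()
        with urllib.request.urlopen(url) as response:
            response.read()  # include download time, not just headers
        timings.append(time.time() - start)
    return sum(timings) / len(timings)
```

Unlike Page Speed or Webpagetest, this ignores rendering, subresources, and real user locations, so treat it only as a quick sanity check.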

Site Speed in Web Search Ranking
Because page speed is a newly added signal, it carries far less weight than the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed factor in Google's algorithm, and it applies only to searches in English. If a website doesn't see much change in its rankings, it means the site speed signal hasn't impacted that website.

It's a good idea to start looking at your site's speed now to improve its performance and ranking in the future.

Tuesday, July 6, 2010

Quality Links, How to Get Quality Links, Factors, Quality Backlinks, Link Building Strategies, Approaches to Get Quality Backlinks

Before going into how to get quality links to a website, let's start with why a website needs quality links. Suppose we have a newly built website with a good design and the best services, but no visitors; that defeats the purpose of the website. Link building is the process that solves this problem, and earning quality links is the best practice for organic link building.

How to get Quality links to our website:

Relevancy: Unique content is always the best way to impress search engines, because search engines can only read text, not pictures. Quality websites look for something in return when they give a link, and that doesn't mean money: they expect informational content or useful services that have some relevance to their own site. So submitting to relevant sites is probably a good way of getting quality links to our site.

Register with DMOZ or Yahoo!: Registering with DMOZ (a free web directory) or Yahoo! (paid) is not as easy as it sounds, because they accept only high-quality websites. But once a site enters these directories, it gains a relevant link and easier access to other linkable websites, which are likely to be of the highest quality.

Blogs & Forums: If a site is newly built and unknown, a good way to market it is to join community discussions such as forums and blogs. Always be careful not to spam or solicit links to your own site. Building a reputation can easily drive visitors to your site; they will pass on links and are likely to visit frequently. This increases visitors and makes the marketing process easier. Remember: good, unique content always attracts visitors and makes them recommend your site.

Target Group: Targeting the right visitors for your site is an effective approach. Think about the issues your users might face, work out solutions in advance, and publish them; visitors will greatly appreciate it. This type of approach can bring merit-based links and loyal followers who help generate direct traffic.

Social Media: Entertaining content is what visitors most like to share, especially on social media services. On social media, a humorous approach is often the easiest and best way to market a website. This is one of the more powerful marketing stunts, but it's not recommended to rely on it for long.

Avoid Exchanging Links: If the visibility of your website matters, it is very important to avoid exchanging links with low-quality websites or buying PageRank-passing links. Legitimate link building strategies don't deliver short-term growth.

Competitors' sites: It's always good practice to follow similar sites in other markets and update your own strategies accordingly.

Saturday, July 3, 2010

404 Error Page, soft 404 Error, Practices, Page Not Found 404, Error 404 Page Not Found

404 Error

A 404 error page appears when a visitor requests a page that doesn't exist on the site. This may be because the visitor clicked a broken link, the page has been moved or deleted, or the visitor mistyped a URL. It is called a 404 error because the web server sends the HTTP 404 status code when a visitor requests a web page that does not exist. The default 404 page varies by host; it usually provides no useful information to the visitor, and many users will simply leave the site. We can solve this by providing a good custom 404 error page that gives the user useful information. A custom 404 error page can be created as a standard HTML page.

The following steps can guide visitors toward what they are searching for when they come across a 404 error page:

  • Tell visitors clearly that the page they are looking for cannot be found.
  • Give the custom 404 error page the same look and feel as the rest of the site.
  • Make sure the server returns an actual HTTP 404 status code along with the custom 404 page; otherwise Google may crawl the custom 404 page and show it in the SERPs.
  • If your site has moved, tell Google using the Change of Address tool.
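As a minimal sketch of the key point above, the handler below serves a friendly page to the visitor while still returning a real 404 status code. It uses Python's standard-library HTTP server; the page set and the HTML are made-up examples.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical set of pages that actually exist on the site
PAGES = {"/": "<html><body><h1>Home</h1></body></html>"}

CUSTOM_404 = """<html><body>
<h1>Page not found</h1>
<p>The page you are looking for cannot be found.
Try the <a href="/">home page</a>.</p>
</body></html>"""

class SiteHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in PAGES:
            status, body = 200, PAGES[self.path]
        else:
            # Friendly page for the visitor, real 404 status for crawlers
            status, body = 404, CUSTOM_404
        self.send_response(status)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

    def log_message(self, *args):
        pass  # keep the example quiet
```

The design point is that the helpful content and the 404 status code are independent: you never have to choose between a good user experience and a correct response code.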

Soft 404

There are two types of 404 on the web: the "hard 404" and the "soft 404" (or "crypto 404"). A hard 404 is the page the web server returns with a real 404 status code when a page does not exist. A soft 404 occurs when a request for a non-existing page returns a 200 response code instead, for example by redirecting to the site's home page or serving an error page with a 200 status.

This confusion can cause search engines to crawl and index non-existent, duplicate URLs on the site, and may make crawlers and bots visit the real pages of the site less frequently. To prevent it, serve a real 404 error page and clearly state that the requested URL or information does not exist.

The following steps help correct soft 404s:

  • Check the soft 404s listed for the website in Webmaster Tools.
  • Check that the 404 error page, 301 redirects, and 200 response pages are each responding correctly.
  • Verify that HTTP responses are configured properly using the Fetch as Googlebot feature in Webmaster Tools.
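A simple way to test your own site for soft 404s is to request a URL that should not exist and inspect the status code that comes back. A sketch, assuming Python's standard library; the probe path is a made-up example:

```python
import urllib.error
import urllib.request

def probe_soft_404(base_url):
    """Request a page that should not exist; a 200 reply suggests a soft 404."""
    probe = base_url.rstrip("/") + "/this-page-should-not-exist-12345"
    try:
        with urllib.request.urlopen(probe) as response:
            # urlopen follows redirects, so a redirect-to-home-page that
            # ends in 200 is also caught here -- a classic soft 404
            return "soft 404 suspected (HTTP %d)" % response.status
    except urllib.error.HTTPError as error:
        if error.code == 404:
            return "OK: hard 404 returned for missing pages"
        return "missing pages return HTTP %d" % error.code
```

If the probe reports a suspected soft 404, cross-check it against the soft 404 listings in Webmaster Tools before changing the server configuration.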

Conclusion: Using Webmaster Tools, regularly check for soft 404s and correct them, and make sure the web server returns a real 404 error page. Check the navigation and responses of all pages, including the 404, 301, and 200 responses, so that neither visitors nor search engine crawlers and bots are confused. By following the steps above, we can provide clear, accurate information to both visitors and search engines.

Google Sitemap New Features - One File, Many Content Types.

Sitemaps: one file, many content types:

Google has introduced a welcome new feature: we can now submit various content types (images, video, mobile URLs, code, geo, etc.) for our site in a single Sitemap file, because Google now supports Sitemaps containing multiple content types.

Google Webmaster Trends Analyst Jonathan Simon explained that site owners are leveraging Sitemaps because they want Google to know about their sites. Sitemaps were first introduced in 2005, and since then additional specialized Sitemap formats have been introduced to better accommodate video, image, mobile, and geographic content. As the number of specialized formats grew, it became much simpler to have a Sitemap that supports the inclusion of multiple content types in a single file.

A Sitemap with multiple content types has the same structure as a standard Sitemap, with the additional ability to reference URLs of different content types. The following example shows a Sitemap that refers to a standard webpage for web search, an image reference for image search, and a video reference for video search:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/tofu-recipes.html</loc>
    <image:image>
      <image:loc>http://www.example.com/images/tofu.jpg</image:loc>
    </image:image>
    <video:video>
      <video:content_loc>http://www.example.com/videos/grilling_tofu.flv</video:content_loc>
      <video:title>Grilling tofu for summer</video:title>
    </video:video>
  </url>
</urlset>

In Webmaster Tools you can see how the submitted URLs are broken down by content type.

Let's hope that the inclusion of multiple content types in a single Sitemap simplifies Sitemap submission. The other Sitemap rules, such as the maximum of 50,000 URLs per file and the 10 MB uncompressed file size limit, still apply.