
Tuesday, January 29, 2013

8 Reasons Why Your Site Might Not Get Indexed

When a new website is launched, the first thing to check is whether the search engines have indexed it. If they haven't, your site will not appear in the search results at all. So, let's check how.

Identifying Crawling Problems 

The very first step is to type site:example.com into the search box and see how many of your pages have been indexed. Compare this with the actual number of pages on your site; if Google is showing far fewer pages than you really have, your site has an indexing problem.
The second step is to look at the Google Webmaster Tools report. Any crawl errors on the site are displayed in the dashboard. A 404 error means a linked page cannot be found; 301 entries are permanent redirects, and if these are misconfigured your visitors may never reach the intended pages.

Fixing Crawling Errors

These issues are commonly caused by the following:

Robots.Txt

robots.txt is a text file that tells search engine spiders/bots whether they may crawl and index a website. It lives in the root of your website. The directives in robots.txt can allow specific bots to access the site, or hide particular directories and files from them.

Basic structure of a robots.txt file (this example lets every bot crawl the whole site):

User-agent: *
Disallow:
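By contrast, a robots.txt that blocks crawlers can be the very reason a site never gets indexed. A sketch (the directory and bot names are illustrative):

```text
# Allow all bots to crawl everything except one directory
User-agent: *
Disallow: /private/

# Block one specific bot from the entire site
User-agent: BadBot
Disallow: /
```

A common indexing mistake is an accidental Disallow: / under User-agent: *, which blocks the entire site from all crawlers.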

.htaccess

It is a configuration file used on Apache web servers to control server features on a per-directory basis. The .htaccess file makes it easy to control access to your site and redirect pages. Because it has such broad control over the site, keep a saved backup copy of the .htaccess file; that way you can recover quickly if a bad edit takes the site down.
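As a sketch, this is how a 301 (permanent) redirect looks in an Apache .htaccess file, assuming Apache with mod_alias and mod_rewrite enabled (the paths are illustrative):

```apache
# Permanently redirect a single moved page to its new location
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Or redirect an entire renamed directory using mod_rewrite
RewriteEngine On
RewriteRule ^old-dir/(.*)$ /new-dir/$1 [R=301,L]
```

A 301 tells crawlers the move is permanent, so the new URL inherits the old one's ranking signals; a typo in these rules can instead send every visitor to an error page.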

Meta Tags:

Meta tags are a special set of HTML tags that contain information about a web page and its contents. Search engine robots are programmed to read the meta tags of a web page; the main ones the spiders read are the description and keywords meta tags. If a page is not getting indexed, check the meta tags in its source code.
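For instance, the meta tags a crawler reads sit in the page's head section; a stray robots meta tag set to noindex is a frequent cause of a page not being indexed (the values below are illustrative):

```html
<head>
  <meta name="description" content="A short summary of this page's content.">
  <meta name="keywords" content="seo, indexing, crawling">
  <!-- If this line is present, search engines will NOT index the page -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

Check for the noindex value first: it is easy to leave behind from a staging site.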

Site Maps:

A sitemap is an XML file that lists the links of a website so that a visitor (or crawler) can reach any part of the site. Place a link to the sitemap on all important pages of the website, including the homepage, so that search engines can detect it easily and more PageRank can flow to it.
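A minimal XML sitemap, following the standard sitemaps.org format, looks like this (the URLs are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-29</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

Submitting this file through Webmaster Tools tells the crawler about every page you want indexed, even ones your internal linking misses.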

Poor Internal Linking Structure:

This is a rare case, but be aware that a poor internal linking structure can cause indexing problems. A good internal linking structure helps crawlers discover every page and contributes to the success of the website.

Page Rank

The number of pages Google crawls on your site is related to your PageRank.

Connectivity or DNS Issues:

Whenever a site is under maintenance or moved to a new host, DNS or connectivity problems can block crawler access until the change propagates.

Keyword Rank is Still Important


When you perform a search on a search engine, keywords play the key role; by one estimate, 25% of the search algorithm depends on keywords alone. So a smart business owner must be able to choose the keywords that are most valuable to his or her business.

Separating keywords by their rankings is important for a business. Every page should contain the relevant terms that will bring in search engine traffic; it is on those keywords that your page will be displayed in the search results.

The Google AdWords Keyword Tool is one of the best tools for finding keywords to target. Using it, you can easily determine which keywords you should go after.

Important factors to keep in mind while you are doing keyword research:

Relevancy: The keyword must be relevant to your service, so that the right people click through to your website.

Traffic: Choose keywords that many people actually type into the search engines; these generate good traffic.

Competition: Study the keywords used by other sites that already rank well in the search engine results pages.

Commerciality: Choose keywords with commercial value; tools such as Market Samurai explain this factor well when assessing keyword rank.

High-commerciality keywords generate larger profits than low-commerciality keywords.

It is better to write a website's content after doing the keyword research; otherwise the keyword rank may or may not improve. Check your keyword rankings using your Google Analytics account.

Anchor Text Optimization

Anchor text is the visible, clickable text of a hyperlink that leads from one web page to another. It is usually displayed in blue with an underline, and it can be a group of characters, words, or even an image.
For example, if the word Google in a sentence is anchor text, clicking it takes the user to https://www.google.co.in/.

In HTML, hyperlinks are created using the href attribute. Href stands for hypertext reference and holds the link's destination. The syntax for anchor text is:

<a href="https://www.example.com/">anchor text</a>

Anchor text optimization: This is one of the major SEO techniques; it can lift a page to a better position by using keyword-rich text in a specific hyperlink. The content of the landing page should be closely relevant to the keyword or phrase in the anchor text, so that search engines give the link more weight and the page's ranking improves.

Anchor text optimization can be done in two ways: internally and externally. Internal optimization means optimizing the anchor text of links that point to a particular page from other pages within the same website. External optimization means optimizing the anchor text of inbound links from other sites that link back to the website.

The hyperlink helps users reach relevant content on another web page. Anchor text is one of the major SEO signals search engines use to rank a page. Sometimes, however, anchor text is misused: instead of leading to the right page, it directs users to an irrelevant one. This harms the user, can lower the page's ranking, and in turn reduces traffic.

To improve page ranking, create deep links to inner pages as well as to the home page. Use relevant keywords and phrases in the anchor text so that a reader understands where the link leads. Prefer a short description in the anchor text to a lengthy one, and vary the wording with synonyms instead of repeating one keyword. Quality backlinks improve a page's position in the search engines.
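To illustrate the difference, compare a generic anchor with a keyword-rich one (the URL and wording are illustrative):

```html
<!-- Generic anchor text: tells search engines nothing about the target -->
<a href="http://www.example.com/seo-guide.html">click here</a>

<!-- Descriptive, keyword-rich anchor text -->
<a href="http://www.example.com/seo-guide.html">beginner's SEO guide</a>
```

Both links lead to the same page, but only the second one passes a relevance signal about what that page contains.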

Friday, July 23, 2010

Google Videos: Best Practices for Dealing with the Most Common Problems Created When Crawling and Indexing Video Content

We'd like to highlight three best practices that deal with some of the most common problems created when crawling and indexing video content: ensuring your video URLs are crawlable, stating which countries your videos may be played in, and clearly signaling to search engines when videos have been removed.

1: Verify your video URLs are crawlable: check your robots.txt

Publishers sometimes include video URLs in their Sitemap that robots.txt excludes. Take care that your robots.txt file isn't blocking any of the URLs in your Sitemap, including the URLs for the:
  • Play page
  • Content and player
  • Thumbnail
2: Tell us what countries the video may be played in
  • Is your video only offered in some locales? An optional restriction tag has recently been added to the video sitemap format.
  • With this tag you can tell search engines where your videos may and may not be played: either list all the countries where a video can be played, or list only the countries where it can't. If your video can be played everywhere, you don't need to include the tag.
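Based on Google's video sitemap documentation, a restriction entry inside a video sitemap might look like this (the element name is from that documentation; the country codes and video details are illustrative):

```xml
<video:video>
  <video:content_loc>http://www.example.com/videoABC.flv</video:content_loc>
  <video:title>Grilling tofu for summer</video:title>
  <!-- relationship="allow" lists the only countries where playback is permitted;
       relationship="deny" would instead list the countries where it is blocked -->
  <video:restriction relationship="allow">GB US CA</video:restriction>
</video:video>
```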
3: Indicate clearly when videos are removed -- protect the user experience

Sometimes publishers take videos down but don't signal to search engines that they've done so. This can result in the search engine's index not accurately reflecting the content of the web. When users then click on a search result, they're taken to a page stating that the video doesn't exist, or to a different video altogether. Viewers find this experience frustrating. Although we have mechanisms to detect when search results no longer exist, we strongly recommend following these community practices.

To signal that a video has been removed,
  1. Return a 404 (Not Found) HTTP response code; you can still return a helpful page to display to your users. Follow these practices to create effective 404 pages.
  2. Indicate an expiration date for every video listed in a Video Sitemap (using the expiration element) or in an mRSS feed submitted to Google.
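As a sketch, the sitemap entry for a removed video could carry an expiration date like this (the video:expiration_date element is per Google's video sitemap documentation; the date and video details are illustrative):

```xml
<video:video>
  <video:content_loc>http://www.example.com/videoABC.flv</video:content_loc>
  <video:title>Grilling tofu for summer</video:title>
  <!-- after this date the video should drop out of the index -->
  <video:expiration_date>2010-08-01T00:00:00+00:00</video:expiration_date>
</video:video>
```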

Saturday, July 3, 2010

404 Error Page, soft 404 Error, Practices, Page Not Found 404, Error 404 Page Not Found

404 Error

A 404 error page appears when a visitor requests a page that does not exist on the site, perhaps because the visitor clicked a broken link, the page was moved or deleted, or the URL was mistyped. It is called a 404 error because the web server sends the HTTP 404 status code when the requested page does not exist. The default 404 page varies by ISP; it usually provides no useful information to the visitor, and many users simply leave the site. We can solve this by building a good custom 404 error page, created as a standard HTML page, that gives the user useful information.

The following steps can guide visitors toward what they are looking for when they hit a 404 error page:

  • Tell visitors clearly that the page they are looking for cannot be found.
  • Give the custom 404 error page the same look and feel as the rest of the site.
  • Make sure the server still returns an actual HTTP 404 status code along with the custom page; otherwise Google may crawl the custom 404 page and it could appear in SERPs.
  • If your site has moved, tell Google using the Change of Address tool.
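On an Apache server, for example, .htaccess can point the server at a custom 404 page while still returning the real 404 status code (assuming Apache; the path is illustrative):

```apache
# Serve a custom error page; Apache still sends the HTTP 404 status
ErrorDocument 404 /custom-404.html
```

Use a local path here rather than a full URL: per Apache's documentation, giving ErrorDocument a full http:// URL makes the server issue a redirect instead, so the client never sees the 404 status (exactly the soft-404 problem described below).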

Soft 404

There are two types of 404 on the web: the "hard 404" and the "soft 404" (also called a "crypto 404"). A hard 404 is the page served by the ISP or web server when a page genuinely does not exist. A soft 404 occurs when a request for a non-existent page is answered with a 200 response code, for example when the server silently shows the home page or an error page while still returning 200 (OK).

This confusion can make search engines crawl and index duplicate URLs for pages that do not exist, and it may cause crawlers and bots to visit the real website less frequently. To prevent it, use a proper 404 error page and clearly state that the requested URL or information does not exist.
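The distinction can be sketched in a few lines of Python. This is a simplified model of how a crawler might judge a response, not a real crawler:

```python
def classify_response(status_code: int, page_exists: bool) -> str:
    """Classify how a server answered a request, from a crawler's viewpoint.

    A "soft 404" is a missing page answered with 200 (OK): the crawler
    is told the page exists, so a non-existent URL can end up indexed.
    """
    if page_exists:
        return "ok" if status_code == 200 else "check-server-config"
    if status_code == 404:
        return "hard-404"   # correct: the server admits the page is gone
    if status_code == 200:
        return "soft-404"   # wrong: missing page disguised as a real one
    if status_code in (301, 302):
        return "redirect"   # fine only if the target is genuinely relevant
    return "other"

# A request for a deleted page answered with 200 is a soft 404:
print(classify_response(200, page_exists=False))  # soft-404
```

The fix is always the same: make the server return 404 for pages that are really gone.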

The following steps help correct soft 404s:

  • Check the soft 404s listed for the website in Webmaster Tools.
  • Check that the 404 error page, 301 redirects, and 200 response pages are each responding correctly.
  • Verify that the HTTP responses are configured properly using the Fetch as Googlebot feature of Webmaster Tools.

Conclusion: Using Webmaster Tools, regularly check for soft 404s and correct them, and insist on a true 404 error page from the ISP or web server. Check the navigation of all pages, including the 404, 301, and 200 responses of the website, so that neither visitors nor search engine crawlers and bots are confused. With these steps we can present clear, accurate information to both visitors and search engines.

Google Sitemap New Features - One File, Many Content Types.

Sitemaps: one file, many content types:

Google has introduced a welcome feature: we can now submit various content types, such as images, video, mobile URLs, code, and geo data, for our site in a single sitemap file, because Google now supports sitemaps containing different content types.

Google webmaster trends analyst Jonathan Simon explained that site owners have embraced sitemaps as a way to tell Google about their sites. Sitemaps were first introduced in 2005; since then, additional specialized sitemap formats were introduced to better accommodate video, image, mobile, and geographic content. With the number of specialized formats growing, it is much easier to have one sitemap format that supports multiple content types in a single file.

A sitemap with multiple content types has the same structure as a standard sitemap, with the additional ability to reference URLs of different content types. The following example shows a sitemap that refers to a standard web page for web search, an image reference for image search, and a video reference for video search.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.sitemaps.org/schemas/sitemap-image/1.1"
        xmlns:video="http://www.sitemaps.org/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/foo.html</loc>
    <image:image>
      <image:loc>http://example.com/image.jpg</image:loc>
    </image:image>
    <video:video>
      <video:content_loc>http://www.example.com/videoABC.flv</video:content_loc>
      <video:title>Grilling tofu for summer</video:title>
    </video:video>
  </url>
</urlset>

In Webmaster Tools we can see how the multiple content types were submitted:

Let's hope that the inclusion of multiple content types in a single sitemap simplifies sitemap submission. The other sitemap rules, such as the maximum of 50,000 URLs per file and the 10 MB uncompressed file-size limit, still apply.

Tuesday, June 29, 2010

Google’s Speedy Enhancements Save You Time and Keystrokes


Speed is a recurring theme at Google. The goal of speed is baked into every Google product, from Google Chrome to Google web search. Google also frequently builds features that help answer queries faster, whether by surfacing the precise content you are looking for at the top of the search results page or by streamlining the way you search. Several of these speed enhancements save you time and keystrokes.

Two New Google Features This Week

Sunrise and Sunset Search feature

The sunrise and sunset search feature is handy whenever you need exact times, whether you are picking the best hour for a morning jog or planning the perfect moment for a wedding proposal. This week Google launched a sunrise and sunset feature for search that presents the precise times of sunset and sunrise for any location in the world. Unlike the weather, sunrises and sunsets are fairly predictable, so Google doesn't use an external data source: it calculates the times from latitude, longitude, and the current time. This calculation has interested mathematicians and astronomers for millennia, so they've had time to get it just right, and for most locations it is accurate to within a single minute.

Google Search by Voice & more languages

Google Search by Voice lets people search the web faster than ever, getting answers with fewer keystrokes. The service was initially launched in English and offered in the U.S., U.K., India, Australia, and New Zealand. Google later introduced it in Japanese and Mandarin, expanding the number of potential users. Just a week ago, Google launched the service in German, French, Italian, and Spanish, starting with the four countries most closely linked with those languages, since local language is a major factor in speech-recognition performance. This week Google focused on Korean, and on launching Traditional Mandarin in Taiwan.

How to start Google Search by Voice Feature

To get the feature, visit the Google mobile page for your country's domain (e.g., in France go to m.google.fr) and download the application for your phone's operating system in your locale. Google Search by Voice is available for Android, iPhone, and BlackBerry phones. Eventually Google's goal is to bring voice search to speakers of all languages, so keep reading for more announcements.

SEO’s Magic Bullet, What is SEO’s Magic Bullet, Top Five Ranking Factors in Search Engines

There is a big disagreement among SEOs whenever the question of a magic bullet comes up: some believe the magic-bullet theory, and some don't. It is hard to change anyone's beliefs in the world of SEO, so let's set aside whether the magic bullet exists and look instead at what the theory says and whether its points benefit our SEO techniques.

First, what is SEO's magic bullet? It is a magical solution to a perplexing problem with no side effects; in simple words, getting results with a minimum of effort. In truth, though, there are no big secrets behind SEO.

Blogging is one such magic bullet. It is nothing new, but there are many benefits behind blogging that let it qualify as a magic bullet. Let me explain.

A few months ago, SEOmoz.org released their Search Engine Ranking Factors 2009, which look something like this:

Top five ranking factors

  • Keyword Focused Anchor Text from External Links – 73% (Very High Importance)
  • External Link Popularity - 71% (Very High Importance)
  • Diversity of Link Sources – 67% (Very High Importance)
  • Keyword Use Anywhere in the Title Tag – 66% (Very High Importance)
  • Link Distance from Trusted Domains – 66% (Very High Importance)

The statistics above show the importance of keyword ranking. The other keyword-specific ranking factors are on-page ones. Consider these:

In the Title Tag: The importance of a keyword in the title tag is 66%, considered very high. Blogging provides a way to generate new web pages continuously, all with keyword-rich title tags. In blogging the title tag is generally taken from the headline, so following good SEO practice means you are already integrating keywords anyway.

First Word of the Title Tag: Using the keyword as the first word of the title tag has a very high importance of 63%. The first word attracts the reader and creates interest, so positioning the keyword at the beginning always helps. Choose the keyword with the reader in mind first and optimization second.

In the Root Domain Name: Using the keyword in the root domain name has an importance of 60%, considered very high. If you want blogging to boost your company name's rankings, attach the blog to a subdirectory or subdomain of your domain (e.g., bipsum.com/blog or blog.bipsum.com).

Page-Specific Link Popularity Factors for Ranking:

Keyword-Focused Anchor Text: Very high importance, 73%. When others link to us, they will often reuse our titles or headlines as their anchor text, which is another reason to place keywords in headlines. Many visitors are also drawn by the keywords used in anchor text.

External Link Popularity: Very high importance, 71%. Sharing a blog through good social media channels like Twitter, Facebook, Digg, etc. creates a chance to gain links of both higher quality and quantity.

Link reworking is also considered a magic bullet. It is the process of modifying the links on sites that are already linking to our site.

As the examples above show, the magic bullet's aim is to rank at the top of Google's search results quickly. Beyond these points, the important things to remember are that content is king and that backlinks are a major factor in SEO's magic bullet.

SEO Link Building Guide Link Building Guidelines Link Building Info Link Building Strategies

Link building is a good way to succeed on the web and to rank high in the search engines. You have to be visible to customers to let them know what you are offering, so you need to build links to your site across the web. Link building has different strategies for increasing the number of links to a site, and it is very beneficial for search engine optimization.

Search engines give high priority to a site's backlinks, which is an advantage for rankings on SERPs. Building up backlinks is not easy, but the main target is to get good inbound links to the site. Links are an important factor that helps search engines determine a site's popularity on the web: the more links from other sites point to yours, the higher your site will be ranked by all major search engines. There are some important things to consider when exchanging links with another site; ensure that the site is complete and has good content, which helps a website rank high.

A simple way to do link building is to search for the keywords on which you want to promote your site and see the results that come up. Browse the sites in the SERP and try to exchange links with them. Another important point is to exchange links with websites that have a higher page rank.
Link Building can be obtained by the following ways

  • One way Link Building.
  • Two way Link Building (Reciprocal Link Building)
  • Three Way Link Building.

Try to get backlinks from natural, high-PageRank websites; these bring the greatest benefit for high rankings in the search engines and improve the chance of fast indexing. Get backlinks from websites that frequently refresh the content of their pages; gathering backlinks from news sites and other frequently updated sites helps our website earn a higher page rank.

Link Building and Link Exchange can be done for Free Directories, Paid Directories and Websites.

Summary: Links are a very important factor that helps search engines determine the popularity of any site on the web. To increase page rank and make the website popular, we can apply the link building strategies above.

Impact of Social Bookmarking on Search Engine Optimization

Social Bookmarking:

Social bookmarking is a process that lets internet users share, organize, search, and manage bookmarks. It is a method of tagging a website and saving it for later: instead of saving bookmarks in your web browser, you save them on the web. Because the bookmarks are online, we can easily share them with our friends. To create our own social bookmarks we have to register with a social bookmarking site. It can be an easy route to market our website.

Process Of Social Bookmarking:

Social bookmarking is becoming a reliable way to increase the number of daily visitors to a website. Most bookmarking sites let us browse items by most popular or recently added, or by category, such as technology, shopping, politics, blogging, news, or sports.

Steps to be taken in the process of social bookmarking:

  • Choose the bookmarking site where you are going to tag your site.
  • Register with the chosen bookmarking site.
  • Install the browser extensions or bookmarking toolbars the site offers, such as an IE toolbar or Firefox extension, to make social bookmarking easier.
  • Add bookmarks using the tools you installed.
  • Invite your friends, since social bookmarking is meant to be social.
  • Use social bookmarking to widen your view beyond your friends' bookmarks to everyone who tags with the same keyword.

In short, social bookmarking is bookmarking that is posted online, catalogued with comments and tags, and shared with your friends and others.

Advantages And Disadvantages Of Social Bookmarking:

Advantages:

  • The main advantage of social bookmarking is search engine optimization, because these sites are highly optimized for search engines.
  • By using bookmarking services you can see immediate traffic to your site.
  • The core idea of these services is that you can keep your favorite websites in a single place and browse them whenever and wherever you like over the internet.
  • Resources are ranked by the number of times users have bookmarked them, rather than by the number of external links.
  • It is considered one of the best places to promote your site.

Disadvantages:

  • Informal vocabularies are one drawback of social bookmarking from a search-data point of view:
  • there is no standard set of keywords,
  • mistagging occurs due to spelling errors,
  • and tags can have more than one meaning.
  • It is not designed with a proper hierarchy of relationships between tags.
  • Spam and corruption are another disadvantage: the more often a web page is submitted, the better its chance of being found.

Spamming occurs when the same web page is bookmarked multiple times, or when every page of a website is tagged with many popular tags, forcing developers to constantly adjust their security to curb the abuse.

Impact Of Social Bookmarking On Search Engine Optimization:

Social bookmarking is becoming one of the best ways for website owners to draw attention to their sites. It exposes a site to a wide group of people who run their own blogs, which are indexed in search engines just as regular websites are. The following points explain the impact of social bookmarking on search engine optimization:

  • Sites are indexed faster because they are bookmarked quickly by humans.
  • It brings additional traffic to the site, so the site's page rank will increase.
  • Links are picked up by other websites through the social bookmarking site.
  • It signals the quality of a particular site and differentiates it from other websites.
  • The quality of a site can always be measured: the more people who bookmark it, the clearer its quality and relevance become.
  • Bookmarks are categorized with tags or keywords, which helps later when looking for a specific bookmark or similar ones.

Friday, May 7, 2010

Google Search Results New Look, Change in the Google Search Results

Google has changed the look and feel of its search page with more sophisticated features. For the past few months Google had been experimenting with a "Show search tools" option in the left navigation of the results page. With positive feedback and after several experiments, Google has now increased the power of search by adding permanent left-hand navigation to the search results page, showcasing the latest search tools and terminology.

The new tools increase the relevance of a search query with extra search options. In the past Google introduced Universal Search, the Search Options panel, and Google Squared. Universal Search improves a query's relevance; it adds an "Everything" option in the left pane that lets you switch to the specific type of result you are looking for.

The Search Options panel increased the richness of search results. The new left-hand navigation offers search tools such as images, discussions, books, and updates, so you can apply the tools most relevant to your query.

Google Squared is used to compare entities and ideas. "Something different" is a feature built on Google Squared; it not only finds search results but also surfaces related entities, such as images, books, and videos, that are specific to the user's query.

Along with this, Google also changed its logo: the shadow that used to appear beneath the logo image has been removed.

Finally, with its new features Google has enriched search and brought results closer to exactly what we are looking for. The new left-hand navigation bar should be a treat for visitors, bloggers, and webmasters. Compare the previous and present search results bars in the figure below.

Advantages of New Google Search

  • Left Navigation menu bar increases the richness of the search.
  • Left navigation not only increases the relevance of the query but also displays related topics.
  • Left navigation will suggest the tools that are most relevant and helpful to your query.

Friday, March 6, 2009

Guaranteed Natural SEO Methods

Natural SEO Methods Guarantee Your Success on the Web

Natural SEO consists of the simple, standard methods used to promote a site on the web. The web is populated with many sites offering various kinds of services to clients. Visitors are always looking for information, and if we manage to reach those visitors we can surely do good business, which is exactly what every online business owner intends through their site.

To succeed in this area, the first thing is to understand the search engines. Search engines have definite criteria and rules, and if a site meets those requirements the search engines receive it well. It is important to observe what search engines like to see on a site, as well as what they do not. A site succeeds only if it follows these organic requirements.

Search engines alter their algorithms regularly. There are numerous reasons for this, the clearest being that they wish to improve their results, which they refine by studying the users who are active on the web. Natural search engine optimization means using only legal, white-hat methods to optimize a site; following them keeps us safe from penalties in the future.

It is impossible for humans to crawl the vast number of websites on the web, so search engines use automated software called spiders or bots to crawl sites. A site's significance and importance are estimated from the outcome of that crawl. Since search engines continuously change their algorithms and ranking criteria, we must stay fully aware of what is required to attract the attention of all the major search engines.

There are numerous methods and techniques for optimizing a site on the web. The point is to tell everybody that we have entered the market and are serious about doing business online. Finally, it is our call to decide which policies will be best for our site; nevertheless, it can safely be said that organic SEO is one of the best strategies for promoting an online business site.

Natural SEO is nothing but the set of legal, white-hat techniques used to popularize an online business site on the web, and its purpose is to inform everyone about the business.

Friday, February 20, 2009

What is Plagiarism and Tips to Prevent It



Plagiarism is the reproduction of another author's language or thoughts and the representation of them as one's own original work. Using other people's thoughts while posing as the source of the ideas is the offense of plagiarism.

Useful tips to prevent plagiarism:

  • Show that you have a clear understanding of the topic by reading the relevant materials.
  • Always give references for your ideas, stating where and from which sources you got them.
  • Evaluate your ideas by comparing them with those of other authors.
  • Explain the nature of the exercise clearly to students.
  • Specify particular books or websites that must be used as some of the source materials
  • Require the submission of a rough draft.
  • Require an oral presentation on the same topic.
  • Educate students about good note-taking styles.
  • Model good practice in course handouts and other material.
  • Educate about appropriate referencing.
  • Educate about the boundaries of acceptable practice.
  • Stress that writing exercises are not just about the final product but about the skills developed along the way.
  • Carry out some meta-learning activity such as asking students how they found their sources.

Some people engage in paraphrasing rather than original writing. Paraphrasing is the conversion of someone else's ideas into your own words. Many people change an entire sentence by replacing words with synonyms; such changes in phraseology are also considered paraphrasing. Rewriting a sentence merely by swapping some words for their synonyms is still an act of plagiarism.
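Synonym swapping of this kind often leaves the underlying word sequence largely intact, which is why it is easy to detect. As a rough sketch (the sample sentences are invented for illustration), Python's standard-library difflib can score how similar two sentences remain after such a rewrite:

```python
from difflib import SequenceMatcher

# Two invented sentences: the second merely swaps a few words for synonyms.
original = "plagiarism is the reproduction of another author's language and thoughts"
rewrite = "plagiarism is the copying of another writer's language and ideas"

# Compare the word sequences; a high ratio means most words survived the rewrite.
ratio = SequenceMatcher(None, original.split(), rewrite.split()).ratio()
print(round(ratio, 2))  # a score near 1.0 indicates heavy overlap
```

Real plagiarism detectors go much further (stemming, synonym databases, document fingerprinting), but even this crude word-level comparison flags the rewritten sentence as roughly 70% identical to the original.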

Thursday, February 12, 2009

Fake PR (Page Rank) Websites and Link Exchange


Fake Page Rank

I received a link-exchange message offering a PR5 link in return for any PR3+ link. You might be getting similar requests, so I thought I would bring this to your notice.

Such a message sounds irregular: people rarely, if ever, request link exchanges from websites with a lower PageRank than their own.

If a website has PageRank 5 but very few backlinks, hardly any content, and no activity, that is one of the signs of a fake PageRank.

A website with a faked PageRank can earn its owner a lot of money if the fakery is hidden well. But there is a solution for every problem, and likewise there are always ways to detect a website with a fake PageRank.

I don't know whether Google has taken, or plans to take, any serious action against such websites, but below I explain how to detect them before entering into any kind of deal (link exchanges, buying links, etc.) with them.

In this article I want to share how to tell whether a website has a fake PageRank. It may save you a lot of money.

Four Steps to Detect a Fake PageRank:

  1. Check the site's PageRank using a PageRank checker tool.
  2. Search Google for info:http://URL (the site's URL).
  3. Look at the green URL listed at the bottom of the first result.
  4. If that URL is different from the one you entered, the PageRank is fake.

Wednesday, February 11, 2009

Importance of Email id for Directory Submission


Directory submission is the process of submitting a URL to various free and paid SEO directories, either manually or with automated software. Every directory submission form has a mandatory email field, so having an email address, ideally on your own domain, is good for submission because it gives the editor confidence that the submission is genuine.

Why should it be a separate, valid email address? The reasons are:

  • To avoid spam resulting from submissions to directories.
  • When listing a website, some directories publish your email ID, so there is a good chance your address will end up on spammers' lists.
  • After submitting a website to a directory, you get an auto-response thanking you for the submission, followed by other emails that will fill your inbox.
  • Some web directories send a confirmation email to verify the authenticity of the submission, and a separate account makes it easier for staff or an outside submission company to handle such verification.
  • Some web directories require an account-creation process, then send a verification link or email to confirm the submission.
  • It is used for communication from the directory editor regarding approval or rejection of your site.

So you should have a separate, valid email address, either on your own domain or with a free email provider such as Yahoo, Gmail, or Hotmail.

Google Earth


Google Earth (“From Space to Your Face”) is a geographic information program: a virtual globe, map, and satellite-imagery viewer. Its original name was “Earth Viewer”.

Google Earth was created by Keyhole, Inc., a company Google acquired in 2004. Google Earth provides images and photos of the Earth taken by satellite.

Google Earth builds its imagery of the Earth in the following ways:

  • Satellite Imagery
  • Aerial Photography
  • GIS 3D globe

Google Earth is available in three versions:

  • Google Earth – the free version, with limited functionality
  • Google Earth Plus
  • Google Earth Pro – used for commercial purposes

Advantages of Google Earth

  • Helps in finding geographic information
  • Lets us explore places on Earth
  • Using the Street View feature, we can tour distant cities
  • Lets us view photo-realistic 3D buildings

Features of Google Earth

Google Earth was developed by Google.
It was originally released as “Earthviewer” and is now named “Google Earth”.
It runs on the following operating systems:

  • Windows 2000
  • XP & Vista
  • Mac OS X
  • iPhone OS
  • Linux

Browsers in which Google Earth works

  • Firefox
  • Safari 3
  • IE6
  • IE7

The Google Earth download is about 10 MB.

Google Earth is available in the following 41 languages:

  • Arabic
  • Bulgarian
  • Catalan
  • Chinese (Traditional)
  • Chinese (Simplified)
  • Croatian
  • Czech
  • Danish
  • Dutch
  • English (American)
  • English (Britain)
  • Filipino
  • Finnish
  • French
  • German
  • Greek
  • Hebrew
  • Hindi
  • Hungarian
  • Indonesian
  • Italian
  • Japanese
  • Korean
  • Latvian
  • Lithuanian
  • Norwegian
  • Polish
  • Portuguese
  • Portuguese (Brazil)
  • Romanian
  • Russian
  • Serbian
  • Slovak
  • Slovenian
  • Spanish
  • Spanish (Latin America)
  • Swedish
  • Thai
  • Turkish
  • Ukrainian
  • Vietnamese

Google Earth Versions

  • Keyhole Earthviewer 1.0 - June 11th, 2001
  • Keyhole Earthviewer 1.4 - 2002
  • Keyhole Earthviewer 1.6 - February 2003
  • Keyhole LT 1.7.1 - August 26th, 2003
  • Keyhole NV 1.7.2 - October 16th, 2003
  • Keyhole 2.2 - August 19th, 2004
  • Google Earth 3.0 - June 2005
  • Google Earth 4.0 - June 11th, 2006
  • Google Earth 4.1 - May 9th, 2007
  • Google Earth 4.2 - August 23rd, 2007
  • Google Earth 4.3 - April 15th, 2008
  • Google Earth 5.0 - February 2nd, 2009

Tuesday, February 10, 2009

Top 17 Email Marketing Tips and Tricks


1. Keep it simple and short.
2. Let users know what the email is about.
3. Use low resolution graphics to capture their attention
4. Have a call-to-action.
5. Personalize it.
6. Your corporate identity should drive the look and feel.
7. Use proper headings and sub-headings in the template.
8. Underline links and make them a different color.
9. Avoid too many colors. They tend to fight each other.
10. Highlight call-to-action so users will focus on it.
11. Emphasize other content carefully.
12. Offer useful information
13. Use the content related to the email and useful for visitors.
14. Use the same newsletter design template for at least four months at a time to benefit from brand awareness.
15. Target your best leads; email a select few to get a better response.
16. For an enhanced response, connect your content to events and mailings that are taking place.
17. Use the same font for all copy, reserving bolds and italics for headers and proper names.

Saturday, February 7, 2009

Basic SEO Web Services


Here is a list of the basic web services for webmasters. I have listed the must-see tools and services and excluded everything else.

Search Engine Services:

Google Webmaster Central: Google Webmaster Central gives you all the tools to manage your site's presence in Google. Google Webmaster Tools helps you monitor your website and spot content problems.

Yahoo Site Explorer: Yahoo Site Explorer lets you track all index and link statistics for your site. Check the number of pages indexed using the "site:URL" operator and link stats using the "link:URL" operator in Yahoo search to open Site Explorer.

Live Webmaster Tools: First of all, sign up with MSN Live Search to track your rankings. Also check out the Live Webmaster forum, where you can register any complaints regarding the rankings of your blog.

Web Analytics Tools:

To monitor the web traffic to your blog or website, check out these free analytics services and implement one of them on your blog or website.

1. Google Analytics: This is the most popular traffic analysis program out there.

2. Sitemeter: A very popular free traffic stats application.

3. Statcounter: A highly comprehensive free traffic stats service.

Search Engine Blogs:

1. Google Webmaster Blog: This is the place where Google publishes any news and tips for webmasters.

2. Yahoo Search Blog: Yahoo's counterpart.

3. Live Webmaster Blog

Other Tools:

1. Sitemaps.org: Get the structure and design of your sitemaps right. Here is more information on sitemap submission.

2. W3C Validator: Check whether your website's code complies with W3C standards here.

3. The Open Directory: DMOZ (The Open Directory Project or ODP) is the most important and extensive web directory. Getting listed can be vital for your business.

4. SEO Analysis Tool: Here is the SEO analysis tool provided by SEOCentro. It gives you the details as to your site's Meta tags, size, load time, and more.
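As a companion to item 1 above, here is a minimal sitemap file in the sitemaps.org format (the URL, date, and values are placeholders for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-02-07</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Save the file as sitemap.xml at the root of your site and submit it through Google Webmaster Tools.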

Friday, February 6, 2009

SEO Benefits


SEO provides one of the best ways to promote yourself without paying excessive fees to promote your website. It generally offers one of the best returns on investment of any marketing strategy you can take advantage of.

SEO Benefits:

  • Boost your traffic.
  • Boost your website's visibility.
  • Help you get more sales or more leads.
  • Increase corporate relations.
  • Keep your web content fresh.
  • Help your company image.
  • Fixed Costs With SEO.
  • Web Standards / Accessibility

SEO Guidelines from Google, Design and Content Guidelines, Technical Guidelines, Quality Guidelines

SEO Guidelines from Google

In this post I want to discuss Google's SEO guidelines, which will help search engines find, index, and rank your site. Even if you don't want to implement all of these suggestions, we strongly encourage you to pay very close attention to the quality guidelines as well as the design and content guidelines.


Design and Content Guidelines:

  • Your website should contain unique, keyword-rich information.
  • Create a site map for your visitors. The sitemap should link to all important categories of your website. If your site map exceeds 100 links, split it into separate pages.
  • Target the keywords visitors would use to find your website, and make sure that your website actually includes those words.
  • Use text instead of images to display important names, content, or links.
  • Write a unique, content-related title and description for every web page.
  • Check for and fix broken links on the website.
  • If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages.
  • Don't place more than 100 links on a page.

Technical Guidelines:

  • Avoid relying on fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash for essential content, because many search engine spiders cannot crawl them.
  • Allow search bots to crawl your site without session IDs or arguments that track their path through the site.
  • Make sure your web server supports the If-Modified-Since HTTP header.
  • Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Place the file at the root level of the website.
  • If your company buys a content management system, make sure that the system can export your content so that search engine spiders can crawl your site.
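As an aside on the robots.txt point above, Python's standard-library urllib.robotparser can show how crawlers will interpret a given file. The rules below are hypothetical examples, not a recommendation:

```python
from urllib import robotparser

# Hypothetical robots.txt rules, for illustration only.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A well-behaved crawler will not fetch anything under /private/.
print(rp.can_fetch("*", "http://www.example.com/private/page.html"))  # False
print(rp.can_fetch("*", "http://www.example.com/index.html"))         # True
```

This is a quick way to sanity-check a robots.txt file before deploying it, so you do not accidentally block the directories you want indexed.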

Quality Guidelines:

  • Avoid cloaking techniques and make pages for visitors, not for search engines.
  • Don’t use black-hat techniques to gain instant rankings from search engines.
  • Don't participate in link schemes designed to increase your site's ranking or PageRank.
  • Don't use unauthorized computer programs to submit pages, check rankings, etc.
  • Avoid hidden text or hidden links.
  • Don't send automated queries to Google.
  • Don't stuff pages with irrelevant words or repeated keywords.
  • Don't create multiple pages, subdomains, or domains with substantially duplicate content.
  • Don't create pages that install viruses, trojans, or other badware.
  • If your site participates in an affiliate program, make sure that your site adds value.