Wednesday, February 24, 2010

SEO Basics, or Why Do We Need SEO?

You have a beautiful website with great products, strong guarantees, comprehensive pages, and great customer service. Unfortunately, Google and other search engines won't give your website high rankings.

There are several reasons why search engines do not list websites even though they look great and offer quality content:

  1. Your web pages are meaningless to search engine spiders
    Search engines use simple software programs to visit your web pages. In general, search engine spiders won't see anything that is displayed in images, Flash elements, and other multimedia formats.
  2. The HTML code of your web page contains major errors
    Most web pages have minor errors in their HTML code. While most search engine spiders can handle minor HTML errors, some errors can prevent spiders from indexing your pages. For example, a stray tag near the top of a page can tell a spider that it has reached the end of the page before the main content has been indexed.
  3. The HTML code of your web pages doesn't contain the right elements
    If you want to rank for certain keywords, these keywords must appear in the right places on your web pages. For example, it usually helps to use the keyword in the page title. Many other on-page elements matter as well, and all of them should be in place if you want high rankings.
  4. Your web server sends the wrong status codes
    Some web servers send wrong status codes to search engine spiders and visitors. When a search engine spider requests a web page from your site, your server sends a response code. This should be the "200 OK" code.
  5. Your robots.txt file rejects all search engine spiders
    If your robots.txt file does not allow search engine spiders to visit your web pages, your website won't be included in the search results. Some robots.txt files contain errors, and search engine spiders are blocked by mistake. You can check points 3 to 5 yourself; see the sketch after this list.
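
Here is a minimal sketch of how you could check the last three points yourself with Python's standard library. The URL "https://www.example.com/" and the keyword "blue widgets" are placeholders; swap in your own page and target keyword. This is a quick diagnostic, not a full crawler.

    import urllib.error
    import urllib.request
    import urllib.robotparser
    from html.parser import HTMLParser

    URL = "https://www.example.com/"   # placeholder: your own page
    KEYWORD = "blue widgets"           # placeholder: your target keyword

    # Point 4: the server should answer with "200 OK".
    # urlopen raises HTTPError for error codes such as 404 or 500.
    try:
        response = urllib.request.urlopen(URL)
        print("Status code:", response.status)   # expect 200
        html = response.read().decode("utf-8", errors="replace")
    except urllib.error.HTTPError as err:
        print("Status code:", err.code)          # e.g. 404, 500
        html = ""

    # Point 3: does the page title contain the keyword?
    class TitleParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.in_title = True
        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False
        def handle_data(self, data):
            if self.in_title:
                self.title += data

    parser = TitleParser()
    parser.feed(html)
    print("Title:", parser.title.strip())
    print("Keyword in title:", KEYWORD.lower() in parser.title.lower())

    # Point 5: does robots.txt let search engine spiders in?
    robots = urllib.robotparser.RobotFileParser(URL + "robots.txt")
    robots.read()
    print("Googlebot allowed:", robots.can_fetch("Googlebot", URL))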

Wednesday, February 17, 2010

U.S. Web Searches Top 10.2 Billion in January | Nielsen Wire

Monthly searches hit 10 billion

Top 10 Search Providers for January 2010, Ranked by Searches (U.S.)
Rank  Provider                       Searches (000)  Share of Searches

      All Search                         10,272,099             100.0%
1     Google Search                       6,805,424              66.3%
2     Yahoo! Search                       1,488,476              14.5%
3     MSN/Windows Live/Bing Search        1,116,546              10.9%
4     AOL Search                            251,762               2.5%
5     Ask.com Search                        194,161               1.9%
6     My Web Search                         112,356               1.1%
7     Comcast Search                         59,608               0.6%
8     Yellow Pages Search                    35,101               0.3%
9     NexTag Search                          34,736               0.3%
10    BizRate Search                         20,123               0.2%

Where did all those questions go before the Internet era?

Source: U.S. Web Searches Top 10.2 Billion in January | Nielsen Wire

Wednesday, February 10, 2010

Google AdWords Negative Keywords

If you advertise your website on Google AdWords, chances are you have found out that you can spend a lot of money on AdWords without getting much in return.

Often, the reason is that advertisers use the wrong settings in their campaigns.

Long tail keywords convert better and there are a lot of them

Several studies have found that long tail keywords have a much higher conversion rate than single-word keywords.
Long tail keywords are keywords that consist of four or more words.

According to a recent study, more than 18% of searches contain five or more keywords. In addition, Google says that "20% of the queries Google receives each day are ones [they] haven't seen in at least 90 days, if at all."

The facts above indicate that it might be a good idea to use broad match for all of your keywords.

Google recommends using broad match with your keywords:

"Broad match is a great way to capitalize on those unexpected, but relevant queries. When you include a keyword as a broad match, your corresponding ad is not only eligible to appear alongside queries with that exact spelling, but it can also capture keyword expansions that include synonyms, singular/plural forms, relevant variants of your keywords, and phrases containing your keywords."

Unfortunately, it's not that easy. If you use broad match for all keywords, your ads will be shown for a lot of unrelated searches and you'll pay a lot of money without getting anything in return.

For that reason, it is important to exclude the long tail keywords that are not related to your website.

Negative keywords will increase your conversion rate

You can enter so-called negative keywords in your Google AdWords campaigns.

If a long tail keyword contains one of your negative keywords, your AdWords ads won't be displayed.

For example, if you enter "-free" as a negative keyword, your AdWords ad is not displayed if someone searches for free things. Negative keywords are an excellent tool for excluding Internet users looking for free items only.

You can also use negative keywords to target your ads at specific groups.
If one of your keywords has multiple meanings ("leonardo" would trigger both "leonardo dicaprio" and "leonardo da vinci"), add negative keywords that filter out the unrelated searches, as sketched below.
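
To illustrate the effect, here is a toy Python sketch of the matching logic. It is not Google's actual algorithm, and the keyword lists and queries are made-up examples; it simply hides the ad whenever a query contains one of the campaign's negative keywords.

    # Toy model of negative keyword matching; not Google's real logic.
    # "free" and "dicaprio" are assumed negative keywords for a campaign
    # that advertises Leonardo da Vinci posters.
    NEGATIVE_KEYWORDS = {"free", "dicaprio"}

    def ad_is_eligible(query):
        """Hide the ad if the query contains any negative keyword."""
        words = query.lower().split()
        return not any(neg in words for neg in NEGATIVE_KEYWORDS)

    for query in ("leonardo da vinci paintings",   # ad shown: True
                  "leonardo dicaprio movies",      # ad shown: False
                  "free da vinci posters"):        # ad shown: False
        print(query, "-> ad shown:", ad_is_eligible(query))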

Broad match can help you get more customers, but you have to be careful with that option.

If you use broad match with your keywords, remember to add negative keywords so that you don't pay for unwanted traffic.

Monday, February 1, 2010

Google Algorithm History Summary 2010

Google Algorithm History:
  1. The Jagger update, and the Big Daddy infrastructure it paved the way for, was a major watershed. When this happened near the end of 2005, ever-flux began to show in the SERPs. Rather than once-a-month ranking updates, the ranking shuffle became continual.
    Source: Monthly Google History, http://www.webmasterworld.com/google/3801699.htm
  2. Google’s war on paid links, which began as far back as 2005, raised quite a ruckus. At first Google’s negative actions were taken manually, and later algorithmically. Algorithmic false positives began to confuse things even more, and I wish they had just stopped showing false PageRank on the toolbar.
  3. Phrase-based indexing, as described in the 2006 patents, brought a deeper level of semantic intelligence to the search results, and this power continues to grow today. One big effect: it makes over-emphasis on keywords, especially in anchor text, a problem where it used to be an asset. But there was a major advantage for the content writer, who could now throw off the rigidity to a major degree and vary their vocabulary in a more natural way.
    Source: http://www.webmasterworld.com/google/3247207.htm
  4. Geo-located results began to create different rankings even for different areas of the same US or UK city, somewhere around 2005. Anyone who was still chasing raw rankings as their only metric should have quickly learned that the time for a change was long overdue.
  5. Google’s user "intention engine" has had a major impact, and that rolled out in a big way in 2009. This was coupled with a kind of automated taxonomy of query terms. Now, sometimes a certain kind of site will just never rank for a certain keyword, no matter what they try. The site’s taxonomy has to line up with the taxonomy of the query term.
    Source: http://www.webmasterworld.com/google/3980481.htm