Sunday, March 7, 2010

Attending SphinnCon 2010

I attended SphinnCon Israel, the premier networking event designed for SEOs, SEMs, SMOs, and affiliate marketers to exchange ideas, influence the industry, and build their networks: a half day of sessions, keynotes, networking activities, and snacks.

The lectures were very good:

SphinnCon Israel Agenda – March 7, 2010

Sunday – March 7, 2010
11:30am-12:00pm
(30 min)

Welcome Address
A welcome address by Barry Schwartz, SphinnCon Israel Chair, with an overview of the various sessions to come. Prof. Noah Dana-Picard, President of the Jerusalem College of Technology and Avi Kay, the Department Chair of Technology Management & Marketing of JCT, will also welcome the audience.

12:00pm-12:30pm
(30 min)

Snacks & Refreshments
Relax before the sessions begin with ample and scrumptious refreshments! You can use the time to network with experts in the search marketing space and just sit back and enjoy the food.

12:45pm-1:45pm
(1 hr)

SEO Track

SEO Fundamentals – This session gives those new to SEO a best practices overview of search engine optimization. Topics including keyword research, copywriting, search engine friendly design and common SEO issues will be covered in this panel.

Moderator: Barry Schwartz, News Editor Search Engine Land & CEO RustyBrick

Speakers:

Eli Feldblum, CTO & Founder, RankAbove
Gillian Muessig, President & Co-Founder, SEOMoz
Menachem Rosenbaum, TENS-Technology

Link Building Track

Link Building Techniques – Acquiring links for a site is not always an easy task. This session explores various techniques to make link acquisitions easier, quicker, and fun. Learn from our panel of experts the latest tips on how to acquire those important links. In this session, learn how to reach out and get quality links, how to craft anchor text to build authority, and how commonplace mistakes can destroy your web credibility (and search engine rankings).

Moderator: Dixon Jones, Managing Director of Receptional Ltd.

Speakers:

Ariel Ozick, WiredRhino
Gab Goldenberg, SEO ROI

Clinic Track

SEO Site Clinic – Have an issue with your site and want an SEO expert to help you in person? Try this session for expert SEO review advice.

Moderator: Vanessa Fox, Nine By Blue

Speakers:

Branko Rihtman, Whiteweb
Gilad Sasson, Search Marketing Analyst at nekuda.co.il
Olivier Amar, Whiteweb

1:45pm-2:00pm
(15 min)

Break

2:00pm-3:00pm
(1 hr)

Paid Search Track

Paid Search Tips – Want traffic now? Paid search is one of the easiest and quickest ways to get immediate traffic from the search engines. Learn tips on how to make the most of this traffic in this session. This session covers the basics of how to purchase placement from the major search engines, including best practices for success with your ads.

Moderator: Itay Paz, ItayPaz.net

Speakers:

Ophir Cohen, CEO Compucall Web Marketing
Ariel Sumeruk, Head of Business Intelligence & CTO, Clicks2Customers
Dan Perach, Co-Founder, PPCPROZ
Naomi Sela, Media Director, Compucall Web Marketing

SEO Track

Hebrew SEO – Every search conference is about optimizing your sites for Google.com in English. But we are in Israel, so learn how to optimize for Google.co.il in Hebrew or English. This session is presented in English.

Moderator: Gilad Sasson, Search Marketing Analyst at nekuda.co.il

Speakers:

Oren Shatz, Chairman, SEO Israel Technologies Ltd.
Uri Breitman, TBWA\DIGITAL
Yuli Dasiatnikov, SEO, CompuCall
Adir Regev, CEO, GO Internet Marketing
Pavel Israelsky, blogger & SEO expert, AskPavel SEO Blog

SEO & Search Track

Online Reputation Management – Virtually all companies have to deal with upset customers. When those customers take their complaints online, you need to learn how to combat those in the search results. Learn tips on how to hide those bad search results and push up those positive results.

Moderator: Vanessa Fox, Nine By Blue

Speakers:

Sam Michaelson, Five Blocks
Gil Reich, Answers.com
Shira Abel, Abel Communications
Dan Gerstenfeld, Lecturer at JCT & CEO of Interteam

3:00pm-3:15pm
(15 min)

Break

3:15pm-4:15pm
(1 hr)

Social Track

Social Media Experts – Twitter, Facebook, and Digg are the craze. Social media is an excellent way to get the word out about your company. Learn how to leverage social media for buzz creation and link building.

Moderator: Barry Schwartz, News Editor Search Engine Land & CEO RustyBrick

Speakers:

Vanessa Fox, Nine By Blue
Miriam Schwab, CEO, illuminea
Debra Askanase, Owner, Community Organizer 2.0
Roi Carthy, Writer, TechCrunch

Web Analytics Track

SEM Web Analytics – Web analytics is the key to learning how to improve your SEM campaigns, be it PPC or SEO. Learning how to use web analytics to increase conversions can mean the difference between a profitable campaign and a costly one.

Moderator: Ophir Cohen, CEO Compucall Web Marketing

Speakers:

Michal Neufeld, Account Strategist, Google Israel
Daniel Waisberg, Head of Analytics, Easynet
Adir Regev, CEO, GO Internet Marketing

Clinic Track

Link Building Clinic – Want advice on how to acquire links specifically for your web site and niche? Use this session to ask link building experts for advice on your specific challenges.

Moderator: Branko Rihtman, Whiteweb

Speakers:

Eli Feldblum, CTO & Founder, RankAbove
Gillian Muessig, President & Co-Founder, SEOMoz
Olivier Amar, Whiteweb
Dixon Jones, Managing Director of Receptional Ltd.

4:15pm-4:30pm
(15 min)

Break

4:30pm-5:30pm
(1 hr)

Mobile Track

Mobile & Local Search – iPhones, Blackberries, Google Android, Palm Pre and other mobile devices have revolutionized search. Learn how to leverage local search to gain local traffic and leads for your business.

Moderator: Olivier Amar, Whiteweb

Speakers:

Olivier Amar, Whiteweb
THIS SESSION MAY BE CANCELLED

Site Clinic Track

PPC Site Clinic – Have your paid search campaigns running? Want tips on landing pages and ad copy? Have PPC experts give you advice in this session.

Moderator: Gillian Muessig, President & Co-Founder, SEOMoz

Speakers:

Charlie Kalech, Director, J-Town Productions
Dan Perach, Co-Founder, PPCPROZ
Shlomi Aizenberg, Director of SEM, easynet Search Marketing
Itay Paz, ItayPaz.net

SEO Track

Meet Google – Tomer Honen from the Google Search Quality Team will talk about the most common SEO topics Google runs into. Tomer will speak for about 45 minutes on topics including SEO, Webmaster Tools, best practices, and other webmaster-related topics relevant to Google. He will then leave time for questions and answers from the audience. This is your chance to speak to a live Google representative about your issues.

Tomer joined Google in 2006 and works as a Search Quality Strategist supporting the search quality efforts across the European languages, particularly Hebrew. Prior to joining Google, he worked as a malware researcher at Aladdin Knowledge Systems, based in Israel. Tomer has a bachelor’s degree in theater and English literature.

Moderator: Barry Schwartz, News Editor Search Engine Land & CEO RustyBrick

Speakers:

Tomer Honen, Search Quality Strategist, Google

7 PM
Tel Aviv Party

SphinnCon Israel – Tel Aviv Party
Join us in Tel Aviv for a networking party sponsored by TBD.


Wednesday, February 24, 2010

SEO Basics, or why do we need SEO?

You have a beautiful website with great products, great guarantees, many comprehensive pages, and great customer service. Unfortunately, Google and other search engines won't give your website good rankings.

There are several reasons why search engines do not list websites even though they look great and offer quality content:

  1. Your web pages are meaningless to search engine spiders
    Search engines use simple software programs to visit your web pages. In general, search engine spiders won't see anything that is displayed in images, Flash elements, and other multimedia formats.
  2. The HTML code of your web page contains major errors
    Most web pages have minor errors in their HTML code. While most search engine spiders can handle minor errors, some errors can prevent spiders from indexing your web pages. For example, a stray closing tag near the top of a page can tell spiders that they have reached the end of the page before the main content has been indexed.
  3. The HTML code of your web pages doesn't contain the right elements
    If you want high rankings for certain keywords, those keywords must appear in the right places on your web page. For example, it usually helps to use the keyword in the page title. There are many other elements that matter, and all of them should be in place if you want high rankings.
  4. Your web server sends the wrong status codes
    Some web servers send wrong status codes to search engine spiders and visitors. When a search engine spider requests a web page from your site, your server sends a response code, which should be "200 OK" (the sketch after this list shows a quick way to check).
  5. Your robots.txt file rejects all search engine spiders
    If your robots.txt file does not allow search engine spiders to visit your web pages, your website won't be included in the search results. Some robots.txt files contain errors, and search engine spiders get blocked by mistake.
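
If you want to verify the last two points on your own site, here is a minimal sketch, assuming Python 3 with only the standard library; "https://www.example.com/" is a placeholder, so substitute your own domain.

```python
# Quick self-audit for points 4 and 5 above: check the HTTP status code
# your server returns and whether robots.txt blocks spiders by mistake.
# "https://www.example.com/" is a placeholder; use your own site.
import urllib.error
import urllib.request
import urllib.robotparser

URL = "https://www.example.com/"

# Point 4: a healthy page should answer with "200 OK".
try:
    with urllib.request.urlopen(URL) as response:
        print("Status code:", response.status)  # expect 200
except urllib.error.HTTPError as err:
    print("Server returned an error status:", err.code)

# Point 5: make sure robots.txt actually lets spiders in.
robots = urllib.robotparser.RobotFileParser()
robots.set_url(URL + "robots.txt")
robots.read()
print("Googlebot allowed:", robots.can_fetch("Googlebot", URL))
```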

Wednesday, February 17, 2010

U.S. Web Searches Top 10.2 Billion in January | Nielsen Wire

Monthly searches hit 10 billion

Top 10 Search Providers for January 2010, Ranked by Searches (U.S.)

Rank  Provider                       Searches (000)  Share of Searches
 -    All Search                         10,272,099             100.0%
 1    Google Search                       6,805,424              66.3%
 2    Yahoo! Search                       1,488,476              14.5%
 3    MSN/Windows Live/Bing Search        1,116,546              10.9%
 4    AOL Search                            251,762               2.5%
 5    Ask.com Search                        194,161               1.9%
 6    My Web Search                         112,356               1.1%
 7    Comcast Search                         59,608               0.6%
 8    Yellow Pages Search                    35,101               0.3%
 9    NexTag Search                          34,736               0.3%
10    BizRate Search                         20,123               0.2%

Where did all those questions go before the Internet era?

Source:U.S. Web Searches Top 10.2 Billion in January | Nielsen Wire

Wednesday, February 10, 2010

Google AdWords Negative Keywords

If you advertise your website on Google AdWords, chances are you have found out that you can spend a lot of money on AdWords without getting much in return.

The reason is often that advertisers use the wrong settings in their campaigns.

Long tail keywords convert better and there are a lot of them

Several studies have found that long tail keywords have a much higher conversion rate than single-word keywords.
Long tail keywords are keywords that consist of four or more words.

According to a recent study, more than 18% of searches contain five or more keywords. In addition, Google says that "20% of the queries Google receives each day are ones [they] haven't seen in at least 90 days, if at all."

The facts above indicate that it might be a good idea to use broad match for all of your keywords.

Google recommends using broad match with your keywords:

"Broad match is a great way to capitalize on those unexpected, but relevant queries. When you include a keyword as a broad match, your corresponding ad is not only eligible to appear alongside queries with that exact spelling, but it can also capture keyword expansions that include synonyms, singular/plural forms, relevant variants of your keywords, and phrases containing your keywords."

Unfortunately, it's not that easy. If you use broad match for all keywords then your ads will be shown for a lot of unrelated searches and you'll pay a lot of money without getting something in return.

For that reason, it is important to exclude the long tail keywords that are not related to your website.

Negative keywords will increase your conversion rate

You can enter so-called negative keywords in your Google AdWords campaigns.

If a search query contains one of your negative keywords, your AdWords ads won't be displayed.

For example, if you enter "-free" as a negative keyword, your AdWords ad is not displayed if someone searches for free things. Negative keywords are an excellent tool for excluding Internet users looking for free items only.

You can also use negative keywords to display an ad only for specific target groups.
If one of your keywords has multiple meanings ("leonardo" would trigger both "leonardo dicaprio" and "leonardo da vinci"), you should add negative keywords that remove the unrelated searches.
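
To make the mechanics concrete, here is a toy sketch of the matching logic described above. It is not the AdWords API; the keyword list and queries are invented for the example.

```python
# Toy illustration of negative-keyword filtering, NOT the AdWords API:
# if any negative keyword appears in the search query, the ad is not shown.
NEGATIVE_KEYWORDS = {"free", "dicaprio"}

def ad_is_eligible(query: str) -> bool:
    """Return False when the query contains any negative keyword."""
    terms = set(query.lower().split())
    return terms.isdisjoint(NEGATIVE_KEYWORDS)

print(ad_is_eligible("leonardo da vinci paintings"))        # True: ad may show
print(ad_is_eligible("free leonardo dicaprio wallpapers"))  # False: blocked
```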

Broad match can help you get more customers, but you have to be very careful with that option.

If you use broad match with your keywords, remember to add negative keywords so that you don't pay for unwanted traffic.

Monday, February 1, 2010

Google Algorithm History Summary 2010

Google Algorithm History:
  1. The Jagger update and the Big Daddy infrastructure it prepared the way for were a major watershed. When this happened near the end of 2005, ever-flux began to show in the SERPs. Rather than once-a-month ranking updates, the ranking shuffle became continual.
    Source Monthly Google History: http://www.webmasterworld.com/google/3801699.htm
  2. Google’s war on paid links, which began as far back as 2005, raised quite a ruckus. At first Google’s negative actions were taken manually, and then algorithmically. Algorithmic false positives began to confuse things even more, and I wish they had just stopped showing false PageRank on the toolbar.
  3. Phrase-based indexing, as described in the 2006 patents, brought a deeper level of semantic intelligence to the search results. This power continues to grow today. One big effect: it makes over-emphasis on keywords, especially in anchor text, a problem where it used to be an asset. But there was a major advantage for content writers, who could now throw off the rigidity to a major degree and vary their vocabulary in a more natural way.
    source: http://www.webmasterworld.com/google/3247207.htm
  4. Geo-located results began to create different rankings even for different areas of the same city in the US and the UK, somewhere around 2005 or so. Anyone who was still chasing raw rankings as their only metric should have quickly learned that the time for a change was long overdue.
  5. Google’s user "intention engine" has had a major impact, and it rolled out in a big way in 2009. This was coupled with a kind of automated taxonomy of query terms. Now, sometimes a certain kind of site will just never rank for a certain keyword, no matter what it tries. The site’s taxonomy has to line up with the taxonomy of the query term.
    source: http://www.webmasterworld.com/google/3980481.htm

Wednesday, January 6, 2010

How to get into Google's real-time results

Google hasn't revealed how the real-time results are chosen but there are some things that seem to have an effect. If you want to see your tweets in Google's real-time results, you should consider the following:

1. Google analyzes the text used in the blog post, tweet, etc. to determine the quality of the post. If a post looks like spam, it won't be chosen for the real-time results.

2. Google seems to create profiles of users who are re-tweeted more often than other users. If a Twitter user has many high-authority followers, it's more likely that their tweets will appear in the real-time results.

3. Google wants to identify spammers by the quality of their followers and the quality of their messages.

The collected information will be used to calculate an "Update Rank" for each contributing user.

How much time should you invest to get in Google's real-time results?

The problem with Google's real-time results is that they do not appear for all search terms and that each result is quickly replaced by another result.

For that reason, you cannot get lasting results in Google's real-time results. It is much better to invest your time in getting listed in Google's regular search results. A listing in Google's regular top 10 results will bring many more targeted visitors to your website than a listing in the real-time results.

Google's real-time results are a nice addition to Google's portfolio.

Wednesday, December 16, 2009

To Duplicate or Not to Duplicate: What is duplicate content?

What is duplicate content?

Google has a patent on detecting duplicate documents, and it contains a definition of duplicate content:

"Duplicate documents are documents that have substantially identical content, and in some embodiments wholly identical content, but different document addresses."

The patent describes three scenarios in which duplicate documents are encountered by a web crawler:

1. Two pages, comprising any combination of regular web page(s) and temporary redirect page(s), are duplicate documents if they share the same page content, but have different URLs.

2. Two temporary redirect pages are duplicate documents if they share the same target URL, but have different source URLs.

3. A regular web page and a temporary redirect page are duplicate documents if the URL of the regular web page is the target URL of the temporary redirect page or the content of the regular web page is the same as that of the temporary redirect page.

A permanent redirect page is not directly involved in duplicate document detection because the crawlers are configured not to download the content of the redirecting page.
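
As a reading aid, the three scenarios can be sketched in a few lines of Python. This is just an illustration of the rules quoted above, not Google's implementation; the document model and example URLs are invented.

```python
# Reading aid for the three duplicate-document scenarios above,
# not Google's implementation. Documents are modeled as tiny records;
# temporary redirect pages carry a target URL.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Doc:
    url: str
    content: str = ""
    redirect_target: Optional[str] = None  # set only for temporary redirects

def are_duplicates(a: Doc, b: Doc) -> bool:
    if a.url == b.url:
        return False  # duplicates must have different addresses
    # Scenario 1: same page content, different URLs.
    if a.content and a.content == b.content:
        return True
    # Scenario 2: two temporary redirects sharing the same target URL.
    if a.redirect_target and a.redirect_target == b.redirect_target:
        return True
    # Scenario 3: a regular page that is the target of a temporary redirect.
    return a.redirect_target == b.url or b.redirect_target == a.url

page = Doc("http://a.example/x", content="same text")
copy = Doc("http://b.example/y", content="same text")
print(are_duplicates(page, copy))  # True (scenario 1)
```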

How does Google detect duplicate content?

According to the patent description, Google's web crawler consults the duplicate content server to check if a found page is a copy of another document. The algorithm then determines which version is the most important version.

Google can use different methods to detect duplicate content. For example, Google might take "content fingerprints" and compare them when a new web page is found.
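
The patent does not disclose how those fingerprints are computed, but the idea can be sketched: normalize the page content and hash it, so pages with identical content at different URLs collide on the same fingerprint. A crude stand-in in Python:

```python
# Illustrative only: a crude stand-in for a "content fingerprint".
# Google's real method is not public; this just shows how identical
# content at different URLs collides on the same fingerprint.
import hashlib
import re

def content_fingerprint(html: str) -> str:
    text = re.sub(r"<[^>]+>", " ", html)   # strip markup
    text = " ".join(text.lower().split())  # normalize case and whitespace
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

page_a = "<html><body><h1>Hello</h1> World</body></html>"
page_b = "<html><body><h1>Hello</h1>    World</body></html>"
print(content_fingerprint(page_a) == content_fingerprint(page_b))  # True
```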

Interestingly, it's not always the page with the highest PageRank that is chosen as the most important URL for the content:

"In some embodiments, a canonical page of an equivalence class is not necessarily the document that has the highest score (e.g., the highest page rank or other query-independent metric)."

How does this affect your website?

If you want to get high rankings, it is easier to do so with unique content. Try to use as much original content as possible on your web pages.

If your website must use the same content as another website, make sure that your website has better inbound links than the other websites that carry the same content. It's then likely that your website will be chosen as the most important URL for the content.

If your web site has unique content, you don't have to worry about potential duplicate content penalties. Optimize that content for search engines and make sure that your web site has good inbound links.

Wednesday, December 9, 2009

Google Caffeine: what's new?

Google Caffeine is the name given to Google's next algorithm update that is going live after the holidays. It seems that Google Caffeine will be more than Google's regular updates. It will probably be a major overhaul of the calculations that Google uses to rank web pages.

Caffeine: what is going to change?

Of course, Google hasn't revealed the details of Google Caffeine yet. However, the new index has been live on some test servers and some Google employees also talked about the next index. The following factors might play a larger role in Google's next index:

1. Website speed: if you have a slow loading website, it might not get high rankings on Google.

2. Broken links: if your website contains many broken links, this might have a negative impact on the position of your web pages in Google search results (see the link-checker sketch after this list).

3. Bad neighborhoods: Linking to known spammers and getting a lot of links from known spammers isn't good for your rankings in Google's current algorithm. The negative impact of a bad neighborhood will probably be even worse with Google Caffeine.

4. The overall quality of your website: Google's new algorithm will probably take a closer look at the overall quality of your website. It's not enough to have one or two ranking factors in place.

5. You'll probably need well-optimized content, a good website design with clear navigation, good inbound links, a low bounce rate, etc. The number of social bookmarks might also play an increased role.

6. Factors like the age of a website, its past history, authority etc. will still play a role in Google's new index. However, the effect of the different factors on your rankings will shift.
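
For point 2, you can hunt down broken links yourself. Below is a minimal link-checker sketch, assuming Python 3 and only the standard library; "https://www.example.com/" is a placeholder for your own page.

```python
# Minimal broken-link checker: fetch one page, collect its <a href> targets,
# and report any link that does not answer "200 OK".
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

BASE = "https://www.example.com/"  # placeholder: point this at your own page

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

collector = LinkCollector()
with urllib.request.urlopen(BASE) as response:
    collector.feed(response.read().decode("utf-8", errors="replace"))

for link in collector.links:
    target = urljoin(BASE, link)  # resolve relative links against the page
    if not target.startswith(("http://", "https://")):
        continue  # skip mailto:, javascript:, and similar
    try:
        status = urllib.request.urlopen(target).status
    except urllib.error.HTTPError as err:
        status = err.code
    except urllib.error.URLError:
        status = None  # DNS failure, timeout, etc.
    if status != 200:
        print("Broken or suspect link:", target, "->", status)
```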

How can you adjust your web pages to Google's new Caffeine index?

Although Google's Caffeine update hasn't been released yet, there are some things you can do to increase the chances that your website will get good rankings in Google's new index:

1. Remove all spam elements from your web pages. Anything that might be considered spam can and will have a negative effect on the position of your web pages sooner or later. This includes text that has nearly the same color as the background, cloaking and fully automated linking systems.

2. Check your website design and the navigation of your website. Your website should have a professional look and feel. The navigation should be easy to understand, and your web pages should be easily parseable by search engine spiders. You can test this with a search engine spider simulator.
3. Get links from social bookmark websites. Social bookmark links already play a role in Google's current algorithm and that role might increase.

4. Check your links. You shouldn't link to websites that look like spammers. It's better to focus on selected quality links instead of as many links as possible.

Google Caffeine is going to be released after the holidays. If you follow the tips above, your website will be in a good position when Google's new index goes online.

Tuesday, December 1, 2009

Once upon a time.....

Once upon a time there was a very hot-tempered and short-tempered young man. One day his father gave him a bag of nails and told him to drive a nail into the fence post every time he lost his temper.

On the first day, several dozen nails went into the post. Over the next week he learned to restrain his anger, and with each passing day the number of nails hammered into the post decreased. The young man realized that it is easier to control his temper than to hammer nails.

Finally the day came when he did not lose his composure at all. He told this to his father, who said that from then on, for each day his son managed to control himself, he could pull one nail out of the post.

Time passed, and the day came when he could tell his father that not a single nail remained in the post. Then the father took his son by the hand and led him to the fence:

- You have done well, but do you see how many holes are in the post? It will never be the same as before. When you say something evil to people, it leaves a scar, like these holes. And no matter how many times you apologize afterwards, the scar remains.

Friday, October 23, 2009

Following the Big G -- Barcodes :-)

A bit late, but here it is. Now you can use your barcode reader on SEOgenie too.

Tuesday, October 6, 2009

Nice Feature in Google

This Sidewiki is a nice thing...

But will it be displayed in search results?

That could be a nice feature.

in reference to: Google (view on Google Sidewiki)

Saturday, October 3, 2009

Generating quality backlinks to your site

There are many ways of generating quality backlinks to your site without resorting to black hat methods. Some of these methods are ones that I have not personally tried, but that I know are extremely effective at generating links and traffic to a website:

  • Press releases (especially at one of the bigger PR sites: PRNewswire, PRWeb, MarketWire, Business Wire, etc.).

  • Guest posting at other websites can send a ton of traffic to your site, not to mention give you a great backlink from a well-respected site.

  • Write articles for eHow or Associated Content, or start a Squidoo lens that links back to your site.

  • Become an active member of a forum and put a link to your site in your signature.

  • Submit to social bookmarking sites, especially ones that offer do-follow links, such as Folkd.com. Just don't spam them!

  • If you have a blog, submit it to BlogCatalog.com and you'll get a ton of free backlinks over time.

Friday, October 2, 2009

Toolbar PageRank

Toolbar PageRank is calculated using ranges of real PageRank. For example, imagine that the maximum Google PR value is 1,000,000. Toolbar PR 1 equals a range of 1 - 10. Toolbar PR 2 equals a range of 10 - 50. Toolbar PR 3 equals a range of 50 - 1,000. Toolbar PR 4 equals a range of 1,000 - 10,000, and so on, until finally toolbar PR 9 ranges from 600,000 - 1,000,000.
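
Mapping a raw score to a toolbar value is then just a lookup into logarithmically growing buckets. Here is a toy sketch of that idea; the in-between thresholds are invented, since only some of the ranges are given above and Google's actual scale was never public.

```python
# Toy mapping from a hypothetical "real" PageRank score to a toolbar value.
# Thresholds not named in the paragraph above are invented for illustration.
import bisect

# Upper bound of each toolbar-PR bucket, toolbar PR 1 through 9.
THRESHOLDS = [10, 50, 1_000, 10_000, 50_000, 150_000, 300_000, 600_000, 1_000_000]

def toolbar_pr(real_pr: float) -> int:
    """Map a raw score in [1, 1_000_000] to toolbar PR 1..9."""
    return bisect.bisect_left(THRESHOLDS, real_pr) + 1

print(toolbar_pr(7))        # 1  (range 1 - 10)
print(toolbar_pr(5_000))    # 4  (range 1,000 - 10,000)
print(toolbar_pr(750_000))  # 9  (range 600,000 - 1,000,000)
```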

Wednesday, September 2, 2009

Manual Google

Google relies "heavily" but not fully on computer algorithms. In a recent interview, Google's Engineering Director Scott Huffman explained that a lot of human reviewers (probably 10,000) manually review Google's results:

"Every day, we are looking at a random sample of grades that we think represent the queries we get from users. Evaluators look at the quality of each result relative to those queries."

Real people look at Google's search results and tell Google which websites they don't like. If your website looks spammy, it might be reported to Google by these reviewers.

Sunday, August 16, 2009

What is "White Hat" Vs. "Black Hat"

The search engine optimization industry is often divided into two camps. "White Hat" SEOs attempt to work within the rules and goals of the search engines.

"Black Hat" SEO's try to gain rankings using whatever methods work best, often deliberately working against the interests of the search engines and their users. Automated software tools for "Black Hat" SEO have been around almost as long as search engines themselves. SEO Genie the first major software tool to automate "White Hat" search engine optimization.

Tuesday, August 4, 2009

12 IT Holy Places

Those are hardly the only ones. We've identified the 12 most sacred places where IT enthusiasts can go to pay homage to the computing gods that passed before them -- or at least catch a peek at where some of the more exciting events in IT lore occurred. Fortunately, would-be pilgrims can do a lot of the traveling via the Web, saving wear and tear on the sandals and sackcloth.

Tech mecca No. 1: 367 Addison Ave., Palo Alto, Calif.
Tech mecca No. 2: 2066 Crist Dr., Los Altos, Calif.
Tech mecca No. 3: 232 Santa Margarita Ave., Menlo Park, Calif.
Tech mecca No. 4: CERN -- Geneva, Switzerland
Tech mecca No. 5: Bletchley Park, England
Tech mecca No. 6: Xerox PARC -- Palo Alto, Calif.
Tech mecca No. 7: Ames Lab, Iowa State University -- Ames, Iowa
Tech mecca No. 8: Moore School of Engineering, University of Pennsylvania -- Philadelphia
Tech mecca No. 9: IBM's "Main Plant" -- Poughkeepsie, N.Y.
Tech mecca No. 10: Room 2713, Dobie Hall, University of Texas -- Austin, Texas
Tech mecca No. 11: Kirkland House, Harvard University -- Cambridge, Mass.
Tech mecca No. 12: Lyman Residence Hall, Stanford University -- Stanford, Calif.

Monday, July 13, 2009

The Time Has Come To Regulate Search Engine Marketing And SEO

The following post was written by a well known executive at one of the largest sites on the Internet. The author has requested to remain anonymous - not for dramatic effect, but because of the backlash he would receive from the SEO industry and possibly Google itself. He also doesn’t want his company associated with the post.

He is starting a discussion on the need for government regulation of the organic and paid search policies of Google, which maintains a commanding lead in search market share today, or at least transparency in how search results are determined. There is clearly growing frustration with the constantly changing "border policies" that are created and enforced by Google and other search engines. It is a fascinating read. Read the full post.