
Fixing Google Web 2.0 Style IV

How to fix what is broken without breaking what is not

This post is part of a series. See Part 1, Part 2 and Part 3

What to Make Out of It
Some results should cause the link to be ignored completely and even reflect poorly on the webmaster's trust; others should get full voting power, or even extra power for ratings like B=0, S=0, E=10; and everything in between, depending on how much the information is trusted and on other strategic decisions made by the SE.
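As an illustration of that spectrum, here is a small sketch in Python. The B, S, E ratings are the example attributes from the earlier parts of this series; the trust threshold and the bonus factor are invented numbers for the sake of the sketch, not anything a search engine is known to use:

```python
def link_weight(b: int, s: int, e: int, trust: float) -> float:
    """Map 0-10 attribute ratings plus a 0.0-1.0 trust score to a
    link voting weight. Purely illustrative: the real factors and
    how they are combined are up to the search engine."""
    for rating in (b, s, e):
        if not 0 <= rating <= 10:
            raise ValueError("ratings must be between 0 and 10")
    if trust < 0.2:          # untrusted source: ignore the link entirely
        return 0.0
    base = 1.0
    if b == 0 and s == 0 and e == 10:    # the "ideal" rating combination
        base = 1.5                       # gets extra voting power
    return base * trust

# A fully trusted link with the ideal rating gets the bonus:
print(link_weight(0, 0, 10, trust=1.0))   # 1.5
# A middling link from a half-trusted site counts half:
print(link_weight(5, 5, 5, trust=0.5))    # 0.5
# Below the trust threshold the link is ignored:
print(link_weight(0, 0, 10, trust=0.1))   # 0.0
```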

Those attributes are on-site factors and easy for the webmaster to manipulate. However, Google also has information from other sites that makes a statement about the page: not from the site itself, but from all the other sites that refer to it. It should be possible to compare what the webmaster says about himself with what other webmasters say, and with what Google already knows about both sites. Google supposedly knows[i] a lot about detecting linking schemes and other artificial link networks.

Spam on .EDU sites will become much easier to detect, unless universities start linking to ecommerce sites en masse rather than to other educational content. A site that does reviews tends to link to ecommerce sites with the intention that the customer buys the product if he likes the review. Those are just two examples of how this could help in detecting spam.

The attributes I presented are only an example. I used them to demonstrate how webmasters could express and communicate different important information to the search engines. Similar or even completely different attributes might be more useful to achieve the same thing in the end.

Conclusion
Not everybody will adopt these or similar attributes, especially not right away. For sites that do not, Google must set the values itself, based on factors it already knows and in comparison with similar sites that did provide attributes.

The next step is to match this with the intention of the user who performs the search.

Yahoo! and Microsoft are both experimenting with matching user intention to search results and I am sure that Google also works on this problem.

Yahoo! Mindset[ii], for example, lets users express the intent of a search on a sliding scale from 1 to 10, where 1 stands for shopping, 10 for researching, and 2–9 for everything in between, putting more or less weight on one or the other.

Microsoft has at their MS adCenter Labs a search feature called “Detecting Online Commercial Intention[iii]”.

There are two possible options: either you enter a query of keywords or key phrases, or you enter a website URL.

The tool returns a number between 0.0001 and 1.00000. The closer the number is to 1.00000, the higher the determined probability that the query or web page has commercial intent.
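A minimal sketch of how such a score could be read, assuming a simple 0.5 cut-off (an invented threshold for illustration, not Microsoft's actual one):

```python
def classify_intent(score: float) -> str:
    """Interpret a 0.0001-1.00000 commercial-intention score: the
    closer to 1.00000, the more probably commercial. The 0.5
    threshold is an assumption for this sketch."""
    if not 0.0 < score <= 1.0:
        raise ValueError("score must be in (0, 1]")
    return "commercial" if score >= 0.5 else "non-commercial"

print(classify_intent(0.9))     # commercial
print(classify_intent(0.0062))  # non-commercial
```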

Those are steps in the right direction. Give the users what they want, and let honest webmasters help you match them with their sites and weed out spam.

If the user wants to “visit Disneyland”, show reviews and sites where he can book the trip. If the user wants to “buy porn”, show him a site where he can buy porn. For Google to improve on that, webmasters must be honest and, over time, be trusted more by Google; the benefits webmasters get from it will make them do it.
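The matching idea, comparing a page's self-declared intent with the intent expressed by inbound links and weighting each by how much the expressing site is trusted, could be sketched like this; all names and numbers are illustrative:

```python
def combined_intent(on_page: float,
                    inbound: list[tuple[float, float]],
                    site_trust: float) -> float:
    """Combine a page's self-declared commercial intent (0.0-1.0)
    with the intents expressed by its inbound links, each given as
    (intent, trust-in-the-linking-site). Returns a trust-weighted
    average intent score. Purely a sketch."""
    total_weight = site_trust
    total = on_page * site_trust
    for intent, trust in inbound:
        total += intent * trust
        total_weight += trust
    return total / total_weight if total_weight else 0.0

# A page that declares "buy this" (1.0) and whose trusted inbound
# links largely agree ends up with a high combined intent score:
score = combined_intent(1.0, [(1.0, 0.9), (0.9, 0.8)], site_trust=0.7)
```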

Carsten Cumbrowski
Cumbrowski.com – Internet Marketing Resources



[i] Rustybrick (November 17, 2005), “Google Knows Link Networks Well”, Search Engine Roundtable

[ii] Yahoo! Mindset, experimental search, Yahoo!

[iii] “Detecting Online Commercial Intention”, MS adCenter Labs; determines the probable commercial intention of a query or URL

Carsten Cumbrowski has years of experience in affiliate marketing and knows both sides of the business, as affiliate and as affiliate manager. Carsten has over 10 years of experience in web development and 20 years in programming and computers in general. He has a personal Internet marketing resources site at Cumbrowski.com. To learn more about Carsten, check out the “About Page” at his web site. For additional contact options see this page.



7 thoughts on “Fixing Google Web 2.0 Style IV”

  1. I hadn’t seen the Microsoft commercial intent tool before. It is interesting.

    I tested one of my affiliate sites that gets good traffic from MSN.

    Result: NonCommercial (Page)

    Probabilities for Each OCI Type:
    NonCommercial Prob.: 0.98638
    Commercial-Informational Prob.: 1.3001e-002
    Commercial-Transactional Prob.: 6.1891e-004

    I suppose that classes as under the radar.

    It does have lots of good information not available elsewhere, and tons of links to external edu and gov resources.

  2. So I assume that the page was not a product detail page with an “add to cart” button, but provided information about the product: what it does, its benefits, etc. Maybe a review?

    It does not sound like a sales letter page to me.

    Does the affiliate link go straight to a product detail page that has only little information about the product, with the clear purpose that you buy the product here and now?

    The MS tool is a beta for testing anyway, but the direction is the right one. Provide webmasters the ability to indicate intention, verify that against the indicated intentions of inbound links and weigh each intention expressed against how much you trust the site itself and the sites that expressed intentions via their links.

    If all the factors (on-page factors and the intentions expressed by internal and external links) say “BUY THIS”, send users who expressed their intent to BUY THIS to that page.

    If the user's intention is BUY SOMETHING LIKE THIS, send him to the soft-sell page with some additional benefits and a free trial to check it out.

    If the intention is I AM A STUDENT FOR 20 YEARS AND (CAN’T) BUY NOTHING, send him to my competitor and waste his bandwidth :)

  3. Well, the site I tested was rated a lot less commercial than my public blog, yet it has a lot more advertising and much less content.

    Adsense on every page
    Chitika mini mall on a product page
    The rest of the product pages are mainly made up of Linkshare and Adsense
    There is also a page that has quite a few Clickbank links through a redirect.

    Maybe the tone of the information that is actually provided is very “research” and not sales.

  4. See, but improving on figuring that out is a step in the right direction, IMO.

    The public efforts for this kind of research by Yahoo! and Microsoft should be noted and honored.

    Don’t you think?

  5. First time reader on your site, and I really like it.
    heres the deal:
    Google has this problem because they try to be very website friendly. Google could change the way they index websites and reduce the spam, but then websites that depend more on Diggs and stuff, like my personal blog, would not get as high in search results.

  6. SEs are getting smarter with each passing year. Making a spider behave and think with a normal human mindset would make it easy to detect paid reviews, because more than 80% of paid reviews are done because the webmasters hope that the readers will be convinced by the reviews and purchase the products.

  7. I hope the robots can really detect paid reviews, because more and more websites are buying paid links and reviews that appear throughout the search results, which makes the SERPs irrelevant.