
Fixing Google Web 2.0 Style

How to fix what is broken and not break what is not 

This is a series of four posts in total. See Part 2, Part 3 and Part 4 

I wrote in December that PageRank is dead[i] and that measures such as repurposing the NOFOLLOW attribute (its original intent was not what Google is pushing for today) would not save it. Around that time, Tetsuto Yabuki, aka Halfdeck, wrote at his blog at SEO4Fun.com why Google will not move away from PageRank[ii].

Two conflicting opinions on the same subject, it seems. Do they really contradict each other? I would say today that they do not; they are both simply incomplete.

Things are heating up. Actually, they have been for over two years already, but I think we are getting closer to the boiling point, and things might start getting ugly. Some might say they already have.

I believe that most people would agree with my statement that Google’s search results are getting steadily worse and that spam heavily pollutes them.

Back in Time – “Florida” Update Aftermath
Link spam[iii] became an increasing problem after Google weeded out cross-linking schemes and link-farm swindles with its infamous “Florida” update at the end of 2003, rendering those methods of boosting a site’s ranking virtually useless. The rise of blogs and social media aided this method of “gaming the search engines”. Comment and trackback spam annoyed not only the search engines, but also the people who operate the blogs and community sites.

The Solution Called “NOFOLLOW”
In January 2005, Google introduced[iv] the NOFOLLOW attribute to the world as a solution to the comment spam problem, at least for Google. Other search engines followed and adopted support for the attribute in similar fashion.
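To make the mechanics concrete: a NOFOLLOW’ed link is simply an anchor tag carrying rel="nofollow". The sketch below (the class name and sample markup are mine, not anything Google publishes) shows how a crawler-side filter might separate links it may credit from links it should ignore:

```python
from html.parser import HTMLParser

class NofollowAuditor(HTMLParser):
    """Collects a page's links, separating those marked rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = []     # links a crawler may credit
        self.nofollowed = []   # links a crawler should not credit

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        # rel is a space-separated token list; check for the nofollow token.
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

sample = '''
<p>Editorial: <a href="http://example.com/article">a good read</a></p>
<p>Comment: <a href="http://spam.example" rel="nofollow">cheap pills</a></p>
'''

auditor = NofollowAuditor()
auditor.feed(sample)
print(auditor.followed)     # ['http://example.com/article']
print(auditor.nofollowed)   # ['http://spam.example']
```

A blog platform that NOFOLLOWs its comment links is, in effect, promising crawlers that everything landing in the second list carries no editorial endorsement.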

It did not take long for everybody to realize that the NOFOLLOW attribute does not work the way everybody hoped. Instead of admitting failure, Google started to change course and zero in on other issues. The other issue is more of a problem for Google than for webmasters or users, but the NOFOLLOW attribute might come in handy to take care of it. The issue I am referring to is Google’s problem with “paid links”.

Change of Plans
Matt Cutts from Google made a comment[v] as early as 24 August 2005 on Tim O’Reilly’s blog, where he suggested using the NOFOLLOW attribute for “paid links”, but it did not get much attention at the time. He mentioned it again[vi] a month later at his own blog, which sparked a short but intense debate. That debate was soon displaced by something else that required more attention: another major Google update was just starting around the same time.

Interrupted by “Jagger”
The infamous “Jagger” update started at the end of 2005, followed by the so-called infrastructure update “BigDaddy”. Both updates did little to stop the negative trend, but caused a whole lot of other issues instead. Who had heard about duplicate content issues and penalties, or knew what a canonical URL was, back during the holidays in 2004? I did not even know the word “canonical” before early 2006, when Matt Cutts mentioned it at his blog. I only vaguely remember having heard the German word “kanonisch” before, in some math lesson. I also did not spend much time in Christian churches, so I had no chance to pick it up there.

These new problems overshadowed the “paid links” discussions for most of 2006. “Paid links” were discussed, and the NOFOLLOW attribute was mentioned in combination with them, too. The discussion heated up again at the end of 2006.

Picking Up Again
Criticism for being hypocritical[vii] is only one complaint about Google’s position and intentions. The argument that Google’s motto, “build websites for the user and not for search engines”, is becoming worthless is not correct. Search engines play an important role in the internet economy, and no business can afford to ignore them for long. Creating the things “under the hood”, which are invisible to the user but visible to the search engines, in a way that search engines can access and understand, is something you should not simply neglect.

Some things on a page can be ambiguous, and it makes perfect sense to me for webmasters to have a way of telling search engines what they mean by them and what the intent and purpose is. Search engines cannot read your mind, and ranking algorithms are not based on artificial intelligence that can comprehend the content of a page, understand the purpose of a site, and know the person who created it.

Maybe they will one day, but that day is not in the very near future. Until then, it is necessary to work on alternatives to solve today’s problems. If the replacement of big general-purpose search engines by niche-oriented vertical search engines[viii] happens along the way, fine with me. It is only one of several possible options, and not one I will simply ignore.

Webmasters will help search engines understand and clarify ambiguities if they benefit from it and are not stabbed in the back instead. That does not necessarily mean getting more traffic, but more targeted traffic that converts better.

Google decided to take things a step further[ix] and encourage people to defame[x] others who “misbehave”. I call it defamation, because reporting only makes sense in cases that are not obvious; the obvious ones are already known to Google anyway. Using defamation as an instrument to control people, and people using that instrument against each other for personal gain or self-protection, is one of the less favorable traits of man[xi]. This is different from the spam report form at Webmaster Central. That console only allows the reporting of spam in the search index that clearly violates Google’s guidelines; the idea there is to rid the SERPs of this clutter entirely. What is the idea behind reporting a paid link on a site that is otherwise perfectly fine: removing the site from the Google index? You are kidding me, right?

Continue with Fixing Google Web 2.0 Style – Part 2 of 4

Carsten Cumbrowski
Cumbrowski.com – Internet Marketing Resources



[i] Carsten Cumbrowski (December 21, 2006), “PageRank is dead and NOFOLLOW will NOT Save it!”, SearchEngineJournal.com

[ii] Tetsuto Yabuki aka Halfdeck (December 13, 2006), “Why Google Will Not Move Away From PageRank”, SEO4Fun.com

[iii] Charles Arthur (January 31, 2005), “Interview with a link spammer”, The Register (UK)

[iv] Matt Cutts / Google Software Engineer, Jason Shellen / Blogger Program Manager (January 18, 2005), “Preventing comment spam”, Official Google Blog

[v] Tim O’Reilly (August 23, 2005), “Search Engine Spam?”, O’Reilly Radar. See Matt Cutts’ second comment, posted 08.24.05 09:31 AM.

[vi] Matt Cutts (September 1, 2005), “Text links and PageRank”, MattCutts.com

[vii] Michael Gray aka Graywolf (January 25, 2007), “Google’s Policy on No follow and Reviews is Hypocritical and Wrong”, Wolf-Howl.com

[viii] Jason Prescott (April 27, 2007), “Is Google Killing SEO?”, iMediaConnection.com

[ix] Raj Dash (April 15, 2007), “Google To Go After Paid Links”, SearchEngineJournal.com

[x] Matt Cutts (April 14, 2007), “How to report paid links”, MattCutts.com

[xi] Carsten Cumbrowski (April 15, 2007), comment about mistrust and Stasi methods at Matt Cutts’ blog, MattCutts.com

Carsten Cumbrowski has years of experience in affiliate marketing and knows both sides of the business, as affiliate and as affiliate manager. Carsten has over 10 years of experience in web development and 20 years in programming and computers in general. He has a personal internet marketing resources site at Cumbrowski.com. To learn more about Carsten, check out the "About Page" at his web site. For additional contact options, see this page.


18 thoughts on “Fixing Google Web 2.0 Style”

  1. Thanks for the free editorial non-paid links :D

    If Google just wanted to devalue paid links, nofollow wouldn’t be necessary. Instead, Google wants to punish link sellers. But without nofollow, algorithmically punishing link sellers/buyers would result in too much collateral damage. That’s why Matt Cutts is issuing a warning ahead of time.

  2. Tetsuto, you know where to send the check to, I mean the “thank you card”. :)

    And you seriously believe that they will get the serious link buyers and sellers who do this to game the engines? That normal webmasters with no bad intentions will not be the ones left out in the rain?

    Really?

    Maybe I am living in a parallel universe, but where I live, it does not work that way.

    I bet that conferences will see an increase in attendance. It is the best place to continue business as usual and to have a laugh at Google and the poor webmasters that will try to find out what hit them and why.

    Links will become more expensive, though. That is a normal thing to happen when something in high demand is taken off the open market, because the demand will not change as a result of this.

    I should consider jumping into the link selling business. Great profit margins, treated nicely by everybody, perks left and right…

    “thank you ,thank you. You are so nice. I kept you a nice and clean .EDU link, because I like you so much.”

    Living the life of a pimp without breaking the law. Great!

  3. Well, I think Google is working on this problem right now. They have launched “Web History”, which is automatically enabled when anyone logs into their Google account; it tracks and records, in full text, any and all sites the user visits and makes them available for later use. By doing this, Google will be able to determine which sites are being visited more than others, how long a user stays on each site, and if or how often they return to the site. Using this information, they will be able to more effectively categorize and arrange the search results for each individual query.

  4. The link above is the article I read about Google. I don’t know if it will work or not.

  5. It’s time to bring Google down to earth, and the only way I feel it can happen is through competition. I’m already having a brainstorming session on my site and hope you don’t mind if I reference this site.

  6. Awesome post, and it definitely brings up some good points. Hopefully Google will not go overboard with the “nofollow” attribute, as a number of sites would lose MASSIVE income. Also, the sites that can generally afford good links also have good content. I hardly see a “shady” site or content in the top 50 results that isn’t directly related to what I want.

    I do think Google should take a look at snap.com and apply some of their features. The feature I really like is having users “rate” results.

  7. Let’s face it, Carsten:

    – PR is overrated
    – the Google algo relies too much on backlinks
    – Google cannot differentiate between legitimate authoritative sites and those that are buying links for their MFA sites

    Google has a toolbar and Analytics at its disposal. Google can monitor user activity to check who bookmarks sites, who prints articles, who forwards articles, and a slew of other activities to more accurately identify important sites.

    I agree with your comments on verticals. It is the way of the future. I have recently blogged about verticals and the ineffectiveness of current turn-key directories.

    http://www.youshouldknow.com/directories/directory-owners-%e2%80%a6-incredible-bright-people-doing-incredible-stupid-things-25.html

    Let’s give Matt Cutts and Google a little credit, just a little; we need to curb the illegitimacy of the many sites whose PR is purely a reflection of paid links.

    I expect links on social sites such as Digg, Technorati and stumble to be devalued in the near future.

    cheers and keep up the good word!

  8. I still recommend a new attribute for paid links: rel-PAIDLINKS. Buying and selling links is something that will prevail on the Internet, and not just because it boosts SE rankings. Purchasing a good targeted link makes sense from many different angles.

    With respect to a search engine, a new attribute for paid links would improve the knowledge available to SEs about what is going on. Then they can determine what they want to do with the value assigned to those links.

    For example, in my hypothetical search engine, I would evaluate the PAIDLINK attribute and compare the ratio of paid links to non-paid links, followed by the PR of the links, followed by the “trustrank”, followed by the relevance of those links to the web page, and so on. If that PAIDLINK attribute weren’t there, I’d be in the dark as to the reason for those links being on that web page.
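    (Editor’s sketch of the ratio idea in the comment above. The rel="paidlink" attribute is hypothetical, as is the scoring; no search engine publishes such a mechanism.)

```python
def paid_link_ratio(links):
    """links: list of (href, rel) tuples as a crawler might extract them.
    Returns the fraction of links declared paid via a hypothetical
    rel="paidlink" token; 0.0 for a page with no links."""
    if not links:
        return 0.0
    paid = sum(1 for _, rel in links if "paidlink" in rel.lower().split())
    return paid / len(links)

page_links = [
    ("http://example.com/partner", "paidlink"),   # declared paid
    ("http://example.com/editorial", ""),          # editorial link
    ("http://example.com/friend", ""),             # editorial link
]
print(round(paid_link_ratio(page_links), 2))  # 0.33
```

    Such a ratio would be just one input among many (PR, trust, topical relevance), exactly as the commenter describes.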

  9. Google is pitting the search engine spammers against one another. Probably a smart move, letting competing spammers serve as police against each other.

  10. Google Adwords is a short term advertising service.

    Google will have to start offering long term advertising services.

  11. There has been a lot of discussion about lowering the value of links from contextually unrelated resources. But what about nice-looking links that are “on theme”? How do you determine that?

  12. Fast Test Geek,

    Thanks for the comment.

    I think that all search engines are getting better and are already doing quite a good job of semantically determining whether one page is related to another in topic (not purpose).

    Purpose or intent is something that cannot be determined that well using semantics, except for straightforward link anchor text like “buy v*agra” (hehe).

    That is the part where they could get help from webmasters. Nofollow, and how they want to use it, is no help, but rather the opposite. See post III for some of my suggestions.

  13. I am hoping that Google’s “warning” serves as just that: something said to get people to examine the way they do business. Unfortunately, link farmers aren’t the only ones who will suffer, as legitimate sites will be hurt too.