Google’s Patent Application Analysis
Google’s Patent Application is a long read and working through it may take some time, but if you own any type of website, this is information you need to know. It also brings up some interesting points. While I go over some of the important ones below, keep in mind that no one knows which of these factors is given more weight than the others.
Domain Name Registration – Google is now going to track when a domain is registered, among other things. An older domain will get a higher ranking. No more throwaway domain names. No more jumping to the top of Google’s results in thirty days.
They will also be tracking the length of renewal on the theory that a person that renews for ten years will be more likely to build a worthwhile site than someone who only holds their domain for a year.
Google will also be keeping a blacklist of known spammers and will be using this list when checking the DNS records of websites. So spammers who register their new throwaway domains with different nameservers in order to throw Google off may have to try something new.
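To make the idea concrete, here is a toy sketch of how domain age and renewal length might be combined into a single trust signal. The weights, the saturation points, and the formula itself are my own invention for illustration; the patent describes the signals, not a scoring function.

```python
from datetime import date

def domain_trust_score(registered: date, renewal_years: int, today: date) -> float:
    """Toy trust score in [0, 1]: older domains and longer renewals score
    higher. The 5-year and 10-year saturation points and the 70/30 weighting
    are illustrative assumptions, not anything from the patent."""
    age_years = (today - registered).days / 365.25
    age_score = min(age_years / 5.0, 1.0)           # saturates after ~5 years
    renewal_score = min(renewal_years / 10.0, 1.0)  # saturates at a 10-year renewal
    return 0.7 * age_score + 0.3 * renewal_score

# A month-old domain renewed for one year scores close to zero,
# while a seven-year-old domain on a ten-year renewal scores near the top.
print(domain_trust_score(date(2005, 5, 1), 1, date(2005, 5, 31)))
print(domain_trust_score(date(1998, 1, 1), 10, date(2005, 5, 31)))
```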
Google Spyware? – They are using “user behavior” to rank sites. In my book, if spyware removers try to remove Alexa every time I run a scan, then this function of the Google toolbar can only be called spyware. Yes, you may check the box on the terms of service for the toolbar, but it still tracks your internet browsing.
But, I think the theory will make search engine results much better.
Google will be tracking the number of times a document is selected from the search engine results. This is a great idea. It means you now have to write the titles of your pages to grab the searcher’s attention. And since the search terms are highlighted in the results, placing them at the beginning of sentences on your page may make them stand out due to capitalization. But I also see a way that this can be gamed by a network of “search and click” spammers.
They will also be tracking the amount of time a person spends on a page they find. I don’t know about you, but I have been around long enough to spot a spam page, and I am gone in two seconds. This may help drop such pages out of legitimate results.
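Putting the two ideas together, a combined click-through/dwell-time signal might look something like this toy sketch. The 30-second saturation point and the multiplicative combination are assumptions of mine, not anything stated in the patent:

```python
def result_quality(clicks: int, impressions: int, dwell_times: list) -> float:
    """Toy signal combining click-through rate with average time-on-page.
    A page that searchers abandon in ~2 seconds contributes almost nothing,
    even if its title attracts plenty of clicks."""
    if impressions == 0 or not dwell_times:
        return 0.0
    ctr = clicks / impressions
    avg_dwell = sum(dwell_times) / len(dwell_times)
    dwell_factor = min(avg_dwell / 30.0, 1.0)  # saturates at 30 seconds
    return ctr * dwell_factor

# Identical click-through rate, very different dwell times:
spam = result_quality(clicks=40, impressions=100, dwell_times=[2.0, 1.5, 3.0])
good = result_quality(clicks=40, impressions=100, dwell_times=[45.0, 90.0, 120.0])
# the quickly-abandoned page scores far lower despite getting the same clicks
```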
Content Changes – I think this comes down to updating your information the way it should be updated. If your forum hasn’t been active in a week, a competing forum that is very active, with new posts every minute, will definitely rank higher.
But the document also mentions that some stale sites may not be ranked lower for infrequent updates. For example, a site on the Civil War will not be expected to change as much as a news headlines site, and the older, more stable site may get the rank boost.
Query Analysis – A search for “American Idol Winner” will produce different results than it did last year, even if a page on last year’s winner has more links pointing to it.
Google will be following trends by the increase or decrease in the usage of certain search terms or phrases. I am not sure how this will be implemented. Will there be a quicker ranking algorithm for new trends? Or will sites that have a tendency to break new topics get top billing for such terms?
The search engine will also be sensitive to terms that could be used for different subjects. When you search for “Deep Throat” are you looking for Mark Felt or a Linda Lovelace movie? Google will track what searchers are actually looking for and changes in searching trends.
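A simple way to picture the trend tracking is to compare a phrase’s recent search volume against its earlier baseline. This sketch is purely illustrative; the four-week window and the use of weekly counts are arbitrary choices of mine:

```python
def query_trend(weekly_counts: list) -> float:
    """Ratio of a phrase's recent average weekly search volume to its
    earlier baseline. Values well above 1.0 suggest a rising phrase
    (a new "American Idol" winner); values below 1.0 suggest a fading one.
    With fewer than ~8 weeks of data, the baseline is too thin to trust."""
    recent = weekly_counts[-4:]
    baseline = weekly_counts[:-4] or [1]  # guard against an empty baseline
    recent_avg = sum(recent) / len(recent)
    baseline_avg = max(sum(baseline) / len(baseline), 1)
    return recent_avg / baseline_avg

# Ten flat weeks followed by a four-week surge reads as a strong rise:
print(query_trend([10] * 10 + [100] * 4))
```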
A Google Browser? – Google also says that they will attempt to track bookmarks and favorites files along with cache files to help determine the ranking of sites. The only way I see this happening is through their own browser and again, this brings up the question of spyware.
Topics – Pages will now be tracked for the topics they cover. Maybe this is what Site Flavored Search is all about. Google says that changes in topic will be traced for scoring, so a drastic change in a site’s subject matter may drop it down in the search results. I think this must already be in effect, judging by some of the things I have seen with my own sites.
Anchor Text – Google says that links to pages from other sites tend to have differing anchor text if they are obtained naturally. Artificial linking campaigns tend to produce anchor text that is the same.
Anchor text that changes when the page the link is on changes will be counted as being more relevant.
Anchor text that changes with time may indicate a change in topic on the site.
Anchor text that is no longer relevant to the site linked to may be discounted.
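The anchor-text point is easy to illustrate: a naturally linked page accumulates varied anchors, while an artificial campaign repeats one phrase. A toy diversity measure (my own, not the patent’s) might be the fraction of distinct anchor phrases among all inbound links:

```python
def anchor_diversity(anchor_texts: list) -> float:
    """Fraction of distinct anchor phrases among all inbound links.
    Identical anchors everywhere (a common sign of an artificial linking
    campaign) push this toward 0; natural variation pushes it toward 1."""
    if not anchor_texts:
        return 0.0
    normalized = [a.strip().lower() for a in anchor_texts]
    return len(set(normalized)) / len(normalized)

natural = ["Stephan Miller", "this blog", "a great SEO writeup", "his site"]
campaign = ["cheap widgets"] * 4
print(anchor_diversity(natural))   # 1.0
print(anchor_diversity(campaign))  # 0.25
```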
Traffic – Google will track traffic to a page to determine whether its content is stale. This is a cue that sites can no longer be create-and-forget. Google will also factor in advertising traffic.
Linking – Google says that legitimate sites attract links back slowly. Whether this is true or not depends on the definition of “slowly”. I know of sites like stumbleupon.com, where users constantly comment on and rate sites, and one site sent into the mix can get hundreds of links within a day just from the comments posted about it.
Google also says that exchanging links, purchasing links, or gaining links from documents where there is no editorial discretion are all forms of link spam. Does this mean that if you link to someone and they link to you, that is spam? Then a lot of bloggers out there who aren’t really trying to spam may get accused of doing so.
They will also be measuring the authority of the page that the links are on, mentioning government documents specifically. This smacks of information control. Who assigns this authority, and what makes one person more of an authority than another? If a political issue is searched for, will a Democrat’s or a Republican’s page come up first?
The freshness of the page that the link is on will also help determine the freshness of the linked-to page. This is a good argument for using a blog and pinging after your entries.
A page that is updated while a link on that page remains the same is a good indicator that the link is still relevant.
Ranking History – Ranking change is another signal Google will use to detect spam. Not all sites that see a huge jump in ranking will be flagged as spam, though. Some of these sites could be topical; their authors may have caught onto a new trend just as it was rising.
But Google will also measure the change in a site’s ranking to determine whether the content is becoming stale, for example through a drop in the number of links pointing to the site.
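A crude way to model the spike detection hinted at here is to compare the latest week’s new-link count against the trailing average. The factor of five below is an arbitrary threshold I chose for illustration; nothing in the patent gives actual numbers:

```python
def looks_like_spike(weekly_link_counts: list, factor: float = 5.0) -> bool:
    """Flag a week whose new-link count jumps far above the trailing
    average. A flagged spike would merit closer inspection, not an
    automatic spam penalty: topical sites spike legitimately."""
    if len(weekly_link_counts) < 2:
        return False
    *history, latest = weekly_link_counts
    avg = sum(history) / len(history)
    return latest > factor * max(avg, 1)

print(looks_like_spike([3, 4, 2, 5, 120]))  # True
print(looks_like_spike([3, 4, 2, 5, 6]))    # False
```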
Now this must mean some sort of balance, and I hope they leave leeway for traditional SEO. For example, if you have written new software and have created a PAD file for it, you can literally get hundreds of new links in a week. It only takes a second to submit.
And what if you start your own affiliate program? You can get a lot of links quickly that way. Will Google see this as spam? We will have to wait and see.
Finally, Hope – Competition always inspires a better product and more options for internet users. Despite the focus on Google in search engine forums and its name being used as a verb for searching the internet, i.e. “I Googled him”, Google’s hold on the market has actually dropped.
Where once you could optimize for Google and leave it at that, the combined use of MSN and Yahoo is now greater than Google’s, with Yahoo nipping at Google’s heels.
This leaves options for us as search engine marketers and internet searchers. If one search engine doesn’t suit us, at least we know it isn’t the only one we have to choose from.
Columnist Stephan Miller Blogs at http://www.stephanmiller.com