Google Loves BOTW – You Should Too
The last week has felt like a trip in Bob Massa’s Magical SEO Time Machine, as directory listings, SEO relevance, and trust-building links have taken center stage in many a search engine discussion. One of the latest SEO Excellent Adventures is taking place on Shimon Sandler’s blog, with an interview with Best of the Web‘s Greg Hartnett.
It seems that in a world of listless PHP Link-powered directories popping up here and there, taking whatever submissions and links they can get for $2 or $5 a pop, directories such as Best of the Web, Rubber Stamped, Gimpsy, Joe Ant, and even Seven Seek & WowDirectory (which is offering $5 Express Reviews this week) are really starting to shine again.
Speaking of Best of the Web, my favorite part of the interview is the discussion of how important it is to submit to the large, human-edited, trusted directories that have been around for a while. BOTW has been in business since 1994 and (I never realized this until the interview) was cited by Larry Page and Sergey Brin for its quality of search:
Q: What was BOTW’s involvement with the thesis “Backrub” by Sergey Brin and Larry Page?
A: In the paper “The Anatomy of a Large-Scale Hypertextual Web Search Engine”, Page and Brin cite BOTW in regards to quality of search. So, we had pretty much no involvement, save the role of grateful recipient of the citation :)
So, time to dig into Larry and Sergey’s old Stanford papers…
1.3.1 Improved Search Quality
Our main goal is to improve the quality of web search engines. In 1994, some people believed that a complete search index would make it possible to find anything easily. According to Best of the Web 1994 — Navigators, “The best navigation service should make it easy to find almost anything on the Web (once all the data is entered).” However, the Web of 1997 is quite different. Anyone who has used a search engine recently, can readily testify that the completeness of the index is not the only factor in the quality of search results. “Junk results” often wash out any results that a user is interested in. In fact, as of November 1997, only one of the top four commercial search engines finds itself (returns its own search page in response to its name in the top ten results). One of the main causes of this problem is that the number of documents in the indices has been increasing by many orders of magnitude, but the user’s ability to look at documents has not. People are still only willing to look at the first few tens of results. Because of this, as the collection size grows, we need tools that have very high precision (number of relevant documents returned, say in the top tens of results). Indeed, we want our notion of “relevant” to only include the very best documents since there may be tens of thousands of slightly relevant documents. This very high precision is important even at the expense of recall (the total number of relevant documents the system is able to return). There is quite a bit of recent optimism that the use of more hypertextual information can help improve search and other applications [Marchiori 97] [Spertus 97] [Weiss 96] [Kleinberg 98]. In particular, link structure [Page 98] and link text provide a lot of information for making relevance judgments and quality filtering. Google makes use of both link structure and anchor text (see Sections 2.1 and 2.2).
So, there you have it.
Now it’s time to cough up $150 to BOTW to have your site reviewed and, hopefully, listed in the directory that Google has considered to be of high search quality for almost 10 years.