How Efficient is Digg?


You would imagine that, given the limited venture capital budgets of social media sites like Digg.com, they would be scrutinizing every cost-saving measure to ensure they are operating at maximum efficiency. A few recent posts on Digg suggest that this may not be the case.
According to a post submitted by vicnick,

Digg.com is not gzipped. Page Size: 41 K, Size if Gzipped: 9 K, Potential Savings: 78.05% !!! That’s a potential bandwidth saving for Digg as well as the end-user.

I looked at the statistics from WhatsMyIP.com.

As the utility points out, Digg could save substantially on bandwidth by gzipping the site's content, and even improve the site's performance for visitors on dial-up modems.
That, however, is not the complete story. While you may save on bandwidth, two problems can arise:
1. gzipping can be very CPU intensive.
2. AJAX doesn’t play well with mod_gzip.
The first problem means that you may ultimately end up not saving much money from gzipping. Before you can determine the economically optimal choice, you have to weigh how much you would save in bandwidth-related costs against how much additional investment in hardware you would need for the extra processing power. The second problem means that you wouldn't be able to use AJAX-related enhancements on Digg (e.g. digg/spy).
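For what it's worth, the Apache 2.x successor to mod_gzip is mod_deflate, and turning compression on selectively takes only a handful of directives. The following is just a rough sketch, not Digg's actual configuration; the MIME types and browser workarounds are the illustrative ones from the Apache documentation.

<IfModule mod_deflate.c>
  # Compress text content only; images and other binaries are already compressed
  AddOutputFilterByType DEFLATE text/html text/plain text/css application/x-javascript
  # Skip or limit compression for old browsers with broken gzip support
  BrowserMatch ^Mozilla/4 gzip-only-text/html
  BrowserMatch ^Mozilla/4\.0[678] no-gzip
  BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
</IfModule>

The CPU-versus-bandwidth trade-off still applies, of course; if processing power becomes the bottleneck, the compression level can be dialed down with mod_deflate's DeflateCompressionLevel directive.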
Furthermore, fkr points to what may be an inefficient use of Digg's 75 servers. According to Markus Frind's calculations, Digg serves about 7 million page views a day, which works out to an average of 81 pageviews every second. Dividing that by the number of servers Digg has, he calculates that Digg is displaying 1 pageview per server per second. He ultimately concludes that,

I think digg.com wins the worst infrastructure/setup award of any major site hands down. If their [MySpace] infrastructure was as bad as digg.com's they would need 18,750 servers!!!

I would love to hear some more technical input on this, preferably from the Digg team.
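For reference, Frind's arithmetic works out as follows, averaging the traffic evenly over a 24-hour day:

7,000,000 pageviews / 86,400 seconds ≈ 81 pageviews per second
81 pageviews per second / 75 servers ≈ 1.1 pageviews per server per second
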
The last story that addresses potential inefficiencies on Digg comes from Oatmeal, and his offer to save Digg a million dollars just by using 3 lines of code. Of course, the title of the post is an exaggeration, but that is not to discount the point of the linked article. As Matthew Inman points out, Digg is not using 301 redirects to enforce preferred canonical URLs, and consequently its search engine rankings and the resulting traffic to the site are suffering.

The difference in click-through rates for the top three [ranked search results in Google] versus 4-10 are incredibly substantial. Click-throughs from Google mean more visitors to Digg from a broader audience. This audience might be inclined to click on some of your ads, meaning more money in your pocket.

The three lines of code that would fix this are,

RewriteEngine On
RewriteCond %{HTTP_HOST} !^digg\.com$ [NC]
RewriteRule ^/(.*) http://digg.com/$1 [R=301,L]
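
With those rules in place, a request for any hostname other than digg.com gets a permanent redirect to its digg.com equivalent. As a hypothetical example (the story path is made up), a crawler fetching a page from the www subdomain would see:

GET /some-story HTTP/1.1
Host: www.digg.com

HTTP/1.1 301 Moved Permanently
Location: http://digg.com/some-story

Because the redirect is a 301 (permanent) rather than a 302 (temporary), search engines can consolidate the link popularity of both hostnames onto digg.com instead of splitting it between www.digg.com and digg.com.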
