
Site Speed May Soon Affect Google Page Ranking

In a recent interview with Google engineer Matt Cutts, Web Pro News reports that Matt may have hinted at how Google will rank websites in the future. And guess what? It has something to do with how fast a website actually loads.

During the interview Matt told Web Pro News:

“I think a lot of people in 2010 are going to be thinking more about ‘how do I have my site be fast, how do I have it be rich without writing a bunch of custom JavaScript?’”

Of course, it is a given that the speed at which your site loads matters a lot to your visitors. Slow-loading sites drive away potential visitors, while fast-loading sites encourage them to stay longer. And this is what Google has been advocating recently: making the Internet experience of users a little bit faster.

It is not yet clear how Google is going to implement site ranking based on site speed, if indeed there is such a plan.

But just in case it pushes through – are you ready for some site speed optimization? SSO side by side with SEO might just become the norm in the coming days, don’t you think?

Arnold Zafra writes daily on the announcements by Google, Ask.com, Yahoo & MSN, along with how these announcements affect web publishers. He is currently building three niche blogs covering iPad News, Google Android Phones and E-Book Readers.



16 thoughts on “Site Speed May Soon Affect Google Page Ranking”

  1. From a Google perspective, it’s smart – it’s going to hurt the Microsoft world. Most ASP.NET developers are software developers, not web developers – they either don’t understand or don’t care about the fact that excess whitespace slows stuff down, so they write JavaScript using C# syntax, full of pointless whitespace.

    Microsoft has made the problem even worse, with the default setting for Visual Studio putting in four spaces instead of a tab when you hit the tab key. ASP.NET developers are doomed to be slobs before they even start.

    Even real web developers in the ASP.NET world, who make efforts to write minimal code, will still be at a disadvantage because the Microsoft-generated JavaScript (that can’t be modified) is bloated crap, written by morons who are clueless about client-side code.

    Also, IIS doesn’t have compression turned on by default, and since most Microsofties don’t give a crap about client-side code, they rarely think to turn it on.
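
The compression point above is easy to check from the outside. As a rough illustration only, the following Python sketch asks a server for a gzip response and reports how much the negotiated compression saves; the URL is a placeholder, and a missing Content-Encoding header here only means the server did not negotiate gzip for this particular request.

```python
# Rough sketch, not a complete audit: ask for a compressed response and compare
# the on-the-wire size with the decompressed size. The URL is a placeholder.
import gzip
import urllib.request


def check_compression(url: str) -> None:
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        encoding = resp.headers.get("Content-Encoding", "")
        raw = resp.read()  # urllib does not decompress for us

    if encoding == "gzip":
        decompressed = gzip.decompress(raw)
        print(f"gzip is on: {len(raw)} bytes on the wire, "
              f"{len(decompressed)} bytes decompressed")
    else:
        print(f"no compression negotiated ({len(raw)} bytes served as-is)")


if __name__ == "__main__":
    check_compression("http://www.example.com/")  # placeholder URL
```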

  2. It is not about the mass of code; it is mostly about the server. I moved a website from shared hosting (where at least 120 other websites were hosted) to a dedicated server… that helped a LOT! Then we hit the ceiling because the $70 dedicated server was not enough to handle the increasing traffic, so we went to a top-of-the-line $250 machine, and finally, when we hit the ceiling again, we went to two web servers ($200 each), a dedicated load balancer ($70) and a database server ($250), plus a special deal for 15 TB of traffic… and when the time for fetching our pages goes up to 0.3s, we will add more power.

    Also, we do not have funny loops or kilos of garbage in our CSS, JavaScript and HTML.

  3. I find this to be good news. I find it frustrating when a website is extremely slow. I’ve always been willing to pay for hosting… and see this as something that will help our sites rank well.

  4. OK. It means that Flash-based websites, which usually load slowly, are in more trouble in the coming days. Of course, website loading speed depends on server infrastructure, but some websites are also just heavy!

    It’s not good news for small website owners who can’t afford dedicated servers or a CDN to support their websites. It seems like Google is planning to give an advantage to big website owners/brands!

    1. I don’t think whether or not you’re using a CDN is going to be a huge factor. I don’t think it’ll hurt, but if you look at all the apps that measure website performance, most of them have 15-20 measurements, and usually only one of those is related to a CDN.

      If you take tools like YSlow, Google Page Speed, or webpagetest.org and measure a site, it’s almost guaranteed that you’ll find something you can improve. Many of the improvements are actually pretty simple, although some will require a new approach to development, and hopefully web designers will use the tools to improve the design process.

      I’m quite excited about this. Google has the power to help change the web for the better. Ranking faster sites higher is a good first step; hopefully we’ll see this extended to rank sites with valid HTML higher too.

  5. Whitespace is not much of an issue because with DEFLATE/GZIP turned on, it compresses to almost nothing.

    With its own backbone, Google is well positioned to assess the availability and response times of websites. Response time has three major parts: 1. DNS resolution time; 2. the time it takes to open a TCP/IP connection with the server; and 3. the time it takes the web server to respond with the content. We imagine Google is looking at all three things – particularly #3, but #1 counts too.

    Once Google gets the content, they can measure the size: is it deflated? A newbie webmaster will forget to compress content. If I were Google, I’d ding a site that wasn’t compressing. It’s a service the host really should be providing to the customer; there are very few reasons not to do it.

    And if I were Google, I’d try to see whether the content on the page is really worthy of its size. For example, the page is 100K and the content boils down to basically “click here for more details.” This is pretty subjective, though; it might be hard to write an algorithm around it.

    Lastly, what kills speed is the *number* of requests per page. So if you have 10 linked external JavaScript files like jQuery, etc., then 5 external CSS files, and then 20 little pixel-push images – by the time the page has finally loaded, the customer has done 60 to 100 hits just to pull down one page. Of course, when Google loads the page, I believe it only pulls down that single page, not all the linked CSS, images, etc.

    I believe Google will start looking at not only the response time of the page and its size but, maybe more importantly, the number of external files that need to be loaded for the page to be fully rendered. As the number of external files grows, it starts to look like the site is just doing shovelware, just throwing garbage on the page – maybe the page will use it, maybe not, but the webmaster is too busy to clean it up, so they just keep throwing more JavaScript and CSS up there. It can be a sign the site doesn’t really have its A-game on.

    CDN or not, it will still count as an external call – I doubt Google cares whether your content comes from a CDN or not. They might actually ding you if it’s a CDN; I mean, a lot of stuff on CDNs is advertising – and who wants to look at that?
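
The three components of response time listed above (DNS resolution, the TCP connection, and the server's response) can be timed separately. The sketch below is a minimal illustration under simplifying assumptions: the host is a placeholder, the request is a plain HTTP/1.1 GET over port 80, and redirects, HTTPS, and the follow-up requests for linked CSS, JavaScript and images are all ignored.

```python
# Minimal sketch of timing the three response-time phases discussed above.
# Plain HTTP on port 80 only; the host below is a placeholder.
import socket
import time


def time_request(host: str, path: str = "/", port: int = 80) -> None:
    t0 = time.perf_counter()
    ip = socket.gethostbyname(host)                            # 1. DNS resolution
    t1 = time.perf_counter()

    sock = socket.create_connection((ip, port), timeout=10)    # 2. TCP connection
    t2 = time.perf_counter()

    request = (f"GET {path} HTTP/1.1\r\n"
               f"Host: {host}\r\n"
               "Connection: close\r\n\r\n")
    sock.sendall(request.encode("ascii"))
    chunks = []
    while True:                                                # 3. server response,
        data = sock.recv(4096)                                 #    first to last byte
        if not data:
            break
        chunks.append(data)
    t3 = time.perf_counter()
    sock.close()

    body = b"".join(chunks)
    print(f"DNS resolution:  {(t1 - t0) * 1000:.1f} ms")
    print(f"TCP connect:     {(t2 - t1) * 1000:.1f} ms")
    print(f"Server response: {(t3 - t2) * 1000:.1f} ms ({len(body)} bytes)")


if __name__ == "__main__":
    time_request("www.example.com")  # placeholder host
```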

  6. This is a really interesting development and an indication that Web app performance is of growing importance to users and the marketplace.

    I wonder if Google will start off with the basic page load guidelines implemented for Google Adwords last year. Here is a link to the description of how that is currently utilized in Adwords: http://adwords.google.com/support/aw/bin/answer.py?answer=93113

    I read the original interview with Matt Cutts that is referred to, and Matt suggests that JavaScript may also be part of the grading system, which indicates that client-side performance will also be evaluated.

    Interesting developments that continue to point out that measurement from anything other than a real web browser is ineffective.

    Ken Godskind
    http://blog.alertsite.com
    @AlertSite_CSO

  7. I am always eager to hear about how Google looks at the world from the perspective of page ranking. But this is a pretty poor excuse for evaluating a site. Perhaps Matt Cutts is merely floating a “trial balloon”. How does Google know that speed matters more than content?
    Thanks for sharing.

  8. I listened to the interview, and Matt pointed out a couple of sites and Firefox plugins that can be used to look at site speed. What jumped out most for me was
    “A lot of people within Google think that the web should be fast, it should be a good experience, and so it’s sort of fair to say, if you’re a fast site, maybe you should get a little bit of a bonus, or if you have a really awfully slow site, then maybe users don’t want that as much.”
    A little bit of a bonus… hmm, and he also mentioned that speed may be used as a metric for Google AdWords quality scoring…
    Should be interesting to see how this plays out…

  9. I think a speed metric within Google’s indexing algorithm is a very positive thing.

    It will encourage companies to think about speed and discourage them from using excessive amounts of third-party media such as poorly performing adverts, Flash-based content, etc.

    I’m sure the speed metric will only count as a small plus towards page ranking and content will always be the primary measure.

    At http://www.getmecooking.com we are implementing features to make the site as fast as possible – much faster than other large cooking sites. See how fast the recipe page dynamically loads 20 recipes at a time. Other sites such as http://haystack.com also do this.

  10. Improving user experience is a desirable goal from both Google’s and a site owner’s perspective.
    However, this move is weighted heavily in favor of large site owners, who have full-time webmasters working for them. These professionals can improve the code and manage the site.
    For small site owners, speed improvements may cost more, as they would have to upgrade to faster servers or hire coders to make changes that may be beyond their own abilities.
    The good news, however, is that site speed will be only one of the 200-odd factors that Google already takes into consideration when ranking pages.
    Hence, while site speed is both important and desirable from the user experience perspective, slow sites may still be able to rank high on Google, based on other factors.

  11. Google encourages poor content when there are tons of good links pointed to it artificially.

    I am really for high-quality content, and believe me, I am frustrated when I see one of my unique, high-quality articles ranking below a poor-content article just because the webmaster pointed a couple hundred backlinks (or more) at it.

    Check e-commerce and you will see many comparison sites doing this, or competitors ranking better because they use keyword stuffing and doorway sites, and sometimes they manipulate your own key phrases to push you down in the search engines.

    I guess you just have to continue doing what you think is best for your site, and let it go, even if it is frustrating.

    my site: http://www.adnpost.com

  12. Does this mean that switching my site over to a dedicated server from a shared server may increase its position in the SERPs?
