
Killing Two Birds With One Stone: Why Page Speed’s Influence On PageRank Is Strategic To Google

November 28th, 2009


Final edit updated Nov 28, 2009 22:40

Random Connections

Some days I’ll be thinking about a topic, and suddenly an unexpected connection appears. That just happened as I was reading two posts, one about potential changes to PageRank and another about a new protocol that Google is pitching to speed up web page loads. The PageRank discussion seemed to be completely about the marketing impact of this change on search engine optimization strategies, while the protocol discussions were centered on technology and global standards issues. It seems that these two apparently separate things may be very tightly coupled indeed.

PageRank, Page Speed And Marketing Effects

PageRank is supposed to measure the authority of a page, to determine whether it is “the best” trusted and primary resource to answer the user’s search query. It is hard to fathom how response speed has anything to do with authenticity, authority, correctness of data or trustworthiness. Speed seems much more strongly linked to the cash position of the person or organization that authored the page, and how much they have invested in the right page design, hardware and network infrastructure (their own or hosted) to provide a fast response.

One quasi-altruistic reason for adding or emphasizing page load in the PageRank calculations may be to nudge the web community to think more about page load speed when designing sites. However, sites that cause high abandonment rates ought to be somewhat self-correcting, without needing external pressure of this kind to force a strategy rethink.

In Should Page Speed Influence Google PageRank, Om Malik discusses whether emphasizing page load speed in PageRank is a good idea. Matt Cutts, who works at Google, hints that page speed might become more important to PageRank over time. I think the deeper motivation for this emphasis lies elsewhere.

SPDY – A Technical Thing That Could Speed Up The Web

Google has lately been pitching SPDY, a protocol that aims to roughly halve page load network overhead, in parallel with some apparently similar IETF work. The SPDY protocol page implies that Google’s goal is to replace http with SPDY.
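To make that claim concrete, here is a toy back-of-the-envelope model of where the savings would come from. It is a minimal sketch with made-up numbers of my own, not a SPDY implementation or a Google measurement; the only thing it leans on is the protocol page’s description of multiplexing requests as parallel streams on one connection with compressed headers.

    # Toy latency model, not SPDY itself: every number below is an
    # illustrative assumption, not a measurement.

    RTT = 0.100          # assumed round-trip time to the server, in seconds
    N_RESOURCES = 20     # assumed number of small resources on the page
    MAX_PARALLEL = 6     # typical browser cap on parallel http connections
    HEADER_COST = 0.010  # assumed per-request cost of bulky, uncompressed headers

    def http_load_time():
        """Requests queued on a small pool of connections: one round trip of
        setup, then one round trip (plus header cost) per batch of requests."""
        batches = -(-N_RESOURCES // MAX_PARALLEL)  # ceiling division
        return RTT + batches * (RTT + HEADER_COST)

    def spdy_like_load_time():
        """All requests multiplexed as parallel streams on a single connection
        with compressed headers: roughly one setup round trip plus one data
        round trip."""
        return RTT + RTT

    if __name__ == "__main__":
        t_http = http_load_time()
        t_spdy = spdy_like_load_time()
        print("plain http : %4.0f ms" % (t_http * 1000))
        print("SPDY-like  : %4.0f ms" % (t_spdy * 1000))
        print("speedup    : %.1fx" % (t_http / t_spdy))

Under these assumed numbers the model lands in the same 2x ballpark, and the point is simply that most of the win comes from eliminating per-request round trips rather than from anything specific to Google’s servers.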

The code behind SPDY is supposed to be open in some fashion, but so far as I know it is not part of the IETF standards process. I wonder if specifics of SPDY are optimized for Google’s search infrastructure, such that Google would get a differentially faster result from SPDY search clients than other search vendors could achieve (at least until they reverse-engineer the implications, which they perhaps already have).

Maybe SPDY Is Not Just Altruism

What could Google accomplish if SPDY significantly displaces http? Oops, did I say that? How could Google pull this off, given the dispersed and voluntary nature of most standardization efforts?

What if they realize they can use their PageRank search algorithms to convince people to implement their strategic vision? Insiders might see this as an “end-run” around Internet standards bodies rather than “just capitalism at work.”

Well, if Google is thinking about SPDY in this way, what might we see them do to influence the end game? Maybe a tweak to PageRank so that page load speed is a strong enough factor that you could not maintain your current SEO ranking without implementing SPDY. Maybe just enough so that, if even one of your competitors implemented SPDY, they would outrank you in Google search results unless you had a very strong advantage in some other SEO category.

If SPDY generates strong benefits in real-world operation, then with just the right amount of emphasis in PageRank’s calculations, that competitor could instantly displace your SEO position. Such an overnight change in search positioning might cost large companies millions of dollars.
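To see how little emphasis it would take, here is a hypothetical scoring sketch. It is not Google’s actual ranking algorithm; every function name, weight and number in it is invented for illustration.

    # Hypothetical ranking sketch, not Google's algorithm: all names,
    # weights and numbers are invented for illustration.

    def score(relevance, load_time_s, speed_weight=0.1):
        """Blend a relevance score (0..1) with a bounded speed bonus,
        so faster pages earn a slightly higher overall score."""
        speed = 1.0 / (1.0 + load_time_s)  # 1.0 when instant, approaches 0 when slow
        return (1.0 - speed_weight) * relevance + speed_weight * speed

    # Two hypothetical pages with nearly identical relevance:
    mine       = score(relevance=0.80, load_time_s=2.0)   # plain http, slower page
    competitor = score(relevance=0.79, load_time_s=0.5)   # SPDY-style, faster page

    print("my page    : %.3f" % mine)        # 0.753
    print("competitor : %.3f" % competitor)  # 0.778
    print("competitor wins" if competitor > mine else "my page still wins")

With a 10% weight on speed, the slightly less relevant but faster page wins; dial the weight down far enough and it would not. The question is where Google sets that dial.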

Watching The Dominoes Fall – A Likely Scenario?

If you find your company instantly displaced in the Google search rankings by a competitor who chose to implement SPDY, would you hold to principles of openness and standards and stay with http? I’m guessing you’d get SPDY up and running as quickly as humanly possible. Thus it starts: all of the other dominoes topple in sequence, and SPDY becomes the global de facto standard.

If this scenario plays out, Google ends up with a quasi-open protocol that is the de facto standard, to which they hold the design and evolution keys. With PageRank in hand, and having demonstrated that slight changes to the algorithm can make the global business community (and probably everyone else) follow their technical lead, will they continue to hold to the company’s credo of “Don’t Be Evil”?

Maybe It’s Three Birds?

As a side thought, smaller organizations may not be able to pay to compete on page load speed, and being on page two of the results is pretty much the same as not being present in Google at all. Depending on the size of their user base and hosting arrangements, they may opt to spend more on AdWords to try to improve their search placement.

When We Are The Network discussed whether Google could or would maintain its “Do No Evil” ideology, I didn’t anticipate that I would stumble on something else to write about on this topic so quickly. Hopefully I’ve missed something obvious, and this end game is not as probable as it seems to me at the moment.

I look forward to your comments, pointers and clarifications. Thanks for taking the time to read through to here, as I realize this has been a longer-than-normal post.

  1. November 28th, 2009 at 20:42 | #1

    On the other hand… doesn’t newly-credible competition from Bing give Google an incentive to consider speed? If people click on multiple sites from search results, and one search engine ranks on speed, that search engine will feel faster – even if the search results themselves aren’t.

    For that matter, there might be a simpler incentive: Spidering cost. I have no idea what Google’s back end looks like, but presumably each open socket has some bit of state attached – even if only at the router level – and Google’s big enough to attach a dollar figure to that state. If they can get the web to load pages 10% faster, they can crawl it more cheaply.

  2. Cindy Harris
    November 28th, 2009 at 22:24 | #2

    I can think of a number of reasons that Google would be interested in encouraging a protocol that enhances page load speed. I like Jay’s “spidering cost” suggestion; that is definitely a big deal at the scale Google operates at, but I was thinking it holds true for a different reason than the one Jay cites. With trillions of pages out there and the web constantly expanding at an exponential rate, if page downloads are not fast enough, there will eventually be pages that Google’s robots cannot catalog in a timely fashion. Increase the page load speed across enough of the web (in this case by promulgating a new protocol) and you’ve not only increased the size of the Google data asset, you’ve also improved its currency (and usefulness) by allowing more frequent updates of pages web-wide.

    This increased currency/usefulness of the data is actually closely related to what I think is the more “straight-ahead” explanation of why Google might want to encourage a quantum improvement in page load speed. Consider an individual who is searching for information. She sits down and types her query into Google, and results come back, thousands of them. But she has only limited time to check them out and decide which ones are useful to her. Google has a direct interest in this individual leaving at the end of that time “satisfied.” If they can significantly improve page download speed across a significant number of search results, they increase the number of “results per minute” that a user can evaluate, increasing the chances that she will (a) leave satisfied and (b) come back again even for the most insignificant queries.

    I was thinking that in this scenario you’re potentially increasing “stickiness” to the results page too, but I think what I really mean is that you’re increasing “long term stickiness” to Google search, the probability that a user will return more often than they might if the response time was not so snappy. So not more time per interaction (the user has only so much time to spend) but a larger number of interactions. You may also increase the viability of “second page” results: if a user can scan more results-per-minute, then in a given time, they can scan more pages of results, and that means potentially more space for unique ads and Google ad revenue.

    There are some other ideas that are floating just out of reach of my conscious mind right now, but if/when they surface I’ll try to remember to post them. And I’m also noodling about the question we discussed a couple of weeks ago: if all this is true, does that make Google “evil”?

  3. November 29th, 2009 at 00:42 | #3

    These are all good points. Thanks!

  4. November 29th, 2009 at 08:44 | #4

    @Cindy Harris Good point about the completeness of their catalog. I guess it shows how far we’ve come that we now presume Google can catalog everything – or that “everything”, by definition, includes only what Google can spider!

    As for evil: Google seems to be stepping away from that motto. As Marissa Mayer said in an interview, “it really wasn’t like an elected, ordained motto.” It’s something a Googler said at a meeting, and we all picked up the meme.
