The impact of the election on Twitter’s web performance

by Apica Product Team | November 7, 2012

The Twitter Election. That’s what everyone’s been calling it. Twitter has ushered in a whole new era of politics. The 2012 presidential race is the first in which voters could interact directly with the candidates (if not their spokespeople); the first in which candidate triumphs and blunders were relived and exploited repeatedly in the days and weeks leading up to the election; the first in which voters revealed their candidate selections en masse even as other voters were still heading to the polls.

At the time of the last presidential election, Twitter was still a toddler. The site had launched two years before and, though growth was rapid (accelerating from 400,000 tweets per quarter in 2007 to 100 million tweets per quarter in 2008), Twitter had yet to become the prolific, real-time news source it is today.

Fast-forward to 2012. Twitter has more than 500 million registered users, 140 million of whom are located in the U.S. And as of June, the site was seeing more than 400 million tweets per day. With that kind of volume, forget about measuring tweets per quarter!

Over the past few weeks and days, as the candidates made their final pitches and battled each other on the debate stage, millions of Americans took to Twitter, leaving many to wonder whether Twitter’s infrastructure was sound enough to handle the election activity. During the first presidential debate, the site served 10.3 million tweets (at the time, a new record) and subsequently experienced page load delays.

Yesterday, however, during another record-breaking day for Twitter (20 million election-related tweets), we saw no such delays: Twitter maintained perfect availability and the quick, three-second response times it has recently come to be known for.

This presidential election once again brings into focus the growing reliance on Twitter and other new media sites not just for social networking, but for the real-time consumption of news and current events. Last night, as with other critical moments in recent U.S. history, news broke faster on Twitter than it did on traditional news outlets.

Arguably, immediacy of information is the primary factor driving this transition, but website performance could play a part as well. While Twitter and even Facebook (with 100 percent availability and 2.35-second response times) delivered a strong, consistent performance yesterday, some of the traditional news outlets did not fare as well. A few examples:

  • On average, the response time for NBC News between Nov. 2 and Nov. 6 was just under 19 seconds. But at certain points throughout the day yesterday, response times spiked into the 20- to 25-second range.
  • ABC saw an increase in response times as well, jumping from 10 seconds in the days prior to the election to above 25 seconds on Election Day itself.

These specific issues appear to be related to the massive amount of content — third-party and otherwise — hosted on these sites. Web components such as advertisements, images, and social media widgets could cause traditional news sites to load more slowly than sites with a more basic design (Twitter, for example).
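The response-time figures above come from dedicated monitoring agents. Purely as an illustration of what a basic response-time measurement involves, here is a minimal Python sketch; the function names and sample count are our own, and a real monitor measures full page rendering — including all those third-party components — from many locations, not a single fetch of the base HTML:

```python
import time
import urllib.request


def measure_response_time(url, timeout=30):
    """Time a single GET of `url`; return elapsed seconds, or None on failure.

    Note: this times only the base HTML document, not the images, scripts,
    ads, and widgets that a full-page monitor would account for.
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()  # drain the body so transfer time is included
        return time.monotonic() - start
    except OSError:
        return None  # timeout, DNS failure, connection refused, etc.


def average_response_time(url, samples=5):
    """Average several measurements, ignoring failed requests."""
    times = [t for t in (measure_response_time(url) for _ in range(samples))
             if t is not None]
    return sum(times) / len(times) if times else None
```

Repeating such a measurement on a schedule and averaging the samples is, in rough outline, how a multi-day figure like "just under 19 seconds between Nov. 2 and Nov. 6" is produced.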

Which raises a whole other question that we will cover here in the coming weeks: Should websites be built for design or usability? And how might your answer to the first question impact web performance?

Apica Product Team