It’s interesting to analyze current affairs from a technical perspective. The launch of the HealthCare.gov insurance exchange website presents a teaching moment on the subject of website optimization.
I figure, anytime web design makes the news, it’s a good thing for the website design industry as a whole, regardless of context.
[ Update: you may also be interested in the follow-up post, HealthCare.gov and the Importance of Usability Testing.]
HealthCare.gov: the Federal health insurance exchange website.
You have heard about the health insurance exchange websites that recently went online pursuant to the Affordable Care Act, commonly known as Obamacare. You may have heard that some of these websites were swamped with traffic to the extent that they became unusable. According to media reports, this was especially true of the Federal exchange website, HealthCare.gov, which serves the residents of all the states that opted not to run their own insurance exchanges.
Reuters reported that the HealthCare.gov insurance exchange website received visits from 8.6 million unique visitors in its first four days online. That’s an immense amount of traffic for any website, much less a website that was just recently launched. But the article also described how part of the HealthCare.gov website’s performance problem was due to its system architecture. In the article, Reuters interviewed a web designer, who noted how many resources were downloaded during a single visit to the sign-up page.
Now, best practices for website optimization are exactly the sort of geek stuff I spend hours obsessing over when working on client projects. So this news article got me interested, and I thought I would check out the Federal exchange website myself.
But first, let’s review some website performance optimization basics.
Website optimization best practices
These are the known best practices for website performance optimization:
- Put the CSS in the <head>
- Minimize HTTP requests
- Minify your assets
- Use server-side compression handlers like mod_deflate
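A quick way to see why that last item matters: Python’s standard gzip module can stand in for mod_deflate. Here’s a minimal sketch (the HTML snippet is made up, but the repetitive markup is typical of real pages):

```python
import gzip

# A repetitive HTML snippet, standing in for a typical page:
html = ("<div class='row'><p>Lorem ipsum dolor sit amet.</p></div>\n" * 200).encode("utf-8")

# This is roughly what mod_deflate does to a response body before sending it:
compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes ({ratio:.0%})")
```

HTML is highly repetitive, so gzip typically shrinks it dramatically; the bytes saved translate directly into faster transfers for every visitor.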
You get this advice from the YSlow plugin for Firefox (it will also try to get you to use a Content Delivery Network) as well as from countless articles and posts by web professionals.
The Reuters analysis
Reuters described several problems with the HealthCare.gov website. The most glaring one was its sluggish, unresponsive performance in the first days after it launched. The article states that in response, officials added servers and made some other improvements to the network architecture.
Modern browsers now have developer tools built right in, so anyone can use them; no plugins required. Just right-click anywhere within the browser window and choose “Inspect element.”* This opens the element inspector panel, which shows how CSS rules are applied to the DOM. Next, choose the “Network” tab and visit a page. Your browser’s network requests, and the server’s responses, will be listed along with a convenient timeline. This is how website responsiveness is measured.
*Requires a mouse: this will not work on your mobile phone.
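The same round-trip timing the Network tab shows can be approximated in code. Here’s a minimal sketch that spins up a throwaway local HTTP server and times a single request to it (the server and page are stand-ins for illustration, not HealthCare.gov):

```python
import http.server
import threading
import time
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve a tiny static page, like a cached asset would be served.
        body = b"<html><head></head><body>ok</body></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Port 0 asks the OS for any free port.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

start = time.perf_counter()
resp = urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/")
body = resp.read()
elapsed = time.perf_counter() - start
print(f"status={resp.status} bytes={len(body)} time={elapsed * 1000:.1f} ms")
server.shutdown()
```

Real measurements should of course come from the browser’s Network tab, which also captures DNS lookup, TLS handshake, and per-asset render timing that a single scripted request can’t show.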
A visit to the HealthCare.gov website
So I open up my network connection monitor and I load the homepage for HealthCare.gov. Here’s what it looks like:
I agree, there is a lot of extra stuff in there. How many analytics scripts does one website really need? They’ve got Google Analytics, some kind of Yahoo widget, the Chartbeat tracker, and a tracking beacon from Pingdom, just within this one screenshot.
But I do have to say, having expected a slow page load time, I’m duly impressed. On my initial visit, the site served me 651 KB of data, which my browser loaded in just 2.08 seconds. On a return visit with some assets cached, I got a page load time of 1.02 seconds. A page load time of 1.02 seconds is the kind of performance that most small business owners only dream of. Mere mortals generally can’t afford the kind of network architecture required to achieve speeds like that.
First impression: the front-end designers have done very professional work. The network engineers have done very professional work. But the developers who put it all together were unable to rein in the marketing people, and as a consequence they ended up with some unnecessary cruft. Even so, the load time of the home page is admirable.
Let’s dig deeper.
Since they’re hosted off-site, analytics scripts like these shouldn’t directly affect the performance of the website’s own servers. However, this website does use several of them, so visitors with slow connections and old, underpowered computers (or mobile devices) could conceivably have to wait longer while their devices download and execute all those scripts.
[On a return visit, I also notice a substantial network delay during a request to log.optimizely.com before the “Apply Now” button is ever displayed. If a substantial delay is being caused by a third-party marketing partner, it might be time to re-evaluate that vendor’s role in the signup process.]
All right, but I’m looking for a page with 90 HTTP requests. I think this was the page Reuters was talking about:
This page took much longer to load. It took almost six and a half seconds to render, and made 89 separate HTTP requests for various assets. Of those 89 requests, no fewer than 10 were tracking scripts served from third-party domains. The remaining scripts are about evenly divided between functionality and presentation. I do see a script related to the CMS core, which seems a bit odd since I’m on the public-facing site, not the CMS admin. There’s a lot of form validation and security utilities, including client-side cryptography. I see the Twitter Bootstrap stylesheet, and the mobile-responsive version of the Twitter Bootstrap stylesheet (the observant reader may notice that Mardesco‘s website employs these as well).
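Tallying first-party versus third-party requests like this is easy to script. Here’s a sketch against a hypothetical request log (the URL paths are invented, though the third-party domains mirror the trackers mentioned above):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical request log, similar to what the Network tab (or a HAR export) shows:
requests = [
    "https://www.healthcare.gov/marketplace/global/en_US/registration.js",
    "https://www.healthcare.gov/styles/bootstrap.min.css",
    "https://www.google-analytics.com/ga.js",
    "https://static.chartbeat.com/js/chartbeat.js",
    "https://log.optimizely.com/event",
]

first_party = "healthcare.gov"
by_host = Counter(urlparse(u).hostname for u in requests)
# Note: a stricter check would compare registered domains, not suffixes.
third_party = [h for h in by_host if not h.endswith(first_party)]
print(f"{len(requests)} requests from {len(by_host)} hosts; "
      f"third-party hosts: {sorted(third_party)}")
```

On a real page you would export the HAR file from the Network tab and feed its entries through the same kind of tally.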
Does the HealthCare.gov insurance exchange website employ website optimization best practices?
This is really the question, isn’t it? Let’s go through them again:
- Put the CSS in the <head>
  - Yes. The CSS is at the beginning of the document, where it belongs.
- Minimize HTTP requests
- Minify assets
- Use server-side compression
  - Yes. The server is running Apache, and implements gzip compression, as you can see from the header of the server’s response to a request for the signup page itself:
HTTP/1.1 200 OK
Server: Apache
Content-Type: text/html;charset=UTF-8
X-Frame-Options: SAMEORIGIN;
X-Server: WS04
Vary: Accept-Encoding
Content-Encoding: gzip
Expires: Sat, 12 Oct 2013 03:01:15 GMT
Cache-Control: max-age=0, no-cache, no-store
Pragma: no-cache
Date: Sat, 12 Oct 2013 03:01:15 GMT
Content-Length: 15345
Connection: keep-alive
Set-Cookie: akaau=1381547475~id=08fb5d331c6d840b811e8b53fba64801; path=/
Access-Control-Allow-Origin: *
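You can confirm this kind of thing programmatically by parsing the raw response headers. Here’s a minimal sketch using an abbreviated copy of the header block above:

```python
# An abbreviated copy of the raw HTTP response shown above:
raw = (
    "HTTP/1.1 200 OK\r\n"
    "Server: Apache\r\n"
    "Content-Encoding: gzip\r\n"
    "Vary: Accept-Encoding\r\n"
    "Content-Length: 15345\r\n"
    "\r\n"
)

status_line, _, header_block = raw.partition("\r\n")
headers = {}
for line in header_block.strip().splitlines():
    if not line:
        continue
    # Header names are case-insensitive, so normalize to lowercase.
    name, _, value = line.partition(":")
    headers[name.strip().lower()] = value.strip()

print(status_line, "| content-encoding:", headers.get("content-encoding"))
```

The presence of `Content-Encoding: gzip` (paired with `Vary: Accept-Encoding`) is the signature of server-side compression at work.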
All that said, a 6.4 second page load time is hardly worthy of a news story on Reuters. I think there are probably several explanations for this.
Firstly, the article itself mentions that the network administrators have been adding servers and other resources since the site went live. This appears to have had the effect of improving the site’s performance.
Secondly, I visited the signup page, but I did not submit the form. Most of the pages and assets I saw are almost certainly cached on the server, and can be sent out immediately when I land on the page. By contrast, when the form is submitted, the server must parse the data, and probably query another remote server over the network. Even though the page’s outer HTML frame does not have to be reloaded, the page contents must still be generated dynamically on the server side to display the results of the operation. Even under ideal conditions, operations like these can take a second, or even two. A server under a heavy load could conceivably take quite a long time to return dynamically generated page content under these conditions.
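The cached-versus-dynamic distinction can be sketched in a few lines. This is an illustrative memoization cache with a TTL, not HealthCare.gov’s actual architecture; `render_page` is a hypothetical stand-in for an expensive dynamic operation such as a database or remote-server query:

```python
import time

_cache = {}

def render_page(key):
    # Hypothetical expensive dynamic render (stands in for a DB or remote query).
    time.sleep(0.05)
    return f"<html>results for {key}</html>"

def cached_render(key, ttl=60):
    """Return a cached copy if it's fresh; otherwise render and cache it."""
    now = time.time()
    hit = _cache.get(key)
    if hit and now - hit[0] < ttl:
        return hit[1]  # served instantly, like a static asset
    page = render_page(key)
    _cache[key] = (now, page)
    return page
```

The first request for a page pays the full rendering cost; subsequent requests within the TTL are served from memory, which is why the static-looking parts of a site stay fast even when the dynamic parts struggle under load.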
Thirdly, it’s no longer the first few days the health insurance exchange website has been open. Much of the initial rush may have subsided by now, and as I view the website, it’s no longer experiencing the strain of a heavy server load, allowing all those network resources to really push some data.
The Reuters article also mentions some coding issues:
Many users experienced problems involving security questions they had to answer in order to create an account on HealthCare.gov. No questions appeared in the boxes, or an error message said they were using the same answers for different questions when they were not.
This is clearly a bug. Bugs are caused by programming errors: either a flaw in the logic, or else an actual mistake in the code. Programming errors happen all the time, and that’s why debugging is a huge part of application development. While a flawless deployment would have been preferable, professionals agree: these things sometimes happen with the initial deployment of a large, complex system. Unfortunately, not all bugs get caught during the pre-deployment testing phase, and the rest must be corrected in a big rush once aggravated users bring them to the development team’s attention.
I also note a <noscript> element in the page’s source code, containing a fallback message for visitors whose browsers don’t run JavaScript.
It was all a bit of a disappointment, really. After being scared by the frightening media reports, I’d been looking forward to sitting through a whopping 30 or 40 second page load time, and then trying to figure out which assets were really bogging down the system. Then what do you know, the pages went and loaded in just a few seconds flat, leaving me with very little to complain about.
Overall, HealthCare.gov appears to be faster than a lot of commercial websites. (I’ve met some really sluggish databases, although I probably shouldn’t name names here.) I didn’t actually go through the entire signup process as part of this analysis. The part that I saw appeared to be fully functional, despite media reports. In fact, I thought the user interface was quite friendly.
[ Update: I see from HealthCare.gov’s Google+ profile that they are continuing to work on the system. I also see from user comments that the system presents fewer errors to visitors who are browsing on Microsoft Internet Explorer. Really? Figures…]