Archives

This week’s sponsor: World IA Day

Posted by The fine folks at A List Apart | 22 Dec 15

Join the fine folks at World IA Day on Feb 20, 2016 to celebrate advancing the practice of information architecture.

Mark Llobrera · Professional Amateurs: Write What You Know (Now)

Posted by The fine folks at A List Apart | 17 Dec 15

Sometimes the writing comes easily, and then there’s the last two months. I really wondered if I had run out of things to say. I knew I wanted to write about how more web designers and developers need to write about their work. So I wrote a bunch of paragraphs down, and then on a lark (I’m not even a regular listener), I downloaded episode 110 of The Web Ahead.

I nodded along with host Jen Simmons and guest Jeremy Keith saying some very smart things about the web and its roots as the El train cut across Philadelphia. But at the 48-minute mark things got weird, because Jen and Jeremy basically started writing my column for me while I listened. Jeremy said:

I wish people would write more… In the future, we would have a better understanding of what people are thinking now. I’m very glad that I’ve been doing my blog for 15 years. I can go back to 2002 and get a feel for what it was like to build websites. Back then we thought X was true or hadn’t even considered Y. You forget these things. Having these written records—not of anything important or groundbreaking—but just the day-to-day. The boring stuff. That’s actually what’s most interesting over time.

The fear of stating the obvious is one of my primary personal roadblocks to writing. Jeremy’s words evoke Samuel Pepys’ diary, which is a famously important resource for historians precisely because it includes so much detail about life in the 1600s—including many items that he could have dismissed as being mundane and obvious.

Non-experts please apply

I appreciate a well-written, logically-structured, authoritative blog post about code as much as the next person. But I also have a love for the blog post that is written with a posture of humility: I know this much, so far. Or even: Why is this happening? Seeing your own stop-start journey through design and code reflected in someone else’s writing can remind you that it’s ok to not have everything figured out. Turns out nobody does.

Often when a teammate shows me something cool, at the end of our conversation I’ll half-jokingly say, “write it up.” I think that’s pretty good advice regardless of where you currently sit on the continuum of despair to triumph. Don’t even wait until you have everything figured out, at the end of the project. It’s ok to write what you know now, while everything is fresh in your mind and the sheer agony (or thrill of discovery) borders on the physical. If you’ve just solved a particularly flabbergasting problem, imagine yourself on the other side of that experience, just hoping that DuckDuckGo or Google will turn up a post with even the barest hint of a solution. If that were you, you wouldn’t care that the blog post wasn’t polished. You’d just thank your lucky stars that someone took the time to bang something out and hit publish.

Lies your brain will tell you

“But nobody will read it,” you say. That may be true. But the opposite could also happen. A few years ago I was interviewing for a job, and my future boss said, “I like how you articulate things. I’ve read some of the stuff on your blog.” I was so surprised, I didn’t even think to ask which posts she liked. For all I know it was the ones where I recount the silly things my kids say. That memory made me think of this fantastic interview with Ursula K. Le Guin, where she says, “There’s always room for another story… So if you have stories to tell and can tell them competently, then somebody will want to hear it….”

On the other hand, that might be exactly what you fear—that someone actually will read it. But that needn’t be an intimidating thought. Through your writing, people can get to know you in a way that often gets lost in casual interactions with coworkers (or formal ones with a potential boss or client). Because if you read enough of a person’s writing, their voice comes through. It doesn’t matter if it’s just explaining a technical issue and offering a solution. Their quirks, their interests—these things bleed through if you read enough of their words.

Getting more adept at writing can help you communicate confidently in other ways too. In a recent ALA On Air live event, Jeffrey Zeldman emphasized the benefit writing can bring to your problem-solving process:

I think if you can articulate your thoughts in writing, even though that’s really hard, you’re going to be better in meetings, you’re going to have a point of view; you’re not going to roll over and say, “The client said I should make the button blue and make it bigger, so that’s what I will do,” but instead will go, “What is behind the client’s request? What are we really trying to achieve?”

chmod 777

So now that you’re convinced, where do you start? You might start by giving yourself permission: “I allow myself to publish this thing.” This is one of those things that I’m still working on, myself. I’ve always found writing enjoyable, even easy. Publishing, however, is not. I’ve been blogging for several years, but my ratio of drafts to final posts is pretty dismal. This column has been good for me, because now I’m accountable to someone other than myself. I’ve got an editor that I don’t want to let down. I’ve got folks who actually read this column and respond. But maybe you don’t need more accountability. Maybe you actually need to lower the stakes, and give yourself permission to just get it out there. And again, Jeremy Keith said something along the lines of what I wanted to write:

The whole point of the web is there isn’t a gatekeeper. There isn’t someone with a red pen saying, “That isn’t good enough to be published. That’s not up to scratch. You’re not allowed to publish it.”… It could be the worst thing ever and you still have the right to publish it on your website. You should do it. Don’t let anyone tell you otherwise.

Holy smokes. I heard that, and I almost bulk-published my entire WordPress drafts bin.

Recently, CSS-Tricks ran a survey that asked its community to weigh in on topics that they face daily. I answered the survey, and one of the interesting results was for the question, “You’re stuck. You search the web. You prefer to find answers in these formats:”. The top answer was blog post. Blog post! One of the other leading answers was “Q&A format page” (something like Stack Overflow). That made me think. Why wasn’t Q&A the top answer? Maybe it’s because while web designers want something that works if we simply copy-and-paste, we are also driven by why as much as how.

Code has a story. One of my favorite posts to write (and read) goes something like: “This wasn’t documented anywhere I could find, and it’s such a weird situation that if I don’t write about it nobody would believe me.” I made a category on my blog just for those posts: Technology’s Betrayal. I feel like a web designer’s life is full of those little stories, every day. And usually you tell your teammates over lunch, or over a beer, and you laugh and say, “Isn’t that nuts?” Well, I’m here to say, “write it up.” Let someone else hear that story, too.

Interaction Is an Enhancement

Posted by The fine folks at A List Apart | 15 Dec 15

A note from the editors: We’re pleased to offer this excerpt from Chapter 5 of Aaron Gustafson’s book, Adaptive Web Design, Second Edition. Buy the book from New Riders and get a 35% discount using the code AARON35.

In February 2011, shortly after Gawker Media launched a unified redesign of its various properties (Lifehacker, Gizmodo, Jezebel, etc.), users visiting those sites were greeted by a blank stare. Not a single one displayed any content. What happened? JavaScript happened. Or, more accurately, JavaScript didn’t happen.1

Screenshot of a completely blank website with only the Lifehacker logo displayed.
Lifehacker during the JavaScript incident of 2011.

In architecting its new platform, Gawker Media had embraced JavaScript as the delivery mechanism for its content. It would send a hollow HTML shell to the browser and then load the actual page content via JavaScript. The common wisdom was that this approach would make these sites appear more “app-like” and “modern.” But on launch day, a single error in the JavaScript code running the platform brought the system to its knees. That one solitary error caused a lengthy “site outage”—I use that term liberally because the servers were actually still working—for every Gawker property and lost the company countless page views and ad impressions.

It’s worth noting that, in the intervening years, Gawker Media has updated its sites to deliver content in the absence of JavaScript.

■ ■ ■

Late one night in January 2014, the “parental filter” used by Sky Broadband—one of the UK’s largest ISPs (internet service providers)—began classifying code.jquery.com as a “malware and phishing” website.2 That URL is home to the jQuery CDN (content delivery network). No big deal—jQuery is only the JavaScript library that nearly three-quarters of the world’s top 10,000 websites rely on to make their web pages work.

With the domain so mischaracterized, Sky’s firewall leapt into action and began “protecting” the vast majority of its customers from this “malicious” code. All of a sudden, huge swaths of the Web stopped working for every Sky Broadband customer who had not specifically opted out of this protection. Any site that relied on that CDN’s copy of jQuery to load content, display advertising, or enable interactions was dead in the water—through no fault of its own.
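The excerpt doesn’t prescribe a fix here, but one widely used mitigation for exactly this failure mode is a local fallback: request jQuery from the CDN, check whether it actually arrived, and load a copy from your own server if it didn’t. A minimal sketch, where the version and the local path are placeholders:

    <!-- Try the CDN copy first. -->
    <script src="https://code.jquery.com/jquery-1.11.3.min.js"></script>
    <script>
      // If the CDN copy was blocked or failed to load, window.jQuery is
      // undefined; fall back to a copy hosted with the site itself.
      window.jQuery || document.write(
        '<script src="/js/jquery-1.11.3.min.js"><\/script>'
      );
    </script>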

■ ■ ■

In September 2014, Ars Technica revealed that Comcast was injecting self-promotional advertising into websites served via its Wi-Fi hotspots.3 Such injections are effectively a man-in-the-middle attack,4 creating a situation that had the potential to break a website. Security expert Dan Kaminsky put it this way:

[Y]ou no longer know, as a website developer, precisely what code is running in browsers out there. You didn’t send it, but your customers received it.

Comcast isn’t the only organization that does this. Hotels, airports, and other “free” Wi-Fi providers routinely inject advertising and other code into websites that pass through their networks.

■ ■ ■

Many web designers and developers mistakenly believe that JavaScript support is a given or that issues with JavaScript drifted off with the decline of IE 8, but these three stories are all recent, and none of them concerned a browser support issue. If these stories tell you anything, it’s that you need to develop the 1964 Chrysler Imperial5 of websites—sites that soldier on even when they are getting pummeled from all sides. After all, devices, browsers, plugins, servers, networks, and even the routers that ultimately deliver your sites all have a say in how (and what) content actually gets to your users.

Get Familiar with Potential Issues so You Can Avoid Them

It seems that nearly every other week a new JavaScript framework comes out, touting a new approach that is going to “revolutionize” the way we build websites. Frameworks such as Angular, Ember, Knockout, and React do away with the traditional model of browsers navigating from page to page of server-generated content. Instead, these frameworks completely take over the browser and handle all the requests to the server, usually fetching bits and pieces of content a few at a time to control the whole experience end to end. No more page refreshes. No more waiting.

There’s just one problem: Without JavaScript, nothing happens.
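To make that concrete, here is a minimal sketch of the “hollow shell” pattern described above—my own illustration, not Gawker’s actual code, and the /api/article endpoint is hypothetical. The server sends an empty container, and everything the reader sees depends on a script that may never run:

    <!DOCTYPE html>
    <html>
      <head><title>Article</title></head>
      <body>
        <!-- The server sends no real content, just an empty container. -->
        <div id="app"></div>
        <script>
          // Every word the reader sees arrives via JavaScript. If this
          // script errors out, is blocked, or never finishes downloading,
          // the page stays blank.
          fetch('/api/article/123')
            .then(function (response) { return response.json(); })
            .then(function (article) {
              document.getElementById('app').innerHTML =
                '<h1>' + article.title + '</h1>' + article.body;
            });
        </script>
      </body>
    </html>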

No, I’m not here to tell you that you shouldn’t use JavaScript.6 I think JavaScript is an incredibly useful tool, and I absolutely believe it can make your users’ experiences better…when it’s used wisely.

Understand Your Medium

In the early days of the Web, “proper” software developers shied away from JavaScript. Many viewed it as a “toy” language (and felt similarly about HTML and CSS). It wasn’t as powerful as Java or Perl or C in their minds, so it wasn’t really worth learning. In the intervening years, however, JavaScript has changed a lot.

Many of these developers began paying attention to JavaScript in the mid-2000s when Ajax became popular. But it wasn’t until a few years later that they began bringing their talents to the Web in droves, lured by JavaScript frameworks and their promise of a more traditional development experience for the Web. This, overall, is a good thing—we need more people working on the Web to make it better. The one problem I’ve seen, however, is the fundamental disconnect traditional software developers seem to have with the way deploying code on the Web works.

In traditional software development, you have some say in the execution environment. On the Web, you don’t. I’ll explain. If I’m writing server-side software in Python or Rails or even PHP, one of two things is true:

  • I control the server environment, including the operating system, language versions, and packages.
  • I don’t control the server environment, but I have knowledge of it and can author my program accordingly so it will execute as anticipated.

In the more traditional installed software world, you can similarly control the environment by placing certain restrictions on what operating systems your code supports and what dependencies you might have (such as available hard drive space or RAM). You provide that information up front, and your potential users can choose your software—or a competing product—based on what will work for them.

On the Web, however, all bets are off. The Web is ubiquitous. The Web is messy. And, as much as I might like to control a user’s experience down to the pixel, I understand that it’s never going to happen because that isn’t the way the Web works. The frustration I sometimes feel with my lack of control is also incredibly liberating and pushes me to come up with more creative approaches. Unfortunately, traditional software developers who are relatively new to the Web have not come to terms with this yet. It’s understandable; it took me a few years as well.

You do not control the environment executing your JavaScript code, interpreting your HTML, or applying your CSS. Your users control the device (and, thereby, its processor speed, RAM, etc.). Depending on the device, your users might choose the operating system, browser, and browser version they use. Your users can decide which add-ons they use in the browser. Your users can shrink or enlarge the fonts used to display your site. And the Internet providers sit between you and your users, dictating the network speed, regulating the latency, and ultimately controlling how (and what part of) your content makes it into their browser. All you can do is author a compelling, adaptive experience and then cross your fingers and hope for the best.

The fundamental problem with viewing JavaScript as a given—which these frameworks do—is that it creates the illusion of control. It’s easy to rationalize this perspective when you have access to the latest and greatest hardware and a speedy and stable connection to the Internet. If you never look outside of the bubble of our industry, you might think every one of your users is so well-equipped. Sure, if you are building an internal web app, you might be able to dictate the OS/browser combination for all your users and lock down their machines to prevent them from modifying any settings, but that’s not the reality on the open Web. The fact is that you can’t absolutely rely on the availability of any specific technology when it comes to delivering your website to the world.

It’s critical to craft your website’s experiences to work in any situation by being intentional in how you use specific technologies, such as JavaScript. Take advantage of their benefits while simultaneously understanding that their availability is not guaranteed. That’s progressive enhancement.

The history of the Web is littered with JavaScript disaster stories. That doesn’t mean you shouldn’t use JavaScript or that it’s inherently bad. It simply means you need to be smart about your approach to using it. You need to build robust experiences that allow users to do what they need to do quickly and easily, even if your carefully crafted, incredibly well-designed JavaScript-driven interface can’t run.
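As a small illustration of what that can look like—my sketch, not an example from the book, with placeholder URLs and copy—start with an HTML control that works on its own, then let JavaScript upgrade it. Here a plain link to a help page is enhanced into an inline toggle; if the script never runs, the link still takes users to the content:

    <!-- Baseline: a real link that works with or without JavaScript. -->
    <a id="help-link" href="/help/shipping.html">Shipping help</a>
    <div id="help-panel" hidden>
      <p>Orders ship within two business days.</p>
    </div>

    <script>
      // Enhancement: if (and only if) this script runs, intercept the
      // click and toggle the inline panel instead of navigating away.
      var link = document.getElementById('help-link');
      var panel = document.getElementById('help-panel');
      link.addEventListener('click', function (event) {
        event.preventDefault();
        panel.hidden = !panel.hidden;
      });
    </script>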

Why No JavaScript?

Often the term progressive enhancement is treated as synonymous with “no JavaScript.” If you’ve read this far, I hope you understand that this is only one small part of the puzzle. Millions of the Web’s users have JavaScript. Most browsers support it, and few users ever turn it off. You can—and indeed should—use JavaScript to build amazing, engaging experiences on the Web.

If it’s so ubiquitous, you may well wonder why you should worry about the “no JavaScript” scenario at all. I hope the stories I shared earlier shed some light on that, but if they weren’t enough to convince you that you need a “no JavaScript” strategy, consider this: The U.K.’s GDS (Government Digital Service) ran an experiment to determine how many of its users did not receive JavaScript-based enhancements, and it discovered that number to be 1.1 percent, or 1 in every 93 users.7, 8 For an ecommerce site like Amazon, that’s 1.75 million people a month, which is a huge number.9 But that’s not the interesting bit.

First, a little about GDS’s methodology. It ran the experiment on a high-traffic page that drew a broad audience, so the live sample was representative of the true picture—the numbers weren’t skewed by collecting information from only a subsection of its user base. The experiment itself boiled down to three images:

  • A baseline image included via an img element
  • An img contained within a noscript element
  • An image that would be loaded via JavaScript

The noscript element, if you are unfamiliar, is meant to encapsulate content you want displayed when JavaScript is unavailable. It provides a clean way to offer an alternative experience in “no JavaScript” scenarios. When JavaScript is available, the browser ignores the contents of the noscript element entirely.
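In rough markup terms, the three-image setup looked something like this (my reconstruction from the description above; the image URLs are placeholders):

    <!-- Image 1: the baseline, requested by every browser. -->
    <img src="/base.gif" alt="">

    <!-- Image 2: requested only when JavaScript is unavailable. -->
    <noscript>
      <img src="/noscript.gif" alt="">
    </noscript>

    <!-- Image 3: requested only when JavaScript actually runs. -->
    <script>
      var img = document.createElement('img');
      img.src = '/script.gif';
      document.body.appendChild(img);
    </script>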

With this setup in place, the expectation was that all users would get two images. Users who fell into the “no JavaScript” camp would receive images 1 and 2 (the contents of noscript are exposed only when JavaScript is not available or turned off). Users who could use JavaScript would get images 1 and 3.

What GDS hadn’t anticipated, however, was a third group: users who got image 1 but didn’t get either of the other images. In other words, they should have received the JavaScript enhancement (because noscript was not evaluated), but they didn’t (because the JavaScript injection didn’t happen). Perhaps most surprisingly, this was the group that accounted for the vast majority of the “no JavaScript” users—0.9 percent of the users (as compared to 0.2 percent who received image 2).

What could cause something like this to happen? Many things:

  • JavaScript errors introduced by the developers
  • JavaScript errors introduced by in-page third-party code (e.g., ads, sharing widgets, and the like)
  • JavaScript errors introduced by user-controlled browser add-ons
  • JavaScript being blocked by a browser add-on
  • JavaScript being blocked by a firewall or ISP (or modified, as in the earlier Comcast example)
  • A missing or incomplete JavaScript program because of network connectivity issues (the “train goes into a tunnel” scenario)
  • Delayed JavaScript download because of slow network download speed
  • A missing or incomplete JavaScript program because of a CDN outage
  • Not enough RAM to load and execute the JavaScript10

Screenshot of an error message reading, “HTTP Error 413: Request Entity Too Large. The page you requested could not be loaded. Please try loading a different page.”
A BlackBerry device attempting to browse to the Obama for America campaign site in 2012. It ran out of RAM trying to load 4.2MB of HTML, CSS, and JavaScript. Photo credit: Brad Frost

That’s a ton of potential issues that can affect whether a user gets your JavaScript-based experience. I’m not bringing them up to scare you off using JavaScript; I just want to make sure you realize how many factors can affect whether users get it. In truth, most users will get your enhancements. Just don’t put all your eggs in the JavaScript basket. Diversify the ways you deliver your content and experiences. Doing so reduces risk and ensures your site will support the broadest range of users. It pays to hope for the best and plan for the worst.

This week’s sponsor: Padded Spaces

Posted by The fine folks at A List Apart | 15 Dec 15

Relax smarter with your devices—wherever you are—with lap desks and bedside caddies from our sponsor, Padded Spaces.

This week’s sponsor: Liquid Web

Posted by The fine folks at A List Apart | 09 Dec 15

Speed up your WordPress sites with managed hosting and round-the-clock support from our sponsor, Liquid Web.