Another common trick for speeding up JavaScript on a web page is offloading some of it to other sites. Google Analytics and Google Ads are probably the most well-known and ubiquitous. Google also provides the Libraries API, a content distribution network (CDN) for jQuery, Prototype, and other common frameworks, and hosts small projects such as ie7-js. Facebook and Twitter offer their own frameworks for integrating with social media. Then there’s Typekit, which lets you load web fonts from their site, etc.

Since most web browsers limit the number of concurrent requests to any single host, loading scripts from other sites frees up connections, so the rest of your page can load from your own server faster. For some sites, it’s also the best way to offer an integration API with minimal concern for breaking old code (since they control all of it).
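For example, pulling jQuery from Google’s CDN instead of your own server is a one-line change (the version number here is just illustrative):

```html
<!-- Served from Google's CDN: this request doesn't count against
     your own host's connection limit, and the file may already be
     in the visitor's cache from another site that uses it. -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
```

The protocol-relative `//` prefix lets the same tag work on both HTTP and HTTPS pages.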

However, there can be a downside. Loading lots of scripts from other servers makes your website dependent on servers outside your control. If any of them is down or inaccessible, your site’s performance and usability suffer too. I’ve encountered this more often as content offloading has become more common, and it’s pretty annoying when a page stalls completely for want of some trivial bit of fluff.
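One common way to hedge against this is to try the CDN first and fall back to a copy on your own server if it fails. A minimal sketch, assuming a hypothetical local path of `/js/jquery-1.7.2.min.js`:

```html
<!-- Try the CDN first... -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
<script>
  // ...and if it failed, window.jQuery is undefined, so load a local
  // copy instead. The closing tag is split to avoid ending this
  // script block prematurely.
  window.jQuery || document.write(
    '<script src="/js/jquery-1.7.2.min.js"><\/script>');
</script>
```

This doesn’t eliminate the stall entirely (the browser still waits for the CDN request to fail), but it keeps an outage elsewhere from breaking your page outright.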

What do you think? Does offloading lead to a faster and more usefully integrated web? Or to a house of cards, ready to topple at the first server outage anywhere?