What's wrong with Netmag's "Optimize your JavaScript" post

Update: The original post on Netmag has been updated since this was written.

I tweeted earlier that this should be retracted. Generally, these performance-related articles are little more than linkbait -- there are perhaps an infinite number of things you should do to improve a page's performance before worrying about the purported perf hit of multiplication vs. division -- but this post went further than most in this genre: it offered patently inaccurate and misleading advice.

Here are a few examples, assembled by some people who actually know what they're talking about (largely Rick Waldron and Ben Alman, with some help from myself and several others from the place that shall be unnamed).

Things that are just plain wrong

  • Calling array.push() five times in a row will never be a "performance improvement." The author has clearly confused creating an array literal ["foo", "bar", "baz"] and then using .join("") on it with creating an array, pushing individual items, and then joining (working versions of both appear in the sketch after this list). See here for the proof, and see here for a possible explanation.

  • The author sets up a for loop as follows: for(var i = 0; length = 999; i <= length; i++){. This results in a syntax error.

  • The author suggests for(var i = my_array.length; i--) as a shorter version of a for loop. While you can get by with using the decrement as the conditional, omitting the semicolon at the end causes a syntax error. If someone were to move the semicolon to before the decrement, it would cause an infinite loop. And if you were ever to indulge in this style of cleverness, a while loop looks much more sane: var i = my_array.length; while( i-- ) {}.

  • Because JavaScript lacks block scope, variables declared inside blocks are not local to any block. The variable declaration is actually "hoisted" to the beginning of the nearest execution context (function body or global scope) such that var foo; for(...) { foo = 1; } behaves exactly the same as for(...) { var foo = 1; }. It is, however, considered bad practice to declare variables inside of blocks, because novice JavaScript developers infer from it that JavaScript has block scope.

  • Creating a variable to hold the length property in a for loop is typically no faster (and sometimes slower) in Chrome. Making it faster in Firefox doesn't make it magically "faster" everywhere.
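
To make the corrections above concrete, here is a minimal sketch of the patterns in question (variable names are illustrative; the point is correctness, not benchmarking):

// Building a string: an array literal plus join vs. pushing items one at
// a time. Both work; neither is inherently a "performance improvement"
// over the other.
var joined = ["foo", "bar", "baz"].join("");
var parts = [];
parts.push("foo");
parts.push("bar");
parts.push("baz");
var pushed = parts.join("");

// A reverse loop with the decrement as the condition. The trailing
// semicolon is required; for(var i = my_array.length; i--) is a syntax
// error.
var my_array = ["a", "b", "c"];
for (var i = my_array.length; i--;) {
  console.log(my_array[i]);
}

// The same cleverness reads more sanely as a while loop:
var j = my_array.length;
while (j--) {
  console.log(my_array[j]);
}

// Hoisting: the var declaration moves to the top of the enclosing
// function (or global) scope, so these two forms behave identically.
var foo;
for (var k = 0; k < 3; k++) { foo = 1; }
for (var m = 0; m < 3; m++) { var bar = 1; }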

Things that demonstrate a lack of actual expertise in the subject matter

  • The article mentions DOM perf almost as an afterthought. When it comes to perf, rarely is JavaScript the problem -- DOM reflows are a much more likely culprit for performance issues, but even just modifying elements via the DOM API in memory is still slow. This is one of many reasons people use client-side templates, which the author does not mention. All of that said, if you're looking for real performance gains, take a long, hard look at your HTTP overhead. Chances are that optimizing an image or two will make more difference than any improvements you can make to your JavaScript.

  • The author talks about avoiding unnecessary method calls within conditionals, and then demonstrates by accessing a property, rather than calling a method.

  • The author talks about local vs global variables, and then proceeds to demonstrate the concept with an instance property, not a local variable.

  • The author uses the global Array constructor function to create a new array, rather than using the Array literal [] syntax, which is, itself, known to be a performance improvement.

  • The section "Calling methods or properties on array elements" is true, but somewhat misses the point: lookups of any sort should be stored in a variable, rather than being repeated. This guidance has nothing to do with array elements; it's just as true for the result of a function call.

  • In the "encapsulating methods within class declarations" example, the author fails to point out that there are really 2 performance metrics to be tested: 1) cost while creating a new instance object and 2) cost while dereferencing a method from an instance object. The provided example only discusses #1 without addressing #2. What's interesting is that, typically, #1 and #2 are mutually exclusive. When methods are defined directly on an instance object, they consume more memory and cycles (because a new function object must be created for each instance object). When methods are defined on the prototype however, they must only be defined once, so the initial overhead is less, but each time the method is dereferenced from the instance object, the cost is slightly higher because the prototype chain must be traversed. A compromise can be had whereby a previously-defined function is defined as a method on each individual instance, thus reducing memory usage, but the cost of initially assigning the method probably won't offset the cost of accessing the method via the prototype down the road.

  • The "encapsulating methods within class declarations" section also lacks an understanding of semantic, language level behaviour and use cases for instance properties and methods, and prototype properties and methods. Comparing instance properties and methods with prototype properties and methods is like comparing a knife with a stealth fighter jet -- yes they can both be lethal, but one is far more efficient and suited to larger scale tasks and the other is suited to individual, "case by case" (ie. instance) tasks. Additionally, any time that data properties are defined on a prototype, it's likely a mistake that will result in bugs.

  • The section under the heading "Unnecessary method calls within conditionals in For Loops" gives an egregiously absurd example that itself misses the point that should've been made: operations in loops have costs; be mindful of this. In addition to this mistake, there is actually no such thing as a "conditional in For Loops"; the author is simply unaware that the actual name is "Expression", as in... IterationStatement, optional Expression.

  • The author advises using Firebug for profiling code, but provides absolutely zero guidance on how to do this, or even a link to any guidance. This article offers step-by-step guidance on using the Chrome developer tools to identify performance bottlenecks -- which, as we mentioned, you should only do after you've done a whole lot of other things you can do to improve your page's performance. The YSlow page offers a good list of things to do to improve page performance.
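
A hedged sketch of two of the points above -- caching repeated lookups, and the instance-versus-prototype tradeoff. Dog and speak are made-up names for illustration:

// Caching a repeated lookup applies to any lookup, not just array
// elements:
var items = ["a", "b", "c"];
var first = items[0];              // look it up once...
console.log(first.toUpperCase());  // ...then reuse the variable
console.log(first + "!");

// Methods on each instance: a new function object per instance (more
// memory and work at construction time), but no prototype chain to
// traverse on access.
function DogA(name) {
  this.name = name;
  this.speak = function () { return this.name + " woofs"; };
}

// Methods on the prototype: defined once and shared (cheaper to
// construct), but each access walks the prototype chain.
function DogB(name) {
  this.name = name;
}
DogB.prototype.speak = function () { return this.name + " woofs"; };

// The compromise described above: define the function once, then assign
// it to each instance as an own property.
function speak() { return this.name + " woofs"; }
function DogC(name) {
  this.name = name;
  this.speak = speak;
}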

In Summary

Smart people spent their time correcting this article, and this sucks. When misleading and inaccurate content gets published by a prominent site, it falls to a busy community to use its time -- a scarce resource -- to correct said content. Publications that distribute such content should be embarrassed, and while authors who create it should perhaps be commended for their efforts, they should strongly consider enlisting a knowledgeable technical reviewer in the future. Goodness knows the publication won't cover their ass for them.

Want to know more? See some debunktastic perf tests :)

@rmurphey
Author

@JamieMason I ask again: can you please take the time to create a fork of this gist and adjust it to how you think it should have been stated? I appreciate the "be nice" advice, but I'd like to see how that would change this post. More to the point, I'd like to see people critical of this post -- which took a non-trivial amount of time to write -- spend a bit of time of their own to make it better, rather than just writing comments about how it should have been better.

@JamieMason

@rmurphey and good for you, absolutely. But it seems silly if I'm honest to fork an article just for me to be able to justify promoting positive discourse. And you don't need me to; I've seen you speak - you seem really nice! I just asked for some consideration for the guy's feelings - is that so bad?! :)

@addyosmani

If I were the author, as much as it might hurt to be publicly corrected for an article containing a number of inaccuracies, I would be extremely grateful for Rebecca, Ben and Rick taking their time to point out the errors in my post. I disagree with those who say the tone of this response could have been friendlier or nicer. It was needed to hit home an important point (one which I learned over time).

When you're writing about a topic, whether it's on your personal blog or for a popular publication, you have the potential to influence how many other people interpret that topic for the long term. Imagine if even a few hundred developers began following the initial (uncorrected) advice and used it in production-level systems. Those perceived optimizations could easily have negative consequences.

Every single developer writing material that's going to be published should have at least one other person with more experience than they have look over it. I've written for Netmag and many others and I don't care if the publication doesn't offer technical reviews. Yes, they should, but as a teacher, it's your responsibility to make sure you aren't spreading misinformation (I say this having made this mistake in the past myself). It might mean your article is delayed for a week or two, but it is so much better getting a handful of people to look at your content beforehand.

At minimum, they'll spot the errors Rebecca pointed out in her response and, at most, they may even have further input to offer which could improve the article.

NetMag: I would recommend following the same model Smashing Magazine do. They don't always get it right, but they have a panel of technical reviewers (including myself, Lea Verou and others) who read through articles and offer corrections and guidance at least a week before they go live. That might help lower the number of technically inaccurate articles going out.

@rmurphey
Author

@JamieMason My reading of this is that you wish that I'd taken the time to make this nicer, but you aren't willing to take the same time. So, that's frustrating to me, but perhaps it means you understand better why I'm not inclined to take the (additional) time either.

@JamieMason

Ok so I feel like I've offended you and that really wasn't my intention, I just wanted to say "I agree, but think about the guy's feelings" and leave it at that (I care about the community, not the gist).

@addyosmani made some valid points about the importance of articles being correct so I concede, and apologise if you were offended.

@danheberden

I'm still waiting to see the nice fork of this from those who feel that being succinct and to the point is aggressive and not nice - an example would help show what it is y'all expected.

@rmurphey
Author

@Codery94 I appreciate your feedback, and I will ask you just like I asked above: please, please, seriously please, fork this gist and show me how it should have been written instead.

@rmurphey
Author

@codery94 thank you. Note though that one of your examples is from a comment, and not from the actual article.

@danheberden

@AdamBowman - so the fact that people that have jobs teaching JavaScript professionally to other professionals, consulting for large companies or otherwise shaping the future of JavaScript and the web took their personal time to do a tech review of an article that a FOR PROFIT company published for the sake of helping less-knowledgeable people not get confused - is beyond arrogant? Seriously?

@lennym

lennym commented Jul 11, 2012

I forked it to remove some of what I saw as the harsher language - https://gist.github.com/3091497

I don't have a problem with tearing NetMag a new one - they're a for-profit publication selling misinformation to the community; they should be better than this - but I agree that there's no need to be quite so mean to the author themselves.

@ktiedt

ktiedt commented Jul 11, 2012

@Codery94, "We all learnt somewhere and we all made mistakes at some point." is true, however I would hazard a guess that the majority of our mistakes did not get published as facts and good practices on a major website for all to learn our mistakes (note: not learn from them...).

@davemo

davemo commented Jul 11, 2012

As requested, I've posted a completely updated fork that removes the derisive language and maintains what I believe to be @rmurphey's desired effect: an informative look at some corrections in an effort to prevent the spread of inaccurate information. I think the advice that was given to potential authors about having community leaders proofread technical articles is a great suggestion. I would also encourage you to seek the same guidance from others when authoring posts like this to avoid unnecessarily painting yourselves in a negative light.

As a final note, thanks for putting your own time and effort into this message. The heart was good; it just needed some proofreading to point it in the right direction :)

Cheers,

https://gist.github.com/3093034

Edit: Also, I would be more than happy to spend some of my own time to do some "tone proofreading" on any efforts like this in the future. Shoot me an email or hit me up on Twitter!

@justinbmeyer

I'm too lazy to confirm, but I'm pretty sure calling push 5 times in a row with a large string will be faster in IE6 and possibly 7. We changed EJS to do this 3 years ago and it was much faster. The reason, I believe, was that IE would do unnecessarily expensive string copying every time += was encountered.

And, if chrome / ff / opera / safari are "fast enough" we optimize for IE.
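
A minimal sketch of the buffer pattern being described -- accumulate pieces and join once, instead of repeated += (the old-IE timings are the commenter's claim, not verified here):

var items = ["a", "b", "c"];
var buf = [];
for (var i = 0; i < items.length; i++) {
  buf.push("<li>" + items[i] + "</li>");
}
var html = buf.join("");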

@MattRogish

IE6 and IE7 are dead. Any optimizations for them are a complete waste of time. You might as well optimize for IE4 on Mac.

http://www.ie6countdown.com/
http://theie7countdown.com/

@danheberden

@justinbmeyer faster than ['string1', 'string2', 'string3', 'string4'].join('')? Also, why would you optimize for <15% browser market share?

@justinbmeyer

@danheberden ... I was only talking about when you can't know what the final string should look like immediately. If you can do:

['string1', 'string2', 'string3', 'string4'].join('')

why not just do

'string1string2string3string4'

and get the best performance possible. :-)

@MattRogish + @danheberden

Also, why would you optimize for <15% browser market share?
IE6 and IE7 are dead.

This was 3 years ago. I no longer develop much for IE6 thankfully. Yet, a lot of big companies are still on IE6+7. And if you're a big company, 5% more potential revenue is still something to target.

Push at some point was faster. And it still seems slightly faster in FF: http://jsperf.com/are-you-kidding-me/4. Labeling using push as "just plain wrong" isn't entirely accurate. It was right, and maybe if you are doing FF-only work it's still right. If you are having to support IE6 (and maybe 7) it still might be better to use push in some places, simply because of how terrible += is.

This all gets into a gray area.

@danheberden

@justinbmeyer - I've seen a lot of times people use the array format to make long strings fit well on the screen. As for < IE8, you'd want to do arr[ arr.length ] = "moar strings" instead of arr.push( "moar strings" ) - so in the case of boasting .push() as THE best way, that statement is indeed "just plain wrong". But I'd wager that 5% of your customers having a slower experience on the shitty browser that gives them a shitty experience everywhere else is tolerable.

However, while your points are valid in targeting browsers, this article wasn't called "performance tips for your IE6/7 targeted app".

@justinbmeyer

@danheberden

They wouldn't have a terrible experience anywhere else. If something has good performance on IE, it looks amazing in Chrome.

"performance tips for your IE6/7 targeted app".

This post could still include these caveats. arr[ arr.length ] = "moar strings" should probably be part of the critique. But, it depends on how you define the "best way" across multiple browsers. If pressed, I'd use Amdahl's law factoring in browser share, but favor the future. Consider two different ways:

way 1

  • IE: 5 seconds
  • Chrome: 0.4 s

way 2

  • IE: 1s
  • Chrome: 0.5 s

If IE has 10% market share vs Chrome's 90%, you could calculate total slowness:

Way 1: (5 * .1 = .5) + (.4 * .9 = .36) = .86
Way 2: (1 * .1 = .1) + (.5 * .9 = .45) = .55

Using Amdahl, Way 2 would be faster for "humanity". But, it might not matter because you couldn't care less about the savages using IE. For us, 3 years ago, this equation seemed to favor push (or arr[arr.length]). To prove += is better than .push (or even arr.length), it might be better to include browser share info. Maybe as an improvement to jsperf.
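
The same weighting, as a small sketch (the browser-share numbers are the hypothetical ones above):

// Weighted "total slowness" across browsers:
function weightedCost(costs, shares) {
  var total = 0;
  for (var browser in costs) {
    total += costs[browser] * shares[browser];
  }
  return total;
}

var shares = { ie: 0.1, chrome: 0.9 };
weightedCost({ ie: 5, chrome: 0.4 }, shares); // way 1: ~0.86
weightedCost({ ie: 1, chrome: 0.5 }, shares); // way 2: ~0.55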

@cristobal

String concatenation is generally slow in most languages, which makes sense once you know how the internals work.
Say:

var a = "";
for (var i = 0; i < 10000; i++) {
    a += "Some random String…";
}

This requires that the string on the left side be expanded every time a string is appended by the concatenation operation; the length of the string on the right side is evaluated before the string on the left is expanded and the right-hand value appended.

Instead, when joining an array of strings -- which is basically a set of pointers to strings -- the engine can evaluate the total length of the final string and join all the strings in the array in one operation/method (depending on the programming language), which is much faster.
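
For contrast, a sketch of the join version of the same loop -- collect the pieces, then let the engine size and build the final string in one pass:

var parts = [];
for (var i = 0; i < 10000; i++) {
  parts.push("Some random String…");
}
var a = parts.join("");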

Still, if you are working with large sections of strings due to manipulation of HTML or text content, why re-invent the wheel? There are enough libraries out there, like jQuery and others for WYSIWYG editing, that have already taken care of most performance and cross-browser issues…

@justinbmeyer

@cristobal - if you are writing something like http://embeddedjs.com or another template engine that does a lot of string concat, it's really important.

@cristobal

Sure @justinbmeyer, but the audience of the article was mainly web developers, not people developing template engines or http://embeddedjs.com/ like you say. I assume that you already know how stuff works in the background; I was just pointing out why it's slow.

Still, if you are writing a template engine or doing, e.g., Huffman encoding or general encoding of some stream in JavaScript (or Java or C++), you would avoid string concatenation since it's a costly (generally slow) operation in most languages, not just JavaScript…

Yes, the last two articles are Java, but string handling is mostly the same. The first article also points out "Ropes", which the last article presents as an alternative for string building in JavaScript.
There is even an implementation for node, https://github.com/josephg/jumprope, though it also states you need to work with strings longer than 5000 chars to see any performance boost.

Poor performance of string concatenation like that in IE has to do with IE's JavaScript engine implementation, which is slow compared to WebKit or Firefox. Hopefully IE10 will improve stuff a bit…

@tanepiper

If only we could do the same now with W3Schools - there is a lot of rubbish on there that can cause FUD in developers

@joeframbach

@tanepiper that does exist -- w3fools.com

@tanepiper

@joeframbach - true, but W3Schools content still shows up high in search rankings, which is so sad

@oliverlindberg

[Not a big fan of doing this publicly, but here we go. This is the email I sent @rmurphey last night:]

Thanks for your feedback on Joe Angus' JavaScript article we posted yesterday.

I'd like to take this opportunity to point out that, while the use of some techniques may be subjective, we of course always want to avoid factual errors and are very open to correcting things. As you may have noticed, Joe has now updated the article:

http://www.netmagazine.com/tutorials/optimise-your-javascript

I would have been delighted had you got in touch with me (or directly with Joe) last night before you posted a public response on GitHub. We do respond to criticism (and we try to move very fast to get an update live), but I fear that if it's handled so publicly, it will put younger developers off sharing their tips with the community. I've actually seen this happen quite a few times lately, and it's a real shame. By the looks of some comments on Twitter and your Gist post, I'm not alone in this opinion.

We're also currently looking for experts to help us peer review articles before they're published (both in print and online). In fact, this was one of the first things I started work on when I took the helm at .net a couple of months ago. However, as you know, well-regarded experts who are very active in the community and often speak at conferences, like you, are unfortunately yet understandably often too busy to write tutorials themselves or help out with peer reviews.

We always approach people of your knowledge and standing in the community to work with us. Sadly, our previous experience in working with such experts has not always been the best. It's okay to say 'no' due to work pressures, but we also had a fair few occasions where somebody agreed to write an article for us and then did not deliver and went quiet on us (Rick, I think you know what I'm talking about :)). As you can imagine, that leaves us in the lurch, especially when it comes to print deadlines.

This means that we have to approach people lower down in the food chain who may not have the same knowledge yet (or they approach us, as Joe did). It also means that, while technical accuracy is very important to us, errors will creep in from time to time if we can't get people knowledgeable enough to review our articles. That .net has a 'reckless disregard for accuracy' is strictly not true.

So, as a solution, I'd like to invite you to sit on our panel of peer reviewers. We've already signed up Stephanie Rieger, Christian Heilmann and Chris Mills. We're still working out the details but of course you'd at least get a credit on the article you have reviewed.

Anyway, let me know what you think. We'd love you to contribute to .net in whatever form. We're always learning and it's important to us to make your knowledge accessible to the wider community.

Cheers,

Oliver
Editor
.net

@rmurphey
Author

[OK, well, if we're going here, here's my reply.]

I appreciate you getting back to us so quickly, I appreciate that the
article has been updated, and I also appreciate that it would have
been more comfortable for you if this entire exchange had occurred
privately. I'm going to speak very frankly here: I feel it was very
important for it to occur publicly, as the web developer community
needs to know that content in your publication must be read with a
shaker full of salt. Handling this privately would have addressed this
single article, but would not have addressed the more systemic problem
of content being published and promoted without regard for its
accuracy.

I am well aware that there could be chilling effects associated with
calling out bad content, and I gave much thought to that fact before
publishing this. The fact is, I want people to second-guess
themselves before publishing content to such a large audience; if they
are unsure, then they should gain experience in lower-stakes
environments, and solicit feedback from people who know the subject
well. Being inexperienced is not a license to make stuff up.

As I've said repeatedly at this point, had Joe published this on his
personal blog, then a private conversation would have been the most
likely -- and most appropriate -- response. However, when it was
published on a well known site with a large audience, and promoted to
30,000 followers, the need to address this publicly was much more
clear. As I said here
https://gist.github.com/3086328#gistcomment-368792, these sorts of
egregiously inaccurate articles have a real impact on folks like us
who already spend a significant amount of time -- without compensation
-- assisting newcomers to web development.

If you are having difficulty identifying qualified authors and
reviewers, I would suggest the answer is one of economics: pay
attractive rates for good content and good review, and you will get
good content and good review. A subject-matter expert could reasonably
expect $500 to $1000 in exchange for the time required to write an
article for a for-profit publication; a reviewer could reasonably
expect $100-$200 for the time required to review it. Since few
publications are willing to pay these rates, few subject matter
experts are willing to provide these services, opting instead to
publish in a place where they can derive other benefits from their
efforts (and where they have control over the quality of the
surrounding content).

I do not know what your compensation structure is, but publications
that wish to avoid incidents such as the one that unfolded yesterday
do well to appreciate these economic considerations. It may well be
that the only viable business model requires that content be from
sub-par sources, because quality content is simply too expensive
relative to the money that can be extracted from it, and few subject
matter experts are willing to provide their time at a discount to a
for-profit entity. If a publication continues to distribute content
even when they can't reasonably ensure that the content is good, then
that publication should expect that the community will continue to
point out that content's flaws, and I don't see any reason to expect
the community to do so privately when the content in question is
public.

[At this point Oliver wrote back, saying that they didn't have trouble identifying quality authors, but they did have issues with follow-through by a few prominent authors. This was my reply.]

You may not have difficulty identifying quality authors, but you
clearly have issues with them following through. As someone who worked
on the night copy desk at a newspaper for five years, I take deadlines
very seriously, and it doesn't sit well with me when people miss them.
I'm sorry that has happened to you. However, I can also understand how
it might happen. While I can't speak to any individual case,
considering the billable rate of these folks, spending hours writing
quality content for less than $500 -- or hours reviewing an article
for less than $100 -- may not be at the top of their to-do list. That
may be the going rate, but it's not a rate that's going to incentivize
people who can easily bill upwards of $2,000 a day.

What I'm struggling with here is that you seem to expect community
members to work at discounted rates in order to ensure you have
quality content, and that when those community members aren't willing
to do that, your answer is to publish anyway. "Optimise your
JavaScript" never should have seen the light of day in its original
state. Just like being inexperienced isn't an excuse for making things
up, having difficulty getting quality content isn't a justification
for publishing things that are just plain wrong.

@ajpiano

ajpiano commented Jul 12, 2012

@oliverlindberg It's unclear to me why the job of making sure that the content in a for-profit publication backed by a publicly traded company is properly vetted and timely would fall to an ad-hoc, outside panel of experts who are essentially receiving a token honorarium at irregular intervals, rather than to a staff position.

@rwaldron

Since when does computer science involve the sparing of feelings over document accuracy? These folks would have no problem writing and publishing documents that debunked, refuted and rebuked information that was incorrect, inaccurate and/or provably wrong.

@shodanuk

I'm struggling to see anything in Rebecca's post that could be construed as hurtful or personally offensive. People are rightly pointing out that, as a profit-making organisation, .Net mag have a responsibility to ensure the accuracy of their articles, but following that train of thought, presumably the author of the original article also got paid and shares that responsibility to some extent? If an author is happy to put out an article without taking the personal responsibility to check facts and, furthermore, is happy to take a pay cheque for doing so, then as far as I'm concerned they're putting themselves in the firing line. Frankly, if someone's feelings are that easily hurt, they might want to avoid publishing anything else on the internet, because rightly or wrongly, the internet can be cruel and it will hurt your feelings.
