Laurent Bossavit (@Morendil)
Note: this content is reposted from my old Google Plus blog, which disappeared when Google took Plus down. It was originally published on 2016-05-16. My views and the way I express them may have evolved in the meantime, and I have likely revisited this investigation in a somewhat more rigorous manner since.

This is just how embarrassed I am for my entire profession

So here I was idly looking at Twitter, when Scott Nickell innocently poked me, regarding one more instance of the old "cost of defects" chestnut:

"I can't tell if this 'Systems Sciences Institute at IBM' thing is a new study, or just the same-old."

I was feeling lazy, so I encouraged Scott to apply the usual Leprechaun-hunting process: "Here's how you could tell: Google the exact phrase for a portion of the article citing it, then note the publication dates of the hits." ([Try it](https://www.google.com/search?q=%22cost+to+fix+an+er

Note: this content is reposted from my old Google Plus blog, which disappeared when Google took Plus down. It was originally published on 2016-05-18. My views and the way I express them may have evolved in the meantime. If you like this gist, though, take a look at Leprechauns of Software Engineering. (I have edited minor parts of this post for accuracy after having a few mistakes pointed out in the comments.)

Degrees of intellectual dishonesty

In the previous post, I said something along the lines of wanting to crawl into a hole when I encounter bullshit masquerading as empirical support for a claim, such as "defects cost more to fix the later you fix them".

It's fair to wonder why I should feel shame for my profession, and fair to ask who exactly I feel ashamed for. So let's drill a little deeper and dig into cases.

Before we do that, a disclaimer: I am not in the habit of judging people. In what follows, I only mean to condemn behaviours. Also, I gath

This is an old piece I wrote a few years ago (late 2012), which I don't think I ever published. I suspect I couldn't be bothered to write up a conclusion - what to conclude was pretty obvious anyway.

Research in the age of sampling - dissecting the Frankenpaper

Around Halloween I came across a paper titled "Cost Effective Software Test Metrics", by Lazic et al. It appears to have been published in "WSEAS Transactions on Computers", in June 2008.

WSEAS is an organization whose academic standing is a little difficult to investigate; searching for its name will lead you to a number of blogs, for instance, that vigorously denounce other organizations for creating bogus conferences, sending spam, or defrauding researchers. The Shakespearean phrase "the lady doth protest too much" comes to mind.

Plagiarism in the Google age

Morendil / nist-study.md - last active January 10, 2024
Can we bury the NIST study once and for all now?

(N.B. This is a blog post I wrote on Google+ in 2014, which has since disappeared from the Web.)


The NIST study concluded that "the impact of inadequate software testing infrastructure on the US economy was between 22.2 and 59.5 billion dollars".

As usual, people cite this figure as if it were an undisputed fact (you can find it, for instance, on a couple of Wikipedia pages). It's a good bet that they haven't read the original document carefully and critically. If they had, they might have noticed some red flags in the "study", and would at the very least hedge by emphasizing that it is an estimate.

There are two important aspects to any estimate: precision and accuracy.
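To make the distinction concrete, here is an illustrative sketch (the "true" cost of 40 billion is a made-up number for the example, not a figure from the study): an estimate can be stated with many significant digits yet be far from the truth, while a wide range like the NIST study's can contain the truth without pinning it down.

```python
# Illustrative only: a hypothetical "true" quantity and two kinds of estimates.
true_value = 40.0  # suppose, for the sake of argument, the real cost is 40 billion

# A precise but inaccurate estimate: five significant digits, far from the truth.
precise_inaccurate = 59.477

# An accurate but imprecise estimate: a wide range that happens to contain the truth.
imprecise_accurate = (22.2, 59.5)  # the NIST study's reported range, in billions

def within(value, interval):
    """Check whether a value falls inside a (low, high) interval."""
    low, high = interval
    return low <= value <= high

print(abs(precise_inaccurate - true_value))   # large error despite the many digits
print(within(true_value, imprecise_accurate))  # the range is wide enough to be "right"
```

The point of the sketch: precision is about how narrowly an estimate is stated, accuracy about how close it is to the truth, and a confident-sounding number can have the first without the second.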