Sir,
This article was accurate and explained the problem well to a layperson (I tested it on my spouse).
It missed one important bit - dependencies. All software today depends on other software. To give an example, when you start a project in the most popular language (Javascript) and add a couple of dependencies, you will find that you have transitively pulled in hundreds of other dependencies, written over years by thousands of independent developers.
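To make that concrete, here is a sketch of what such a project's manifest might look like (the package names and versions are illustrative, not taken from the article):

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.18.0",
    "react": "^18.2.0"
  }
}
```

Two direct dependencies are declared, yet running `npm ls --all` in a project like this typically lists hundreds of packages, because each dependency brings in its own dependencies, recursively.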
Each dependency is like a Lego block. We attach them together by the bits at the edge (the "interface") and assume that the bits in the middle work well. Most programmers spend most of their time attaching these disparate blocks together, hoping that everything works well. The main risk mitigation strategy we employ is using dependencies that everyone else is using. The thinking goes that someone else would surely have audited the code or found issues with it. This works... until it doesn't. A dependency might disappear altogether ("How one programmer broke the internet by deleting a tiny piece of code"), or it might inject malicious code into our application ("Hacker adds malicious bitcoin-stealing code to popular JavaScript library").
Most programmers also lack the knowledge and inclination to go looking for issues in this vast stack. We build skyscrapers of Lego, focused only on the top, and we pay attention to our dependencies only when the skyscraper starts swaying ominously.
And that's only our explicit dependencies. We also implicitly depend on our operating systems, databases and programming languages to adhere to their spec. If they don't, we don't have much recourse apart from telling the developers of these components and hoping for a fix.
This is not laziness: if every programmer actually tried to understand their entire stack, or rewrite it on their own, it might take decades, during which time the world would move on. Besides, our next feature is due to be shipped in 14 days.