After working with a fair number of programming languages, I feel the urge to write down my thoughts on the features they employ and how these affect the programmer's way of thinking. This is an opinion piece, but whenever I have discussed the topic with friends, old classmates, and colleagues, the large majority felt the same way.
By this I mean C89 in particular.
The language encourages its users to express their thoughts as directly and precisely as possible, while taking full responsibility for the consequences. The language rules are simple: it provides only the minimum you need to get started, and it doesn't hold your hand.
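That explicitness can be sketched with a small (hypothetical) C89 function. The contract is entirely on the surface: the caller owns the returned buffer and must free() it, and a NULL return means the allocation failed and it is the caller's job to react.

```c
#include <stdlib.h>
#include <string.h>

/* Sketch of C89's explicit contracts: the caller owns the
 * returned buffer and must free() it; NULL signals failure. */
char *dup_string(const char *s)
{
    char *copy = malloc(strlen(s) + 1);
    if (copy == NULL)
        return NULL;    /* no exception: failure is just a value */
    strcpy(copy, s);
    return copy;
}
```

Nothing here is hidden from you, and nothing is done for you either.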
We have taught the computer to think in the way we view the world. That's not a bad thing per se: it allows development tools to offer auto-completion, and if the compiler is good enough, it can translate the abstractions back to something equivalent to C code (which results in performance similar to hand-written C). However, most of these languages (Java, C#, etc.) compile to an intermediate form that still needs a runtime to make sense of it, which makes the program harder for the computer to execute directly.
Now we've given one programmer the power to dictate to another programmer what he can and can't say.
Exceptions are thrown when our code runs into an error, and the exception type defines the kind of error we get. We use exceptions for every error (which is an encouraged practice in Java, C#, etc).
Since we use exceptions for everything, an exception is no longer exceptional. It's common. That defeats the whole purpose of it being an exception, doesn't it? Also, why replace the proven method of returning an error code, which works perfectly well in C? If you want to return an error message alongside your data, why not use multiple return values like in Lua and Go ([0] for the error message, [1] for the data returned) instead? There are more elegant solutions that don't carry the runtime requirement that exceptions have.
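In C you can get the same effect with the return value as the status and an out-parameter for the data. The function below is a made-up example, but it shows the pattern: failure is an ordinary value you check, with no runtime machinery behind it.

```c
#include <errno.h>
#include <stdlib.h>

/* Error-code style: the return value carries the status,
 * the result travels through an out-parameter. */
int parse_port(const char *text, int *out)
{
    char *end;
    long value;

    errno = 0;
    value = strtol(text, &end, 10);
    if (errno != 0 || end == text || *end != '\0')
        return -1;              /* malformed input */
    if (value < 1 || value > 65535)
        return -2;              /* out of range */
    *out = (int)value;
    return 0;                   /* success */
}
```

The caller writes `if (parse_port(arg, &port) != 0)` and handles the failure right where it happens, instead of somewhere up the call stack.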
With exceptions, the try-catch block was introduced to handle them.
Like Yoda said: "Do or do not, there is no try". We either succeed or fail at executing the function properly; there is no in-between. Another problem is that it grows the code horizontally (through extra indentation) as well as vertically, and it's not explicit what you expect to happen. It also has the nasty side effect of leaving data in a corrupt state in certain circumstances, since an exception can abort an operation halfway through its mutations.
We now reach the point where we just shout at the computer what we want and leave it entirely up to the computer how to interpret it. We have no idea what the computer will do, beyond assumptions based on what we shouted at it. We no longer care about the exact data we use, only about how we can get the result we are looking for.
Having one place to grab all the additional tools you need is not a bad idea per se on the fundamental level, but because it made obtaining packages simpler, the mantra became to use packages as much as you can. They're called packages and not libraries for a reason; they're small and easily deployable, or so one would think. Packages use each other, resulting in transitive dependencies. You're now using hundreds of packages without knowing it, because that one package has a dependency on another, which has a - you get my point. Now imagine that one of those breaks. That would result in quite the disaster.
A good programmer knows the exact data he needs to handle, explains to the computer concisely how to manipulate it, takes responsibility when handling memory, and relies on someone else's code only when absolutely necessary. It feels downright insulting that modern languages assume I can't do anything on my own and dictate what I should say and do.