What's wrong with code in 2022? 🤷🏻‍♀️

Maria 🍦 Marshmallow - Dec 24 '22 - Dev Community

From time to time I use a certain service to upload files (its name doesn't matter, because, frankly, they are all the same). Basically, I point it at a folder on my hard disk, and its contents get copied to a remote server, which presumably does something database-related: the files are given names, and checks are made on who downloads them.

The service belongs to a large company, so its processes are large-scale. It probably gets attacked a lot, so some protection is required, as is a check that nobody has modified the files between upload from my PC and arrival on the server. I understand all this.

... but in essence the task boils down to this: register a few files, read them, upload them, then close the connection and write to a log whether everything went well, and if not, what exactly happened. There is nothing complicated about it. I once wrote similar code from scratch using the WinINet API on the client and PHP on a server backed by my MySQL database. Perhaps my system was not as reliable as an enterprise-level one, but it handled hundreds of thousands of uploaded files, along with their verification, downloading, and logging. That's a job for one coder for two or three weeks, isn't it?
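To show what I mean by "nothing complicated", here is a minimal sketch of that whole workflow in Python. It is an illustration, not my old system: the endpoint URL, the checksum header name, and the log file name are placeholders I made up.

```python
# Sketch of the workflow above: read each file, upload it with a checksum
# so the server can verify nothing was modified in transit, then log
# whether everything went well (and if not, what exactly happened).
import hashlib
import logging
import urllib.request
from pathlib import Path

logging.basicConfig(filename="upload.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def upload_file(path: Path, url: str) -> None:
    data = path.read_bytes()
    digest = hashlib.sha256(data).hexdigest()  # server re-hashes to detect tampering
    req = urllib.request.Request(
        url, data=data, method="PUT",
        headers={"Content-Type": "application/octet-stream",
                 "X-Checksum-SHA256": digest},  # hypothetical header name
    )
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            logging.info("%s: %d bytes uploaded, HTTP %d",
                         path.name, len(data), resp.status)
    except OSError as exc:  # URLError and friends are OSError subclasses
        logging.error("%s: upload failed: %s", path.name, exc)

# Point it at a folder, exactly like the tool I'm describing.
for f in sorted(Path("outbox").iterdir()):
    if f.is_file():
        upload_file(f, "https://example.com/upload")  # placeholder endpoint
```

A real deployment would add authentication and retries on top, but that is a matter of dozens of lines, not thousands of files.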

The special file-upload tool I use today weighs in at 230 MB of client files and uses 2,700 separate files to manage this process.

You might think that's a typo, but there's no mistake: two thousand seven hundred files and 230 MB of executables and supporting files to copy a few files from a client to a server. This is no longer bloatware or overengineering; it is absolute, obvious, visible madness.

The problem is that this uploader is most likely no different from any similar modern software made by any other large company. And by the way, right now it is throwing error messages and not working at all.

I have seen coders doing this. I know how it goes. It's not just that coders don't write efficient low-level code to achieve their goal: most of them have never even seen efficient, well-written low-level code. How can we expect them to create something better if they don't realize it's possible?

You can write a program that uploads files to a server securely, quickly, and reliably in a twelfth of that amount of code. It could be a single file, one small .exe. It doesn't need hundreds of DLLs. Not only is this possible, it's easy, and the result is more reliable, more efficient, more convenient to debug, and it actually works.

You might think that old programmers in their fifties (like my father) complain about bloated code because they are obsolete and grouchy, and I get that. But the obsolete and grouchy complain about code that is 50% slower than it should be, or 50% larger than it should be. The situation has gone far beyond that. We've reached a point where I sincerely believe that 99.9% of the code in the files on our PCs is completely useless and never even executes. The code just sits there in a package of 65 DLLs because the coder wanted to do something trivial, like store a bitmap, had no idea how easy that could be, and imported a whole pile of bloatware to solve the problem (see the sketch below).
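Since I brought up the bitmap example: here is a hedged sketch of how little "store a bitmap" can take. It writes an uncompressed 24-bit BMP using nothing but Python's standard library; the pixel data and file name are invented for the example.

```python
# Write an uncompressed 24-bit BMP: a 14-byte file header, a 40-byte
# BITMAPINFOHEADER, then bottom-up rows of BGR pixels padded to 4 bytes.
import struct

def write_bmp(path, pixels):
    height = len(pixels)
    width = len(pixels[0])
    row_pad = (4 - (width * 3) % 4) % 4
    image_size = (width * 3 + row_pad) * height
    with open(path, "wb") as f:
        # BITMAPFILEHEADER: signature, file size, reserved, pixel-data offset
        f.write(struct.pack("<2sIHHI", b"BM", 54 + image_size, 0, 0, 54))
        # BITMAPINFOHEADER: size, dimensions, 1 plane, 24 bpp, no compression
        f.write(struct.pack("<IiiHHIIiiII", 40, width, height, 1, 24, 0,
                            image_size, 2835, 2835, 0, 0))
        for row in reversed(pixels):       # BMP stores rows bottom-up...
            for r, g, b in row:
                f.write(bytes((b, g, r)))  # ...and channels as BGR
            f.write(b"\x00" * row_pad)

# A 2x2 test image: red, green / blue, white.
write_bmp("tiny.bmp", [[(255, 0, 0), (0, 255, 0)],
                       [(0, 0, 255), (255, 255, 255)]])
```

That is a complete, viewable file in about twenty lines, with no DLLs involved.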

Like I said, I really shouldn't get mad at young programmers for this. That is how they were taught. They have no idea what high performance or development under constraints looks like. It may seem strange that a 25-year-old girl is talking about this, but I had enough wise mentors to show me really beautiful code. My father told me that the first Elite in 1984 had a huge galaxy, 3D space combat, a career progression system, trading, and thousands of planets to explore, and the whole game fit in 64 KB. Modern programmers may hear this, but they don't grasp the gulf between that and what we have today.


Why is this important to me?

This worries me for many reasons, not least because if you need two thousand times more code to complete a task, the result should at least work. But more importantly, I realize that 99.9% of the CPU time on my huge, powerful PC is completely wasted. Computers today are so fast that ten years ago they would have been regarded as absolute magic. Anything you can imagine should happen in 1/60th of a second. Yet when I press the volume icon on my Microsoft Surface laptop, I see a delay: the machine gradually creates a new UI element, figures out which icons to draw, and only then do they appear and become interactive. It takes about half a second, which is an eternity on a processor's time scale: well over a billion clock cycles.

If right now (conservatively) 99% of our PC's resources are wasted, then we are wasting 99% of the computer's energy. That is absolutely criminal. And what is it all spent on? Open the task manager and you'll see a pile of bloated software doing god knows what. I'm just typing this blog post, and Windows has 102 background processes running. My NVIDIA graphics card alone owns six of them, and some of those have subtasks. To do what? I'm not playing a game; I'm using almost the same set of video-driver functions my father did twenty years ago, yet for some reason it takes six processes.

Microsoft Edge WebView also needs six processes, just like Microsoft Edge itself. And I don't even use Edge. Apparently I opened an SVG file yesterday, and there you go: twelve useless pieces of code now sit around wasting memory and probably polling the CPU as well.

This is absolute madness. It is the reason nothing works, everything is slow, and you have to buy a new smartphone every year and a new TV to run bloated streaming apps that hide equally bad code.

Personally, I think it will only get worse, because big tech companies like Facebook, Twitter, and Reddit are the worst offenders in this trend. Soon each of the thousands of "programmers" working at these companies will be using machine learning to copy-paste bloated, buggy, sprawling GitHub stuff into their code. Just to add two numbers, they'll need 32 DLLs, 16 Windows services, and a billion lines of code.

Twitter has 2,000 developers. More precisely, it did until Elon Musk came along. TweetDeck sometimes refuses to load the user column, and this has been going on for four years now. I'm sure none of the coders there has any idea why it happens. And the code at its core, as my dad says, is just a bunch of bloated, copy-pasted ****.

When suggesting a post title from a link, Reddit can't handle the ampersand, the semicolon, or the pound symbol. It's 2022. The company probably also has 2,000 developers, and apparently none of them can get a text parser to work correctly. What are all these people doing?
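I obviously haven't seen Reddit's code, so this is a guess, but the symptom reads like HTML entities in the page title never being decoded. If that is the cause, the fix fits in one standard-library call; the title string below is my own example:

```python
# A scraped page title often arrives with HTML entities still encoded;
# decoding them before suggesting it as a post title is a single call.
import html

raw_title = "Fish &amp; Chips for &#163;5&#59; worth it&#63;"
print(html.unescape(raw_title))  # Fish & Chips for £5; worth it?
```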


Once upon a time, there was a “golden age” of programming when there were limits on memory and CPU. Today, we live in an ultra-wasteful pit of inefficiency. This is very sad.

Thanks for reading! I hope you found my reflections interesting and that you now have some questions to ponder. Feel free to leave a comment and tell me whether you agree.
