There is a service I use that occasionally means I have to upload some files somewhere (who it is does not matter, as frankly they are all the same). This is basically a simple case of pointing at a folder on my hard drive and copying the contents onto a remote server, where they probably do some database-related stuff to assign that bunch of files a name, and verify who downloads it.
It’s a big company, so they have big processes, and probably get hacked a lot, so there is some security required, and also some verification that the files are not tampered with between me uploading them and them receiving them. I get that.
…but basically we are talking about enumerating some files, reading them, uploading them, and then closing the connection with a log file saying whether it worked, and if not, what went wrong. This is not rocket science, and in fact I’ve written code like this from absolute scratch myself, using the WinINet API and PHP on a server talking to a MySQL database. My stuff was probably not as robust as enterprise-level software, but it did support hundreds of thousands of uploaded files (GSB challenge data), plus verification, download, and logging of them all. It was one coder working for maybe 2 or 3 weeks.
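Just to show the scale of the thing, here is a rough sketch of what the client side of such a tool can look like, in plain C using those same WinINet calls. This is an illustration, not the actual code I wrote back then: the server name, credentials, and folder are made up, and it uses plain FTP where a real version would want an encrypted transport and checksums. But it compiles to one small exe with zero bundled DLLs:

```c
// Minimal sketch of a folder uploader using WinINet (illustrative only:
// server, login, and paths are placeholders, and a production version
// would use an encrypted transport and verify file hashes).
#include <windows.h>
#include <wininet.h>
#include <stdio.h>
#pragma comment(lib, "wininet.lib")

int main(void)
{
    HINTERNET session = InternetOpenW(L"MiniUploader", INTERNET_OPEN_TYPE_PRECONFIG, NULL, NULL, 0);
    if (!session) { fprintf(stderr, "InternetOpen failed: %lu\n", GetLastError()); return 1; }

    HINTERNET ftp = InternetConnectW(session, L"uploads.example.com", INTERNET_DEFAULT_FTP_PORT,
                                     L"username", L"password", INTERNET_SERVICE_FTP,
                                     INTERNET_FLAG_PASSIVE, 0);
    if (!ftp) { fprintf(stderr, "InternetConnect failed: %lu\n", GetLastError()); InternetCloseHandle(session); return 1; }

    // Enumerate the folder, upload each file, and log success or failure.
    WIN32_FIND_DATAW fd;
    HANDLE find = FindFirstFileW(L"C:\\upload_me\\*", &fd);
    if (find != INVALID_HANDLE_VALUE) {
        do {
            if (!(fd.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY)) {
                wchar_t local[MAX_PATH];
                swprintf(local, MAX_PATH, L"C:\\upload_me\\%ls", fd.cFileName);
                if (FtpPutFileW(ftp, local, fd.cFileName, FTP_TRANSFER_TYPE_BINARY, 0))
                    wprintf(L"OK    %ls\n", fd.cFileName);
                else
                    wprintf(L"FAIL  %ls (error %lu)\n", fd.cFileName, GetLastError());
            }
        } while (FindNextFileW(find, &fd));
        FindClose(find);
    }

    InternetCloseHandle(ftp);
    InternetCloseHandle(session);
    return 0;
}
```

That’s the whole program: enumerate, upload, log, disconnect. One file, a few kilobytes compiled, and every line of it actually runs.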
The special upload tool I had to use today was a total of 230MB of client files, and involved 2,700 different files to manage this process.
You might think that’s an embarrassing typo, so I’ll be clear. TWO THOUSAND SEVEN HUNDRED FILES and 230MB of executables and supporting crap, to copy some files from a client to a server. This is beyond bloatware, this is beyond over-engineering, this is absolutely, totally and utterly, provably, obviously, demonstrably ridiculous and insane.
The thing is… I suspect this uploader is no different to any other such software from any other large company these days. Oh, and BTW, it gives error messages, and right now, it doesn’t work. Sigh.
I’ve seen coders do this. I know how this happens. It happens because not only are the coders not writing low-level, efficient code to achieve their goal, they have never even SEEN low-level, efficient, well-written code. How can we expect them to do anything better when they do not even understand that it is possible?
You can write a program that uploads files securely, rapidly, and safely to a server in less than a twentieth of that amount of code. It can be a SINGLE file, just a single little exe. It does not need hundreds and hundreds of DLLs. It’s not only possible, it’s easy, and it’s more reliable, more efficient, easier to debug, and… let me labor this point a bit… it will actually work.
Code bloat sounds like something that grumpy old programmers in their fifties (like me) make a big deal out of, because we are grumpy and old and also grumpy. I get that. But us being old and grumpy means complaining when code runs 50% slower than it should, or is 50% too big. This is way, way, way beyond that. We are at the point where I honestly do believe that 99.9% of the code in files on your PC is absolutely useless and is never even fucking executed. It’s just there, in a suite of 65 DLLs, all because some coder wanted to do something trivial, like save out a bitmap, and had *no idea how easy that is*, so they just imported an entire bucketful of bloatware crap to achieve it.
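And since I mentioned saving out a bitmap: here is roughly how easy it actually is. A sketch in plain C, no libraries at all; it assumes you already have a raw top-down 24-bit RGB buffer (the function name and that assumption are mine):

```c
// Write an uncompressed 24-bit BMP from a raw RGB buffer. No libraries,
// no DLLs: a BMP is just a 14-byte file header, a 40-byte info header,
// and the pixel rows, padded to 4 bytes and stored bottom-up in BGR order.
#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

static int save_bmp(const char *path, const uint8_t *rgb, int width, int height)
{
    int row_size = (width * 3 + 3) & ~3;            // each row padded to a 4-byte boundary
    uint32_t pixel_bytes = (uint32_t)row_size * (uint32_t)height;
    uint8_t header[54] = { 'B', 'M' };              // everything else starts zeroed

    #define PUT16(off, v) do { header[off] = (v) & 0xFF; header[(off)+1] = ((v) >> 8) & 0xFF; } while (0)
    #define PUT32(off, v) do { PUT16(off, (v) & 0xFFFF); PUT16((off)+2, ((v) >> 16) & 0xFFFF); } while (0)
    PUT32(2, 54 + pixel_bytes);   // total file size
    PUT32(10, 54);                // offset to pixel data
    PUT32(14, 40);                // BITMAPINFOHEADER size
    PUT32(18, (uint32_t)width);
    PUT32(22, (uint32_t)height);  // positive height = bottom-up rows
    PUT16(26, 1);                 // colour planes
    PUT16(28, 24);                // bits per pixel
    PUT32(34, pixel_bytes);

    FILE *f = fopen(path, "wb");
    if (!f) return 0;
    fwrite(header, 1, 54, f);

    uint8_t *row = calloc(1, (size_t)row_size);     // padding bytes stay zero
    if (!row) { fclose(f); return 0; }
    for (int y = height - 1; y >= 0; y--) {         // the file wants the bottom row first
        const uint8_t *src = rgb + (size_t)y * (size_t)width * 3;
        for (int x = 0; x < width; x++) {           // the file wants BGR, not RGB
            row[x * 3 + 0] = src[x * 3 + 2];
            row[x * 3 + 1] = src[x * 3 + 1];
            row[x * 3 + 2] = src[x * 3 + 0];
        }
        fwrite(row, 1, (size_t)row_size, f);
    }
    free(row);
    fclose(f);
    return 1;
}
```

Forty-odd lines, and half of those are comments and padding arithmetic. That is the thing people pull in a 65-DLL image library to do.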
Like I say, I really should not be annoyed at young programmers for doing this. It’s what they learned. They have no idea what high-performance or constraint-based development is. When you tell them that the original game Elite had a sprawling galaxy, 3D space combat, a career progression system, trading, and thousands of planets to explore, all in 64k, I guess they HEAR you, but they don’t REALLY understand the gap between that and what we have now.
Why do I care?
I care for a ton of reasons, not least the fact that if you need two thousand times as much code as usual to achieve a thing, it should at least work. But more importantly, I am aware that 99.9% of the processor time on this huge stonking PC is utterly wasted. It’s carrying out billions of operations per second just to sit still. My PC should be in super-ultra low power mode right now, with all the fans off, in utter silence, because all that’s happening is some spellchecking as I type in WordPress.
Ha. WordPress.
Computers are so fast these days that you should be able to consider them absolute magic. Everything you could possibly imagine should happen within a sixtieth of a second, one tick of the screen’s refresh rate. And yet, when I click the volume icon on my Microsoft Surface laptop (pretty new), there is a VISIBLE DELAY as the machine gradually builds up a new user interface element, eventually works out what icons to draw, and has them pop in and go live. It takes ACTUAL TIME. I suspect half a second, which in CPU time is like a billion fucking years (at 4GHz, half a second is two billion clock cycles, all spent drawing a volume slider).
If I’m right and we have (conservatively) 99% wastage on our PCs, then we are wasting 99% of the computer’s energy consumption too. This is beyond criminal. And to do what? I have no idea, but a quick look at Task Manager on my PC shows a metric fuckton of bloated crap doing god knows what. All I’m doing is typing this blog post. Windows has 102 background processes running. My Nvidia graphics card currently accounts for 6 of them, and some of those have sub-tasks. To do what? I’m not running a game right now; I’m using about the same feature set from a video card driver as I would have done TWENTY years ago, but 6 processes are required.
Microsoft Edge WebView has 6 processes too, and so does Microsoft Edge itself. I don’t even use Microsoft Edge. I think I opened an SVG file in it yesterday, and here we are: another 12 useless pieces of code wasting memory, and probably polling the CPU as well.
This is utter, utter madness. It’s why nothing seems to work, why everything is slow, why you need a new phone every year, and a new TV to load those bloated streaming apps, which must be running code just as bad.
I honestly think it’s only going to get worse, because the big, dumb, useless tech companies like Facebook, Twitter, Reddit, etc. are the worst possible examples of this trend. Soon every one of the inexplicable thousands of ‘programmers’ employed at these places will just be using machine learning to copy-paste bloated, buggy, sprawling crap from GitHub into their code as they type. A simple attempt to add two numbers together will eventually involve 32 DLLs, 16 Windows services, and a billion lines of code.
Twitter has two thousand developers. TweetDeck randomly just fails to load a user column. It’s done it for four years now. I bet none of the coders have any idea why it happens, and the code behind it is just a pile of bloated, copy-pasted bullshit.
Reddit, when suggesting a topic title from a link, cannot cope with an ampersand, a semicolon, or a pound symbol. It’s 2022. They probably have 2,000 developers too. None of them can make a text parser work, clearly. Why are all these people getting paid?
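For anyone who thinks I’m being unfair, decoding the handful of HTML entities a link-title scraper will ever meet is a beginner exercise. A toy sketch (mine, obviously, not Reddit’s actual code) that handles the common named forms plus decimal numeric ones like &#163; for the pound symbol:

```c
// Toy HTML entity decoder: handles &amp; &lt; &gt; &quot; and decimal
// numeric entities (e.g. &#163;). Works in place, since the decoded
// output is never longer than the input.
#include <stdio.h>
#include <string.h>

static void decode_entities(char *s)
{
    char *out = s;
    while (*s) {
        if (*s == '&') {
            if (strncmp(s, "&amp;", 5) == 0)  { *out++ = '&'; s += 5; continue; }
            if (strncmp(s, "&lt;", 4) == 0)   { *out++ = '<'; s += 4; continue; }
            if (strncmp(s, "&gt;", 4) == 0)   { *out++ = '>'; s += 4; continue; }
            if (strncmp(s, "&quot;", 6) == 0) { *out++ = '"'; s += 6; continue; }
            if (s[1] == '#') {                 // decimal numeric entity
                int code = 0;
                const char *p = s + 2;
                while (*p >= '0' && *p <= '9') code = code * 10 + (*p++ - '0');
                if (*p == ';' && code > 0 && code < 256) {
                    *out++ = (char)code;       // fine for the Latin-1 range; UTF-8 needs a few more lines
                    s = (char *)p + 1;
                    continue;
                }
            }
        }
        *out++ = *s++;
    }
    *out = '\0';
}

int main(void)
{
    char title[] = "Fish &amp; Chips for &#163;5";
    decode_entities(title);
    printf("%s\n", title);    // "Fish & Chips for £5" (byte 0xA3, on a Latin-1 console)
    return 0;
}
```

Thirty lines. Not a research problem, and not a reason to ship a broken suggestion box for years.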
There was a golden age of programming, back when you had actual limitations on memory and CPU. Now we just live in an ultra-wasteful pit of inefficiency. It’s just sad.