Dependence on computers

I started writing about something in the comment section, but I decided it’s relevant enough to make it an article.

The CrowdStrike event looks like a very mild example of something I’ve been worrying about for years, namely a widespread systemic persistent IT outage that puts payment systems worldwide out of commission.

Basically, everybody is using digital payment for everything these days, so what happens if it all goes out for some reason? Oh, you’ll use cash. You mean, the ATM is going to work? No, it isn’t. You mean, you have cash and will just use it? You mean, the cash register computer will not be afflicted, and the cashier will be willing to take your money without the ability to print out the invoice and register the transaction? Or will all the stores close until this is dealt with? In which case you will have to rely on whatever food and hygienic/medical supplies you have at your place, because you’ve been prepping? Oh wait, you’ve been prepping, but since nothing ever happened you consumed all the stuff and there isn’t any left? Yeah, that.

I mean, the first level of preparing for an IT outage is to have an air-gapped spare laptop stashed in some drawer, with a Linux/Windows dual boot in case one of those two is the cause of the failure. But the next question is: what do you connect to, if the cause of the problem is general? The telecoms are down, the banks are down, online services are down, AWS/Azure can’t process your credit card so it locks you out of your servers, GoDaddy is down so you can’t transfer your domains somewhere outside the afflicted area, or DNS is down so you can’t reach anything, or the satellites are down so Starlink doesn’t work. And let’s say it’s something really major, so the consequences take a long time to clear and there’s a serious breakdown of services everywhere.
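
To be fair, the narrowest slice of this is cheap to prepare for while everything still works. As a minimal sketch (the domain list is hypothetical; you’d pick your own), you could periodically snapshot the addresses of the services you care about into /etc/hosts format, so that a DNS failure alone doesn’t cut you off:

```python
# offline_dns_snapshot.py -- a minimal sketch: while DNS still works, record
# the addresses of critical services in /etc/hosts format for later use.
# The host list below is purely illustrative.
import socket
from datetime import date

CRITICAL_HOSTS = ["example.com", "www.example.org"]  # hypothetical list

def snapshot(hosts):
    lines = [f"# DNS snapshot taken {date.today().isoformat()}"]
    for name in hosts:
        try:
            # gethostbyname_ex returns (hostname, aliases, list of addresses)
            _, _, addresses = socket.gethostbyname_ex(name)
            for addr in addresses:
                lines.append(f"{addr}\t{name}")
        except socket.gaierror:
            lines.append(f"# could not resolve {name}")
    return "\n".join(lines)

if __name__ == "__main__":
    # append the output to /etc/hosts by hand if the day ever comes
    print(snapshot(CRITICAL_HOSTS))
```

Of course, that only covers the narrowest failure mode, DNS alone; if the telecoms and the services themselves are down, cached addresses won’t help you, which is exactly my point about everything failing together.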

The first answer everybody has to this is something along the lines of “it’s unlikely that all the computer systems will go out at once”. True, it’s unlikely, but it was also previously unseen that the enterprise Windows machines go out all at once and half the world gets instantly paralyzed. Those machines aren’t independent. Microsoft enforces push updates, and the big corporations have unified IT policies, which means updates get enforced on all their machines at once. Also, everybody seems to run Windows, which means it’s no longer necessary for an attack vector or a blunder to target billions of computers independently; a single failure can propagate from a single point and instantly take down enough of the network that the rest have nothing to connect to.

Also, there have recently been revelations that OpenSSL had severe vulnerabilities. The vast majority of Internet infrastructure uses OpenSSL. A systemic vulnerability that can be targeted everywhere means… you tell me.
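
If you want a feel for how deep the OpenSSL dependency goes, a trivial first step is to check which build your own tools link against; for instance, the Python standard library will tell you what its ssl module was compiled with (the version string in the comment is only an example):

```python
import ssl

# the exact OpenSSL build this Python links against,
# e.g. 'OpenSSL 3.0.13 30 Jan 2024'
print(ssl.OPENSSL_VERSION)

# the same information as a tuple, handy for comparing
# against a known-patched release
print(ssl.OPENSSL_VERSION_INFO)
```

Chances are that your web server, your mail server, your VPN and half the appliances on your network all report some flavour of that same library.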

Someone will say that people would adapt, and my answer is: what does that even mean? Every single store I’ve been in for the last decade or so uses bar-code readers to scan items, and then the computer pulls the item data, most notably the price, from the database, so that the cashier can charge you. More recently, all those computers are required to connect to the state tax service, where every bill needs to be “fiscalised” for taxation purposes. If the Internet fails, the cash register can’t “fiscalise” bills, and that’s going to be a problem. If the cash register is out (it’s always a Windows machine, and you saw what can happen to those, and it has to be connected to the Internet or the “fiscalisation” won’t work), the cashier won’t be able to tell how much the item you want to purchase costs, and thus won’t be able to charge you. They don’t have prices on items anymore, like they did in the ‘80s. Everything is in the database. The whole dependency chain looks something like the toy sketch below.
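
To make that chain explicit, here’s a deliberately simplified sketch; the endpoint, the barcode and the price table are hypothetical, and this is not any real POS software or the actual tax-service API, just the shape of the dependency:

```python
# toy_pos.py -- a toy model of the dependency chain: no database, no price;
# no reachable fiscalisation endpoint, no sale. In a real store the price
# table is itself a remote database, i.e. yet another network dependency.
import urllib.request

PRICE_DB = {"0000000000000": 2.49}  # pretend barcode -> price (EUR) table
FISCAL_ENDPOINT = "https://tax.example.invalid/fiscalise"  # hypothetical URL

def sell(barcode: str) -> str:
    price = PRICE_DB.get(barcode)  # step 1: the item database must be up
    if price is None:
        return "unknown item - cannot charge the customer"
    try:
        # step 2: the tax service must be reachable, or the bill is illegal
        urllib.request.urlopen(FISCAL_ENDPOINT, timeout=3)
    except OSError:
        return "cannot fiscalise the bill - sale refused"
    return f"charged {price:.2f} EUR"

if __name__ == "__main__":
    print(sell("0000000000000"))  # with the endpoint down, the sale fails
```

Two dependencies, and either one failing stops the sale; the cashier’s willingness to take your cash never even enters the picture.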

Some say, run Linux, or buy a Mac. Great, but it doesn’t actually solve anything, because if every enterprise and most smaller companies run everything on Windows, and those computers all bluescreen, what are you going to connect to with your Linux PC? How does your computer even matter if you go to a store and can’t buy anything, and how does it matter if you go online and most of everything is down, because OpenSSL has been attacked by something that gets root permissions on your computer and encrypts its filesystem?

I’ve recently been thinking that the Internet isn’t so much a framework for connecting computers as a separate plane of existence. When I’m using my computer, I’m not really on an island in Croatia, I’m on the Internet. Imagine all the beings that exist in the physical world but without an Internet connection, like trees, birds, cats and so on. In order to interact with them, or even perceive them, you need to switch planes of existence, between the physical world and the Internet. However, some aspects of the physical world, like our civilization for instance, have been abstracted into the Internet to such a degree that you can’t even use them anymore if you don’t have access to all kinds of Internet-based infrastructure. This is not currently perceived as a problem, but it might become one really fast if something fundamental breaks down with the Internet.

Also, if a nefarious government or a corporation wants to lock you out of the Internet for “non-compliance”, you are really fucked, which makes it a really big sword of Damocles hanging over our heads, forcing everybody to be good and obedient slaves.

Intel SNAFU

Regarding the Intel CPU issues, I must say I expected that; I couldn’t tell which manufacturer would have the issue first, but with the arms race of who’ll be the first to make the 0 nm node die that draws a megawatt of power and needs to be cooled with liquid helium, it’s a perfectly logical outcome. If I had to guess, I’d say they made transistors out of too little material at some critical part of the die, and with thermal migration within the material at operational temperatures, the transistors basically stopped working properly.

So, basically, the real question is who actually needs a CPU of that power on a single-CPU client machine? We’re talking 24 cores, 32 threads, 5.8 GHz boost clock, 219W max. power at full turbo boost. This is sheer insanity, and it’s obvious that my ridiculous exaggeration about megawatts of power isn’t even that much more ridiculous than the actual specs of that thing. So, who even needs that? I certainly don’t. Gamers? They probably think they do, and they are likely the ones buying it. Developers? Video renderers? Graphics designers? I don’t know. But putting that many cores on one CPU, and clocking them this high, is sheer madness reminiscent of the Pentium 4 era, where they clocked them so high, and with such a die layout, that a piece of malware reportedly appeared that deliberately hammered the hottest path on the die until it overheated so much the die cracked, destroying the CPU.

I’m writing this on a Mac Studio M2 Max, which has more power than I realistically need, even for Lightroom, which is extremely demanding, and it’s idling at 38°C during the summer. It never makes a sound, it never overheats, and it’s absolutely and perfectly stable. I actually had the option of getting the model with twice the power for twice the money, which is an excellent deal, and I didn’t bother, because why would I? At some point, more power becomes a pointless exercise in dick-measuring. Sure, bigger is better, but is it, when it’s already half a meter long and as thick as a fire extinguisher? So this is the obvious consequence: Intel tried to win a dick-measuring contest with AMD and made itself a meter-long appendage, because bigger is better, and now it turns out it’s not fit for purpose? How surprising.


Thoughts about computers

I’ve been thinking about computers lately, for multiple reasons, so I’ll share a few thoughts.

It’s interesting how people tend to have weird prejudices about things based on the label. For instance, “gaming” computers are supposedly not “serious”; you shouldn’t buy that stuff if you’re doing serious work with your computer. You should get a “business” or a “workstation” machine.

That’s such incredible nonsense, because what does “gaming” even mean at this point? It’s basically “workstation” with RGB lights. If a PC or a laptop is “gaming”, it usually means a powerful graphics card, an overbuilt power supply, a high-performance cooling system, and generally high-end components designed to run at 100% load indefinitely. What’s a workstation PC? Well, it has a powerful GPU, an overbuilt power supply, high-performance cooling, and high-spec components designed to run at 100% load indefinitely; only the graphics card has drivers certified for CAD, which means it supports double-precision floating point arithmetic (a quick illustration of why that matters follows below), and it’s designed more “seriously”, which means no RGB, and the box looks normal instead of like an alien spaceship with glowing vents. It’s also more expensive, in order to milk the “serious” customers for money. Which means that if you want the greatest performance for your money, get gaming components, get a normal-looking case, turn off the RGB nonsense and there you go. The only situation where you would actually want a real workstation machine is if you’re running server workloads and actually need a server CPU. Otherwise, “gaming” translates into “great for photo and video editing and programming”.
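
Since the double-precision point sounds abstract, here’s a minimal sketch of why it matters, in plain Python (single precision is simulated by round-tripping through struct, since native Python floats are already doubles); this is the kind of accumulate-a-million-small-values arithmetic that CAD and simulation workloads do all day:

```python
# precision_drift.py -- accumulate a small step a million times in double
# and in (simulated) single precision, and compare the damage.
import struct

def to_f32(x: float) -> float:
    """Round a Python double to the nearest IEEE 754 single-precision value."""
    return struct.unpack("f", struct.pack("f", x))[0]

STEP = 0.1
double_sum = 0.0
single_sum = 0.0
for _ in range(1_000_000):
    double_sum += STEP
    single_sum = to_f32(single_sum + to_f32(STEP))

print(f"float64 sum: {double_sum:.4f}")  # within rounding noise of 100000
print(f"float32 sum: {single_sum:.4f}")  # drifts on the order of a percent
```

The double stays within rounding noise; the single-precision sum drifts by roughly a percent, which in engineering terms is the difference between a part that fits and a part that doesn’t.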

It’s interesting that everything that tends to be labelled as “business” tends to be shit. That’s because a computer for “business” is the cheapest possible piece of crap that will still work, so the penny-pinchers who buy equipment for the general staff slaving their lives away in cubicles for a meager wage can feel good knowing they spent the least possible amount of money on the “workforce”. That’s never the computer the boss is getting for himself. He’s getting a 16” MacBook Pro. He’s certainly not getting the “business” model. You don’t get a business-grade computer if you’re doing business; you get it if you are considered to be equipment required for doing business, and you need to be cheap.

There’s also the question whether to buy a Mac or a “PC”, as if somehow a Mac is not a PC. Let me write down my own experience. I used to write books on IBM ThinkPad T41 laptops running Linux. The keyboards were great, and the screen had lots of vertical space, being 4:3 ratio, so that worked fine, but they tended to die on me, because I used them on my lap, and I didn’t have air conditioning in the room, so they overheated during the summer. The motherboards would die, so I would just get another used T41 (they were several generations obsolete and dirt cheap at that point), put in my hard drive and continue writing.

At some point, after I went through two IBMs in two years, I decided I’d had enough of that shit and got a 13” MacBook Air. Now, that thing was indestructible, a proverbial cockroach that would survive a nuclear war. I had to retire it because it had only 2GB of RAM and became unbearably slow after five or so years of use, and I traded it in for some new piece of hardware, but there’s obviously a reason why so many people buy MacBooks, and it’s not because they are stupid and buy “overpriced junk”. If anything, the old ThinkPads were junk compared to the MacBook. I replaced the Air with a mid-2015 15” Pro, which is also a cockroach. I’m using it to write this; it’s 8 years old, I more or less retired it a few years ago, and it still works just fine. The screen, touchpad and keyboard are still great, but it’s significantly slower than the modern machines, so I wouldn’t do serious heavy lifting on it; for all the normal tasks, though, it’s just fine. The only interventions I made were to replace a bloated battery when it was 5 years old, and to swap the 256GB SSD for a 1TB Samsung.

So, my answer to “why would you buy a Mac” is “because I want it to work reliably and well until it’s a million generations obsolete and I want to replace it anyway”. It doesn’t just die, the user interface is great, it’s usually among the fastest machines you can get, and considering how well it works and how long it lasts, it’s dirt cheap. The only exceptions are the generations with the Touch Bar and the butterfly keyboard. They were shit, and everybody who got one regrets their decision.

It’s not that I have some general recommendation, such as “just get a Mac” or “just get a gaming machine”. In fact, it is my experience that computers today are so good that you really have many good options, but only if you avoid the “economy” and “business” stuff, which is the label for junk made of obsolete components and sold to businesses at clearance prices.

Democratic technology

I have a weak spot for “democratic technology”, meaning that you can be a kid with very little or no money, and still be able to buy it and use it to learn and start getting your initial experience making money. As a teenager, I had posters on my bedroom wall of the HP Integral PC, the Compaq Portable 386, the IBM PS/2 and the HP-28C calculator, when other boys had posters of cars, girls and football stars; you can see where my priorities were. 🙂

I still have a weak spot for good quality pencils, calculators and computers, into which I projected almost magical qualities of compensating for my limitations. The irony is, I ended up in a place where I do almost all the heavy lifting in my head, and use computers as glorified typewriters, but I digress. 🙂

So, what’s the “democratic technology”? It’s basically the stuff you can actually afford, and still do all the work you would otherwise do on the hardware you dream of but can’t buy.

Today, democratic technology is a cheap Xiaomi smartphone, a desktop computer you built from the cheapest new or used components, running unlicensed Windows or Linux, or a laptop along similar lines, all bought with pocket money, giving you access to the stuff on the Internet that allows you to learn. Interestingly, it’s very rarely the stuff that’s designed and marketed as “democratic”, such as the Raspberry Pi. I would actually not recommend that as a computer for kids, because it’s seriously underpowered and not inexpensive enough to be worth the effort. You can actually get a used i5 laptop for on the order of 100 EUR, which would be greatly preferred. This would be something along the lines of a ThinkPad X240 with an i5-4300U, which will run either Linux or Windows, and you can install an SSD and add more RAM if required. Such a machine can be used to surf the web, learn Python, PHP or C, and basically get you started when you are very low on money and very high on motivation. Curiously, laptops seem to be a cheaper solution than desktops, when you add everything up.

Similar examples can be found in other areas as well; photography, for instance. You can buy a ten-year-old digital SLR with a lens or two, get cheap macro extension tubes from eBay, use some free raw converter such as RawTherapee, and that will get you started. Heck, you can use a smartphone to learn composition if you can’t afford a proper camera, but I’ve seen things such as a Canon 30D with a kit lens for on the order of 50 EUR, and that would be a very good way to get started. What can you do with an 8MP camera and a kit lens? If you can add a cheap tripod, you can do serious macro work.

With only a smartphone, you can still get remarkably good results.

Sure, I wouldn’t attempt large magnifications from phone images, but we’re talking about learning here; in that phase, you could take excellent equipment and produce shit, because you don’t yet know how to pick light, don’t understand dynamic range, don’t know how to compose, or even how to hold the camera still. A phone will do for composition, colour and dynamic range; an old dSLR with a tripod will allow you to learn everything else.

It’s not my field of expertise, but with a piece of paper and colour pencils you can learn how to draw, and then use a cheap flatbed scanner to digitise your drawings and use them as illustrations on websites you design, to give your work a unique look. Or you can learn how to draw in some digital tool, such as Inkscape.

Sure, you need to compensate for technical disadvantages with skill and talent, but the “democratic” part of my point is that you don’t “pay to win”; people usually get the fancy gear only after they’ve gotten rich using the basic stuff everybody has, or can get. If you have something meaningful to say, you don’t need a MacBook Air to write it down; any computer will do. Heck, a smartphone with a Bluetooth keyboard will allow you to write books and articles if you don’t have anything else, although I wouldn’t recommend it if you have options. However, after you’ve been doing that for long enough, you’ll probably start healing the frustrations caused by inadequate gear the way I’m doing. 🙂 Sure, I could do it on a 386. Been there, done that, didn’t really get a t-shirt, but I did get trauma induced by having to delete the unnecessary files such as moricons.dll and the *.wav sounds from a Win3.11 installation in order to fit my code builds on an 85MB HDD, and by having to edit Ventura Publisher rich-text files in a DOS text editor because the machine simply didn’t have enough RAM or CPU to edit the tables in the GEM GUI. Sure, it can be done, and you can get started and dig yourself out of the pit with very few resources, compensating for the drawbacks of your tools with some ingenuity. However, fuck me if I’ll do it anymore, now that I have the money. 🙂

The Mac Pro problem

Apple seems to have painted themselves into a corner by being perfectly reasonable. You see, they recently updated the Mac Pro tower, and it turned out to be the Mac Studio Ultra in a bigger box. The PCIe slots inside are barely of any use, since GPUs are not supported, and the machine is inherently un-expandable; you can’t add more RAM, for instance. So, while this machine is extremely powerful, it’s by no means an open-ended system you can expand to fit your extreme workflow, such as trying to do fluid dynamics simulations or something. So, basically, the biggest Mac is great at doing normal Mac things, such as movies and audio, but it doesn’t extend into high-end scientific or engineering workflows. Is that a problem? I don’t think so, and let me explain why.

What Apple did right was design a range of machines that covers their professional and prosumer user base: people who make videos in Final Cut, make music in Logic Pro, or edit photos in Lightroom and Photoshop. They made an excellent laptop range that covers everything from writing articles and checking Twitter in Starbucks, to the most complex software design, or audio, video and photography work. Then they made a desktop range that covers all of that and extends deeply into professional studio use, and I don’t think they left even a fraction of their traditional or potential user base uncovered. Heck, even I got a desktop Mac, which I had resisted so far because everything they had either came built into a display, overheated under load, or cost extreme money while offering no benefits over what I already had. The drawback of the current Mac lineup is that it doesn’t extend endlessly, and some, like Linus, will whine over that, while I might offer a more constructive approach.

You see, there’s only so much you can expect a desktop machine to do, and the M2 Ultra actually exceeds this expectation by far. In order to exceed this ceiling, Apple would have to completely redesign a system that’s perfectly good for 99.9999% of their user base, in order to cater to the affectations of those who actually need a supercomputing cluster but aren’t technologically savvy enough to realize it. For Apple to artificially meet this almost non-existent demand, they would have to create a completely new architecture centered around a passive-bus motherboard and Max/Ultra daughterboard cards that connect over this bus the way two Max chips are edge-connected to form the Ultra; in essence, combine that UltraFusion-style interconnect with PCIe and make it work. The engineering overhead would be immense, and they would only sell a handful of those machines, because the demand at such a high end of computational needs is very slim.

Alternatively, those who actually want to perform complex computations should make their software cluster-friendly: basically, dump workload packages onto a NAS, have the cluster nodes run workload processors that pop packages from the workload queue, process them and push them onto a result stack, and when that is done, have some script go through the results, check their integrity and assemble it all into the end product (a minimal sketch of such a worker follows below). Then you could have multiple Mac Mini or Studio devices connected to the LAN, processing the work you give them, and you could extend this infinitely; the demands on an individual node would be such that you could optimize for cost-effectiveness by buying the base Mac Mini or Mini Pro devices by the truckload. This kind of work is usually done on Linux nodes, but a modern Mac is so power-efficient that it has genuine advantages for cluster applications.
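
For illustration, here’s a minimal sketch of such a workload processor, assuming a shared NAS mount at a hypothetical /mnt/nas and JSON job files; the process() body is a placeholder for the actual work, and the integrity-checking and assembly script is left out:

```python
# nas_queue_worker.py -- each cluster node runs this loop; a job is claimed
# by renaming its file into the "claimed" directory, and rename is atomic on
# a POSIX filesystem (the usual way to hand out work over a shared mount),
# so two nodes can never grab the same package.
import json
import os
import socket
import time
from pathlib import Path

QUEUE = Path("/mnt/nas/queue")      # hypothetical NAS mount point
CLAIMED = Path("/mnt/nas/claimed")
RESULTS = Path("/mnt/nas/results")

def process(job: dict) -> dict:
    """Placeholder for the real workload (render a frame, run a sim step...)."""
    return {"input": job, "output": sum(job.get("values", []))}

def pop_job() -> Path | None:
    """Try to claim one job file; return its new path, or None if none left."""
    for candidate in sorted(QUEUE.glob("*.json")):
        claimed = CLAIMED / f"{socket.gethostname()}-{candidate.name}"
        try:
            os.rename(candidate, claimed)   # atomic claim
            return claimed
        except OSError:
            continue                        # another node beat us to it
    return None

def main():
    for d in (QUEUE, CLAIMED, RESULTS):
        d.mkdir(parents=True, exist_ok=True)
    while True:
        job_file = pop_job()
        if job_file is None:
            time.sleep(5)                   # queue empty; wait for more work
            continue
        result = process(json.loads(job_file.read_text()))
        (RESULTS / job_file.name).write_text(json.dumps(result))
        job_file.unlink()                   # done; assembly script reads RESULTS

if __name__ == "__main__":
    main()
```

The whole trick is the atomic rename; everything else is plumbing.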

So, basically, the Mac lineup is truly powerful enough, and it allows for open-ended design for those who need endless computational power; it’s just not open-ended inside a single chassis, which is only a problem if you have unrealistic expectations, or a workflow that is poorly designed because it doesn’t break the job down into manageable packages. Honestly, I could write such a cluster setup myself in less time than it would take me to start whining about the lack of power and tasks taking too long.