Intel SNAFU

Regarding the Intel CPU issues, I must say I expected that; I couldn’t tell which manufacturer would hit the issue first, but with the arms race of who’ll be the first to make the 0 nm node die that draws a megawatt of power and needs to be cooled with liquid helium, it’s a perfectly logical outcome. If I had to guess, I’d say they made transistors out of too little material at some critical part of the die, and with thermal migration within the material at operational temperatures, the transistors basically stopped working properly.

So, basically, the real question is who actually needs a CPU of that power on a single-CPU client machine? We’re talking 24 cores, 32 threads, 5.8 GHz boost clock, 219W max. power at full turbo boost. This is sheer insanity, and it’s obvious that my ridiculous exaggeration about megawatts of power isn’t even that much more ridiculous than the actual specs of that thing. So, who even needs that? I certainly don’t. Gamers? They probably think they do, and they are likely the ones buying it. Developers? Video renderers? Graphics designers? I don’t know. But putting that many cores on one CPU, and clocking them this high, is sheer madness reminiscent of the Pentium IV era, where they clocked them so high, and with such a die layout, that a piece of malware appeared that deliberately iterated along the central path until it overheated so much the die cracked, destroying the CPU.

I’m writing this on a Mac Studio M2 Max, which has more power than I realistically need, even for Lightroom, which is extremely demanding, and it’s idling at 38°C during the summer. It never makes a sound, it never overheats, and it’s absolutely and perfectly stable. I actually had the option of getting the model with twice the power for twice the money, which is an excellent deal, and I didn’t bother, because why would I? At some point, more power becomes a pointless exercise in dick-measuring. Sure, bigger is better, but is it, when it’s already half a meter long and as thick as a fire extinguisher? So this is the obvious consequence – Intel tried to win a dick-measuring contest with AMD and made itself a meter-long appendage, because bigger is better, and now it turns out it’s not fit for purpose? How surprising.


4 thoughts on “Intel SNAFU”

  1. I mean, don't get me wrong, I like fast computers, but if they managed to make even me get off that train and say "I don't want what you guys are smoking", then something somewhere went terribly wrong.

  2. I am old enough to remember the Pentium IV era and the gigahertz race, which AMD lost, but then AMD changed the game with a new architecture that obliterated the P4 so badly that no one cared about gigahertz any more.

    What's different this time is that the entire x86 architecture looks quite obsolete at this point, so running another gigahertz race makes absolutely no sense whatsoever, because even gamers are eyeing ARM, and it is only a matter of time before AAA titles start coming out ARM-first.

    • I haven't written any Windows code since the Win16 API era, but isn't it kind of normal to compile Windows apps as .NET bytecode, which should be platform-independent, as long as DirectX calls function as messages, and DirectX comes with the OS anyway, so it shouldn't matter how the call is actually implemented underneath? What I want to say is, .NET should be the universal binary that's supposed to run on anything, unless you intentionally build your application not to run on everything?

      • Everything except the GUI is platform-independent (x86, ARM, Windows, Linux, macOS, whatever), and current .NET code works on ARM just fine (you can even select whether you want it to be portable or precompiled for a specific platform).

        The GUI part is hard because of the vastly different approaches on different platforms; however, I am talking here about EVERYTHING, not Windows only.

        If we stick to Windows only, yes, it works just as you described.
        Microsoft ported everything to ARM, including Visual Studio, which works surprisingly well – the kind of thing you usually get on macOS (Rosetta 2, for instance), but a big surprise for Windows, which is usually all kinds of mess and crashes.

        I tried Windows 11 ARM in an emulator and started my Windows application that runs on .NET 8, and it just works – a truly surprising, crash-free experience 🙂

        Of course, it won't work on macOS or Linux, because there are no official translation layers from DirectX to Metal or OpenGL (I haven't checked in a while, though).

        However, a web application running on .NET 8 Blazor can be hosted on just about any platform.
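        The portable-vs-precompiled choice mentioned above maps onto the `dotnet publish` command and its runtime identifiers (RIDs); as a rough sketch (the flags and RIDs are real .NET CLI options, but `MyApp.csproj` is just a placeholder project name):

        ```shell
        # Framework-dependent (portable) publish: one output that runs on any
        # platform where a matching .NET runtime is already installed.
        dotnet publish MyApp.csproj -c Release

        # Self-contained publish precompiled for one specific platform,
        # selected by a runtime identifier (RID), e.g. Windows on ARM64:
        dotnet publish MyApp.csproj -c Release -r win-arm64 --self-contained true

        # Other common RIDs: win-x64, linux-arm64, osx-arm64
        ```

        The self-contained variant bundles the runtime with the app, which is why it has to be tied to a concrete RID, while the portable variant stays platform-neutral IL.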
