Computer issues

I’ve had quite a few computer issues lately, mostly with my main desktop PC (a Ryzen 5900X/Nvidia 1080 Ti gaming/workstation system). Most of them were stupid things, like something misconfigured in UEFI/BIOS after an update that made the PC wake from sleep randomly, or Windows “fast startup” preventing the computer from sleeping, but the worst are the stability issues – random crashes and BSODs – and I’m afraid everything points to the CPU, which is either overheating or showing stability issues due to heat damage. The root cause seems to be an AIO water-cooling unit that has a nasty habit of leaving a coin-sized spot in the middle of the CPU contact-free, probably because something deforms when the pump is screwed onto the CPU too tightly, or something of the sort. In any case, the computer crashes randomly, anywhere from once in a few days to several times a day, and this makes it just unreliable enough that I can’t use it for anything serious. The fact that Windows 11 looks more like a perpetually self-updating machine than anything else also contributed to my annoyance, and the machine is so power-inefficient that it significantly heats the room in summer, forcing the AC to work harder and thus waste even more power.

While this machine will eventually get fixed when the parts arrive, I decided to do a side-grade (a word for something that’s neither an upgrade nor a downgrade, but a replacement with something different yet equally powerful) and replace it as my main computer with a Mac Studio.

I was quite busy with the transition. First I unsuccessfully tried to resolve the Windows machine’s stability issues by formatting and reinstalling the OS and all the apps. Then I had more stupid Windows issues when I moved the NVMe drive from one socket to another and Windows refused to boot, so I had to fix that from the recovery console in the most obscure ways possible. Then the Mac arrived, and I eventually had to do a clean install because Lightroom didn’t want to work when I restored from another computer’s backup drive, then all kinds of obscure things had to be installed, and I basically spent several weeks dealing with computers and the pointless issues they caused. For instance, there is a bug in the Apple Mail application on macOS Ventura; it is widely reported on the web, but nobody in Cupertino seems to be working on it, probably because they don’t think it exists, since it doesn’t show up if you upgraded from an earlier OS version. When you do a clean install, as I did, the junk mail controls and custom filters don’t get saved and are lost on app restart. I fixed the problem by copying 3 files from my laptop:

~/Library/Mail/V10/MailData/RulesActiveState.plist
./SyncedRules.plist
./UnsyncedRules.plist
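In case anyone needs the same fix, here is a minimal sketch of the copy, assuming the other Mac’s Mail data is reachable somewhere (the mount point below is hypothetical – adjust it to wherever your source files actually are):

```shell
# Restore Mail's junk-mail controls and custom rules from another machine.
# SRC is a hypothetical mount point; change it to the real location.
SRC="/Volumes/laptop/Library/Mail/V10/MailData"
DST="$HOME/Library/Mail/V10/MailData"

for f in RulesActiveState.plist SyncedRules.plist UnsyncedRules.plist; do
  if [ -e "$SRC/$f" ]; then
    cp "$SRC/$f" "$DST/$f"      # overwrite the local (broken) copy
  else
    echo "not found: $SRC/$f" >&2
  fi
done
```

Quit Mail before copying and restart it afterwards; the rules should then survive an app restart.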

This is a trivial issue, but each such thing takes an hour or two to diagnose and fix, and I feel as if I’ve been reduced to fixing pointless computer shit while doing very little of the productive work the computers are meant for. I also need to maintain quite a bit of IT skill just to keep everything running. Moving to a Mac might reduce at least some of this pointless hassle, because I don’t think it can be removed outright without abandoning the whole thing.

In the meantime, the Mac is silent, blows out cool air under load, uses almost no electricity in normal work, updates much less often than the Windows box, and is as blazing fast as that 12-core Ryzen monstrosity with water cooling. Also, the DAC on its headphone jack is absolutely stellar, audibly better than the Schiit Modi 3 I’ve been using to connect the Windows box to the NAD.

The second great thing about the Mac is the ports – there are lots of them, and they are high speed, which matters, because the Windows machine only had one USB A 3.1 Gen 2 port and one USB C connector of the same speed; all the other ports are slower. On the Mac, the slowest USB ports are USB 3.1 Gen 2, in both USB A and C varieties, and the fast ports are Thunderbolt 4, allowing me to connect external NVMe drives at speeds equivalent to the internal drive. I already have a 4TB NVMe and an enclosure for it in the mail, and that’s going to be the storage drive for Lightroom.

The drawback is that all upgrades are necessarily external, because there is no internal upgrade ability. I would expect to be able to at least change or add NVMe drives and RAM, but no, this thing is as upgradable as an iPhone. That’s a real shame and a continuation of a long line of steps backwards Apple has been taking, from fully upgradable machines where you could replace hard drives and RAM modules, to this. To be honest, though, upgrades are no longer much of a concern in the computer industry these days: you can take a five or even ten year old computer and it will run fine. Just remember that ten years ago it was 2013, and the computer of the day was this. I have a mid-2015 version of that, and it’s noticeably slower than modern hardware, but it still runs everything I throw at it just fine.
Basically, I replaced it long before it became defunct, and that’s the thing – you can take an 8GB RAM, 256GB SSD machine from ten years ago and it will still run a modern OS and modern apps, and upgrading the storage and RAM won’t really address the main reasons why you might want to replace it. There were times when you had to constantly upgrade your machine just to keep up, and the upgrades were truly huge and relevant every six months or so. This is no longer the case, and a modern high-end machine might not actually need internal upgrades in its expected life cycle of five years.

Whether this civilisation will last that long is a much more important question.

Microwave injury

On Tue, 18 Oct 2022, I woke up with something that resembled a bad sinus headache with vertigo and weakness. It turned out that everybody I asked had similar symptoms, but I seemed to be hit the hardest. I concluded that those symptoms have two most likely causes. One is a very strong, generic broad-band astral impact upon the pranic/physical boundary. The other is a very strong microwave source; I have experimented with microwaves of various frequencies, and their effects vary from near-imperceptible to very strong interference with the physical tissues that interface with the astral. It is very difficult to differentiate between the two because they strike at the same layer, but from opposite sides, and if the astral strike carries no information, only an energetic impact, the two would be indistinguishable. Today, Robin told me that he didn’t perceive anything in Australia at that time, and he would most certainly perceive an astral impact of this magnitude. If it were a microwave event, however, he wouldn’t perceive anything, as microwaves don’t propagate well over the horizon or through rock. This makes me put much greater Bayesian weight on the microwave option; most likely, a military radar was turned to high-power mode somewhere in Europe, quite possibly inside or close to Croatia, during the NATO nuclear exercises. It is not unreasonable to hypothesise that they turned the radars to a high-power mode that would let them detect small stealthy objects, such as a stealthy nuclear-tipped cruise missile, or see stealthy fighter-bombers at a greater than usual distance.

The problem is that this event left me with physical consequences similar to those of a strong concussion or a mild stroke, and it was strongly felt by a number of people who wouldn’t be expected to feel anything subtle so strongly. This implies that the power level of this thing was almost lethal to humans, leaving unknown levels of permanent damage, similar to the military high-power sonars that cause inner-ear bleeding in whales and dolphins and make them strand themselves and die.

The only way I know of to protect oneself from such a microwave event is to seek shelter in an underground garage, basement or similar facility where you would normally have no cellphone and wifi coverage, or inside a grounded Faraday cage. I have no such shelter here on Hvar, so I was basically right in the open for this one.

The ever tightening grip

I watched this video last night:

Basically, with Windows 10 it was “recommended” that you turn on the UEFI encryption keys and the “trusted platform” stuff. With Windows 11, it became a precondition for installation. Now they are planning to build something into the CPU itself, so that you basically can’t run an OS that hasn’t been approved by Microsoft. What the author of the video didn’t say, and what I find glaringly obvious, is that this isn’t about Microsoft, it’s about America. They want to make sure that “their technology” can’t be used by anyone on their sanctioned-entity list, because, if you pay attention, you will see that Microsoft, Apple, Google and similar extensions of the NSA routinely sanction countries that refuse to bend over to America, by the principle of “if you refuse to be our slaves we’ll take our toys away”. Let’s say that Macs and iPhones outright refuse to work in any truly sovereign country. You take a thumb drive with Linux, install it, set it up, and take a slight hit in comfort and productivity, because the open-source stuff isn’t written by people whose pay check depends on all the details being polished. However, you will still get the job done, and in some aspects the Linux way of doing things is actually better. I was actually quite productive on Linux when I had it on all my personal systems; the only exception is photography, because nothing on Linux is even in the same decade as Lightroom. But would I manage? Oh yes. And if Windows/Mac didn’t exist as an alternative, I would venture a guess that excellent Russian and Chinese commercial software would start appearing for Linux in short order. So, things would not only work, but also improve with time.

However, if the Americans succeed in putting this “trusted platform” shit in the CPU, it means that you won’t be able to run Linux or BSD on any American-designed hardware anywhere in the free world (because that’s what the “sanctioned entities list” really is). It’s not unexpected, and I actually think they are kind of late with this, but if anyone thinks Microsoft could lobby to put this stuff in Intel and AMD CPU designs without not only the approval but the direct order of the NSA (and probably other deep-state structures as well), I have real estate on the Moon to sell you.

So, what does this mean in practice? Is it worrisome enough to warrant an immediate transition to non-American-designed computer architecture and non-American OS? Yes, if you’re a sovereign state. For individuals, it’s a more complicated matter. It’s worrisome enough for me to warrant building and maintaining redundant systems I can use in case this becomes a problem.

Smartphone to dumbphone?

For some reason I got a few videos about switching from smartphone to dumbphone and back in my YouTube stream, so I actually checked them out, because the idea seemed bizarre. It turned out that some people are so overwhelmed by a smartphone that they just can’t leave it alone; they constantly find things to do with it, from social media to all the music and stuff you can listen to on it, until their entire lives get absorbed in it. The reason why I find it bizarre is that my iPhone sits somewhere on the desk all day and I use it only for internet banking (because Revolut, for instance, doesn’t have a desktop app, so I have to use a mobile device) or when someone calls me; basically, when I’m home, I either don’t use it at all, or I use it for very specific things, the way I use a toothbrush or a coffee cup. When I’m going out, I put it in my pocket and basically forget about it, unless I want to check something. I’m probably the least typical smartphone user; I don’t use social media at all, and I don’t listen to music or watch videos on my phone, but I do actually need a smartphone, because when I need it, it’s for checking some website or chat or map or things like that; my “screen on” time on the phone is perhaps five minutes a day, if even that. Still, I do kind of understand the problem people are having with them; it’s just that I get stuck on YouTube instead, watching hours of political, tech or historical videos, and it’s quite easy to lose a whole day like that. Still, I don’t consider it a loss; I want to keep informed in order to understand what’s going on, and analysis of the kind I’m doing requires keeping tabs on multiple data streams, but I occasionally find myself watching something that’s so far off-tangent that I wonder how I got there in the first place.

In any case, I think I’ve been doing it long enough that I can offer advice on how to manage addictive and time-consuming things on the Internet.

First, you need to be focused, as opposed to scatter-brained, and disciplined, in the sense of being in charge and not just clicking on shit that’s in front of you.

Second, you need to take breaks – take a long walk, or exercise, or something else that has nothing to do with either computers or the Internet. It doesn’t count if you use your phone in any way while doing it.

Third, no using the phone in the car. I can’t even tell you how annoying I find the people who drive while doing something on their phones, not to mention that it’s dangerous.

Fourth, when you’re with someone, talk to them. Don’t even touch the phone.

Fifth, do specific things, and when you’re done, let go of the phone, or the computer. Don’t fidget with it because you’ll always find something on it that will preoccupy your attention and waste your time. It’s a tool, not your connection with God.

Sixth, use an ad-blocker and similar tools to de-clutter your screen. Don’t watch ads, don’t watch useless “entertaining” garbage, and avoid live chats in favor of email and forums. Avoid Internet versions of “hanging out” – if you want to hang out, do it with friends in real life. Avoid functionality that keeps you “tethered”, in the sense that anyone can “ping” you at any time; that just keeps you plugged in and stressed. Turn the chat off unless you actually have something of importance to communicate, or expect to find something of importance there; in any case it’s best to write an e-mail. Chats are superficial, addictive, a waste of time, and for the most part disrespectful of other people’s limits and time, and if someone wants to keep you tethered, it indicates an insecure personality. Avoid. Also, don’t ping others with useless shit – nobody really cares what you ate, or that you had to take a shit. Communicate important ideas, and if you don’t have any, shut up and read some books, and eventually that will change.

That’s basically it. If you’re scatterbrained, shallow and have an addictive personality, technology will certainly give you enough rope to hang yourself, but it isn’t an iPhone problem, it’s a dumbass problem.

Hardware upgrades

Every time Apple, Intel, AMD or Nvidia launches new gadgets, I get a million fake-enthusiastic “reviews” (in fact paid ads produced by youtubers who whore themselves out to the hardware manufacturers) in my recommended videos, and they are always layered – first comes “oh, a new gadgety thingy, how wonderful”, then “oh, it overheats, is underpowered, there are flaws”, and finally “why you don’t need it and should stick with last year’s model”, until they decide they’ve milked the product for all it’s worth and shift attention to the next thing. I find it boring and predictable in direct proportion to the faked enthusiasm of the “reviewers”, who are trying very hard to convince us that we live in some parallel universe where good smartphones and computers are something new and unheard of, while the truth of the matter is that the entire consumer-electronics industry peaked quite a while ago and we’re seeing very small incremental improvements. I recently ran an experiment where I took several pieces of “obsolete hardware” from boxes and drawers – a 6-year-old CPU and motherboard with integrated graphics, an old 120GB SSD, a 4-year-old Android phone and so on, because someone in the family always has an old but perfectly functional device they upgraded from – and guess what, it’s all fine. I turned the PC into a local staging server where I test for service and dependency compatibility before I deploy things on the web, and I turned the old Android phone into a backup device that I can switch to in emergencies.

The way I see it, a piece of equipment goes through several evolutionary phases. First it’s promising but flawed, every new iteration brings great improvements, and one upgrades immediately after a new product is released. Then it reaches maturity, where it’s good enough for what you need, and new iterations are better in this or that way, but not enough to warrant an upgrade. The third phase is when the manufacturers introduce changes in design or control layout, but the functionality of the device stays the same, or is even reduced to save on manufacturing cost; after that point, all further “improvements” basically consist of finding out what they can remove, what they can make cheaper, or which intentional design flaws they can introduce to make the device break down more quickly and force you to buy a replacement.

I remember times when a 6-month-old computer or digital camera was considered obsolete, because things were progressing that quickly. Now we are at the point where my “new” camera is 7 years old, my “old” camera is 17 years old, both are still in use, and their picture quality is excellent. My mid-2015 15” MacBook Pro is still perfectly functional, and I could use it as my primary computer with only a slight reduction in speed compared to the new one I use. That’s a 7-year-old computer, and it’s fine.

That logic doesn’t hold forever, though. I would hardly use a Pentium II-233 today, or one of the early smartphones; those are junk, better recycled for raw materials than used. Also, I wouldn’t say there have been no improvements in the last 7 years; however, I recently replaced my younger son’s perfectly good Xiaomi Mi 8 with an 11T Pro, and joked that he now has the typical iPhone user experience, where you buy the new expensive one with better specs, migrate your stuff to it, everything works exactly the same, and you feel like a fool for wasting money replacing a perfectly good thing. That’s where we are with computers, too; the last upgrade cycle I did was particularly meaningless, because I replaced stuff that worked fine with stuff that also worked fine, albeit noticeably faster in 5% of cases.

There’s a reason why my most recent tech purchases were battery-powered lawn mowers: I can actually do things with them that I couldn’t before. With computers and phones, well, it’s nice that they have a new shiny design and color scheme and all, but I’ll pass.