Apple M1 chip

Here’s my take on it, based on what we know so far.

It’s a modified mobile APU, which means it has both the strengths and the drawbacks of Apple’s other mobile chips. The strengths are obvious from the reviews: it delivers great computational power while using very little energy.

The main drawback is that the peripheral connections seem to be an afterthought. It appears to have two Thunderbolt ports, but if you read carefully it turns out that once you connect one monitor to a USB-C port, the other USB-C port can’t drive a second monitor, and it’s questionable how much you can connect to it at all, because although they call it Thunderbolt, it doesn’t work with an eGPU, and it’s questionable how many PCIe lanes it exposes, if any. Also, the connected monitors seem to mostly work at 30 Hz, with 60 Hz support being very… let’s say “selective”. Basically, it’s an iPad Pro chip slightly tweaked to allow for peripherals, but the “tweak” in question wasn’t serious enough to support anything even remotely resembling the level of connectivity we normally expect from desktop computers.

Also, they modified the recovery boot menu, completely removing the option that previously allowed us to boot from an external drive. This means two things. First, if your SSD dies, you’ll need to replace the motherboard; you can’t just plug in a USB or Thunderbolt SSD, install the OS there, and continue using the computer. Second, no more installing Linux on a Mac. We already know there’ll be no Boot Camp Windows. They completely locked the hardware in. If they lock the software in as well, a Mac will become a very nicely decorated prison cell.

Also, since the RAM is on the chip itself, there’s no RAM expansion. This is a step beyond soldering the RAM onto the motherboard, and we’ve only seen this level of non-expandability on smartphones and tablets. One would expect it from an ultrabook, but on a Mac Mini or a Macbook Pro, the inability to change the SSD or upgrade the RAM is terrible news. Those systems are so closed off they feel claustrophobic – no RAM upgrade, no SSD upgrade, peripherals reduced to one monitor, with the other USB-C port switching to a low-capacity mode when that happens, which means the bus is quite constrained and, for lack of a better word, anemic.

Having all that in mind, it’s questionable what one can do with all that computational power. It reminds me of my iPhone, which has as much power as a desktop computer, but you are so constrained by the limitations of the form factor, the lack of peripherals, and the limitations of the OS, that it remains just a phone that does trivial phone tasks really, really quickly. For professional use, where you have to consider connecting two external monitors, a fast external storage drive, LAN and similar things – which is what you would actually use to do non-trivial work – the Intel machines are vastly superior, and my recommendation would be to look into the 16″ Macbook Pro and the new 2020 iMac, which are both excellent. The new machines are good for applications where battery life is the absolute priority, and where you need extreme computational power with little or no demand for peripheral connectivity.

Technical issues

I’ve been busy with technical stuff lately; not only did I have to convert the forum’s database to be usable by the new software, and write or adapt parts of the software to fit the needs of the community using it, but I also had a disproportionate number of hardware failures this year. Most of them were bloated or weakened Li-ion batteries in phones and laptops, but we also had two NVMe drive failures, including one in recent days, which was hard to diagnose because I initially suspected a driver or a faulty Windows update. As the evidence let me narrow it down, I started to suspect the Samsung system drive, and my confidence in that assessment grew to the point where I preventatively replaced it without waiting for it to fail completely and force me to rebuild the system from the ground up. And yes, since I cloned and replaced the drive, I’ve had no more system freezes.

As with the two drives that failed before (Mihael’s Adata SATA drive, and Biljana’s Samsung PCIe drive in the 13″ Macbook Pro), it was a controller failure, which produces symptoms so similar that this time I was able to diagnose it before complete failure. All in all, I’ve had an increased number of drive failures since we moved from HDD to SSD technology, and literally none of them were due to NAND wear, which everybody feared initially; it’s always the controller that goes, and that’s the worst-case scenario, because if you don’t catch it in time, it means complete and irrecoverable data loss. However, only Mihael’s drive went all the way, because we were late in reacting to it malfunctioning for days, and likely weeks. With Biljana’s laptop, I already had some experience with SSD controller failure, so I replaced her drive preventatively and all the symptoms ceased, and I did the same with my own system drive on the main computer. Basically, the symptoms look very much as if the system bus is clogging up and system events are not getting through. When that happens, I start to suspect the system SSD controller.

This, of course, puts Apple’s practice of soldering SSD components directly onto the motherboard, so that the SSD can’t be replaced, into perspective. That’s just asking for trouble, because it turns what could be a simple and straightforward process of “back the old drive up, replace it with a new one, and restore from backup” into a motherboard write-off and replacement, and those are expensive. Sure, it can be a welcome excuse for replacing your obsolete computer with a new, more modern one, but of the three SSD failures I had recently, only one computer was obsolete and ready to be replaced; the other two were perfectly good machines that required only a system drive replacement. I am seriously not a fan of having the SSD and RAM soldered onto motherboards, because those are the two main things that have to be either upgraded or replaced due to failure, and not allowing for that just screams “planned obsolescence”. It’s like not allowing the GPU to be replaced in a gaming PC, knowing that it’s the first thing that will need to be upgraded in order to keep the machine functional.
Sure, I have a habit of keeping old hardware in use until it becomes completely useless, which means I could occasionally use some sort of a push to buy a new, more capable system, but on the other hand, if I see nothing wrong with the system I’m using, in the sense that it does everything instantly and isn’t causing me any trouble, why should I be forced by the manufacturer to throw it away just because some component died prematurely? The system I’m using plays Witcher 3 at 4K, 60 FPS, on ultra. It’s not a slow and decrepit system by any stretch of the imagination. If I had to replace the whole computer just because the system drive failed, I would be really pissed, and that’s exactly what would have happened with Apple, if I used one of their fancy-looking machines with the SSD soldered on. The only one of their current machines that’s actually designed properly is the new Mac Pro, but that one is so expensive it makes no sense to buy it, unless you hate your money and want to see it burn. Someone will say that you have to pay for quality, but that’s really bullshit, since they use the same Samsung NVMe drives I buy off the shelf to build my own systems, and based on my experience the drives they use are exactly as likely to fail as any other Samsung drive. So, sure, you can solder it onto the motherboard, but then I want a 5-year warranty on the motherboard with instant replacement in case of component failure, no weaseling out.
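For what it’s worth, the way I try to catch these controller failures before they go terminal is by glancing at the drive’s SMART data as soon as the symptoms appear. Here is a minimal sketch of that check, assuming smartmontools is installed; the device path is a hypothetical example, not necessarily what your system uses.

```python
#!/usr/bin/env python3
"""Quick NVMe health check; a sketch assuming smartmontools is installed."""
import subprocess
import sys

DEVICE = "/dev/nvme0"  # hypothetical device path; `smartctl --scan` lists the real ones


def smartctl(flag: str) -> str:
    """Run smartctl with one flag and return its output (it can exit non-zero on warnings)."""
    result = subprocess.run(["smartctl", flag, DEVICE], capture_output=True, text=True)
    return result.stdout + result.stderr


if __name__ == "__main__":
    overall = smartctl("-H")   # overall health self-assessment
    details = smartctl("-A")   # wear, available spare, media/integrity error counts
    print(overall)
    print(details)
    # NAND wear is almost never the culprit; a shrinking available spare or a growing
    # media error count is a much better hint that the controller is on its way out.
    if "PASSED" not in overall and "OK" not in overall:
        sys.exit("Drive no longer reports healthy; back it up and plan a replacement.")
```

It won’t predict a controller that dies without any warning at all, but it’s cheap insurance compared to rebuilding a system from scratch.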

Minimalism

I’ve been intrigued by the concept of minimalism, which I see mentioned occasionally.

It’s not a simple issue, because although the first thing that comes up is an uncluttered living/working space and a “the fewer things the better” approach, when you dig deeper it’s not necessarily about “less is more”. If anything, it’s about reducing dependency, reducing resource drain, and reducing clutter.

However, I’ve seen people who purport to live a minimalist lifestyle, often in either a very small apartment or even a van, and if there is a trend there, it’s that they compensate for the lack of space and assets with a greater investment of time and work, basically having to rearrange things all the time just to retain a functional environment, and resorting to extensive workarounds to get things done. When I watch the people who just prefer to get things done, there’s another trend: they tend to have a large number of specialized tools, and they don’t care that one could probably do with less; no, they have just the right type of hammer or axe or chainsaw for that particular job, and it’s better. Also, there’s always an inevitable amount of clutter around them, because that’s what happens when you actually do things, but there’s always order to the madness; all the things are normally stored in very specific, often labelled places, and after you’re done with them, they are returned to their place. It’s just that you don’t return everything immediately; you leave things around you while you work, and you clean up afterwards. A certain amount of chaos obviously has to accompany the creative process, because you can either focus on what you’re doing, or you can focus on cleaning up, but not both at the same time. Sure, you can do both, but the quality of what you’re doing will suffer. For instance, when I’m writing, I couldn’t care less about the empty coffee cup or the bag of peanuts on my desk, or whether everything is aligned perpendicularly to create the illusion of order. I care about what the keyboard feels like, what the monitor feels like and what the mouse feels like, because that’s what I’m using to create. If there’s a problem with those things, it interferes with my concentration and impedes my creative process, but a certain amount of chaos might actually help, because it doesn’t impose the subconscious stress of trying to keep things orderly all the time.

Also, I could have only one computer, and that would be the minimalist way of doing things, but I don’t; I have multiple computers, specialized for what I use them for. The desktop machine is completely silent, it’s comfortable to use, and it’s powerful. It’s something that just gets things done, and it can cool itself easily even if I push it at 100% for days. Then there’s the 15” laptop, which becomes the primary computer when I’m on vacation. It can do everything the desktop machine can do, except gaming, and you could ask why I don’t just use that for everything, because I could plug peripherals into it and it would do just fine as a desktop machine – but it would overheat, it would be noisier, and it wouldn’t last. So, I’m already at two computers, just for the convenience of not killing the laptop with a 16h/day regime. Then there’s the ultralight laptop, which I use in bed because the 15” is too heavy and cumbersome, or for reading or doing things from a couch or in some weird position. It’s an awfully specialized thing to have a dedicated machine for, but I do, and I find it incredibly convenient, for the same reason I have several types of pocket knives and several different types of shoes. Sure, you can do everything with just one pair of jeans and just one pair of shoes, but I find that awfully inconvenient, and although it seems simple and elegant, it forces you to constantly adapt to the inadequacies of the equipment you’re using, and it introduces stress and hassle and just breaks your concentration from the things you actually want to do. Sometimes less is indeed more, but if you’ve ever tried to fix something that unexpectedly broke, you’ll know how convenient it can be to have a certain amount of junk somewhere that can be adapted to fix it. If you don’t have your small personal junkyard, you’ll be forced to drive to a store every time you need a SATA cable, or a screw for the SSD, or a nail to hang a painting, or a wood screw to fix something that got loose. No, it doesn’t look elegant, and having the capability to create or fix things will not make your place look like an Apple store, but at some point you need to ask yourself whether minimalism is actually contributing to or detracting from your productivity. So, no, less is not more if you need that spare SATA cable, and it’s definitely not more if your one and only computer unexpectedly died and you don’t have a secondary one on which to look for possible solutions on the Internet.

That’s where I departed from the conventional interpretation of minimalism, and started thinking about defining it as something more akin to self-reliance, or not depending on others to solve your problems. A minimalist approach in that sense doesn’t consist of having only one computer, and it being an elegant iMac or a Macbook Pro. It consists of using generic components you can source locally to build your own computer, building it in ways that make it easy to repaste the CPU, change noisy fans, clean out dust, install and set up the OS yourself, and maintain the whole thing without anyone’s help. Sure, such a box doesn’t look elegant, but it becomes very elegant when you need to take off the CPU cooler and change the paste, because the whole thing isn’t glued in behind the screen. It’s two big screws to remove the side panel, and a few more screws to remove the cooler, and everything is big enough to work on comfortably and quickly. Essentially, the more elegant and “clean” things look, the more of a pain in the arse they can be to maintain if something goes wrong with them, and sometimes you can’t even fix them at all; you’re expected to just throw them away and get another one, because that’s also “elegant” and “clean”.

The same goes for software. The older I get, the more I tend to use the most user-unfriendly, basic tools imaginable: I connect to a local server via ssh, I connect to the database via a shell tool where I type all the commands manually, with no fancy GUI tools, I type code in the pico editor, I rsync it to a production server, and it all works the same regardless of which computer I actually use to do it – I couldn’t care less whether it’s a fancy and elegant Mac, or a Raspberry Pi board connected to other shit with wires hanging. What is minimalistic and elegant in this approach is that I don’t rely on having lots of secondary shit installed on my computers, and I don’t try to maintain a super-complex software system that is supposed to make things “easier” by complicating everything to the point of a bloated mess. No, I make things simple by learning a few tools that work everywhere, and by reducing the number of intermediary steps I have to take in order to get things done. You may think that a nice fancy GUI with icons is a more elegant way of getting files across than rsync, but it’s only elegant if it works, and those things have a tendency to break in various creative ways just when you have to do something quickly, and you can spend a whole day fixing something that is really not essential to your primary task – fixing some environment instead of writing your code. So, yes, compared to some “elegant” thing such as an iPhone, with a user interface that chimps and cats can be taught to operate, my ideas of simplicity and elegance can seem counter-intuitive, but guess what? I maintain my own mail server, web server, blog and forum without anyone’s help. If something goes wrong with any of it, I fix it myself. If something goes wrong with my computer, I fix it myself, whether it’s a software or a hardware problem. If I have to choose between elegance and self-reliance, I pick self-reliance, because “clean” solutions have a nasty tendency to just displace the messy parts of life somewhere else. Also, if I have to choose between practicality and productivity on one side, and simplicity and elegance on the other, I prefer to just get things done and not let minimalism get in the way. That is how I personally see the desirable kind of minimalism: it’s minimizing the number of things that interfere with the creative process.
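To show how little machinery that workflow actually needs, here is a minimal sketch of the “rsync it to a production server” step; the source path and destination host are hypothetical examples, not my actual setup.

```python
#!/usr/bin/env python3
"""A minimal sketch of a push-to-production step built on rsync over ssh.
The source path and destination are hypothetical examples."""
import os
import subprocess

SRC = os.path.expanduser("~/work/forum/")   # local working copy; trailing slash syncs its contents
DEST = "user@example.com:/var/www/forum/"   # hypothetical production target

# -a preserves permissions and timestamps, -v is verbose, -z compresses in transit,
# --delete removes server-side files that no longer exist locally.
subprocess.run(["rsync", "-avz", "--delete", SRC, DEST], check=True)
```

The wrapper itself is beside the point; what matters is that the same two tools behave identically whether I run them from a Mac, the desktop, or a Raspberry Pi.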

Hackintosh

I recently experimented with a Hackintosh (essentially, a normal PC with Mac OS installed on it), and the whole process is intimidating because everybody seems to give you “cookbook”-style instructions where you just follow steps without actually understanding what’s going on, and when it works, you end up none the wiser. So, I decided to add the part that’s usually missing.

Basically, it works like this: a Mac has specific hardware, such as the SMC, that makes it quite different from a PC, and Mac OS gets its basic sensor info and other things from the SMC. On a PC, those things are done differently, but if you add a software layer that tricks the OS into thinking it’s talking to Mac hardware, while in fact translating the commands and data between the OS and the PC hardware, everything will work. There are also kernel extensions that trick the OS into thinking some piece of hardware is compatible. This is the complicated part where everybody’s eyes glaze over and they say something along the lines of “fuck all”. However, the good part is that you don’t need to know much about this for things to work. You just need to find the “recipe” someone else made for your hardware, copy it to the right place, possibly make a few adjustments, and it will work.

The basic principle is this: there’s a piece of software called Clover, which takes the place of your normal bootloader, but also serves as the intermediary layer that tricks Mac OS. It scans for all bootable drives on your system and exposes them as a list, from which you then pick the drive you want to boot. This means that for basic booting into Mac OS, you need a drive with Clover installed and a Mac OS bootable drive. Everybody tells you to download the Mac OS installation file on a real Mac and enter a few commands to make a bootable USB drive, and they suffocate you with technobabble. I have a simpler explanation.

Get a Clover ISO somewhere and burn it onto a USB stick. Get a pre-cooked EFI for your hardware. Copy this EFI onto the Clover boot stick. At that point, connect both the Clover USB stick and a drive that can boot into Mac OS, such as a Time Machine backup drive, boot from the Clover stick, wait for Clover to give you the list of bootable drives, and boot into the Time Machine system recovery partition, or whatever it’s called. It will give you the option to install Mac OS on an empty drive. I assume you already have one, so format it in Disk Utility, exit Disk Utility, choose either to install a fresh copy of the OS or to restore from backup, go through the steps, and when it reboots, boot into Clover again and pick the right partition, and after a few steps you’ll have a working system. Theoretically, if your Mac has a standard SATA drive, you could just pull it out of the Mac, plug it into a PC, boot into Clover, select the Mac drive, boot into it, and you’d have a working Hackintosh.

There’s just one more step, and that’s transferring Clover onto your Mac drive, so that you can dispense with the Clover USB stick. Boot into the Hackintosh, install a tool called Multibeast, and it will transfer Clover onto your Mac OS system drive, after which point that drive is no longer safely bootable in a real Mac. Then use the Clover configuration tool to mount the EFI partition, and copy the EFI cookbook specific to your hardware from the Clover stick to the EFI partition on the Mac OS drive. Unmount, reboot, pull the Clover stick out, go into the BIOS and select the Mac OS drive as the first boot option, and you should then boot into the Clover menu, and you know what to do from there.
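For illustration, that last mount-and-copy step boils down to roughly the sketch below, run from the Hackintosh itself. The partition identifier and volume paths are hypothetical, so check diskutil list before doing anything like this.

```python
#!/usr/bin/env python3
"""A rough sketch of copying a pre-cooked EFI folder onto the EFI partition of the
Mac OS system drive, on macOS. Partition identifier and paths are hypothetical."""
import shutil
import subprocess

EFI_PARTITION = "/dev/disk0s1"         # hypothetical; find the real one with `diskutil list`
PRECOOKED_EFI = "/Volumes/CLOVER/EFI"  # hypothetical; the EFI cookbook on the Clover stick

# Mount the normally hidden EFI partition; it typically shows up under /Volumes/EFI.
subprocess.run(["diskutil", "mount", EFI_PARTITION], check=True)

# Merge the hardware-specific EFI folder into whatever is already on the partition.
shutil.copytree(PRECOOKED_EFI, "/Volumes/EFI/EFI", dirs_exist_ok=True)

# Unmount again before rebooting into the Clover menu.
subprocess.run(["diskutil", "unmount", EFI_PARTITION], check=True)
```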

I’m starting to sound as complicated as the guys writing the Hackintosh instructions, but what I wanted to say is that you need two things: a drive that would boot into Mac OS on Mac hardware, and the Clover bootable stick with an EFI cookbook for your hardware. After that point, everything starts making sense. The only thing to avoid is putting a drive with the Clover EFI into a real Mac. That will make your Mac unbootable until you do an NVRAM/SMC reset, and even that might not work – I haven’t tried.

There’s a reason why it’s called a Hackintosh: it’s janky as fuck. The only thing I can think of that’s as unintuitive, creates as many problems without solving any, and wastes as much time, is trying to install Windows 95 or something similar onto modern hardware. Try it once, you won’t try it again. In comparison, Linux is the most intuitive and user-friendly thing ever, and there’s a much better chance you’ll get all your hardware working in Linux. I’m not kidding. Stuff like Bluetooth and wifi will almost certainly not work, and you’d better not have an Nvidia GPU, because you can get it to work but you will almost certainly suffer stability issues. Also, on a major OS update, everything will break.

The reason why you would want to do it is not to get a normal Mac desktop on PC hardware; it’s to get a basic, barely-working Mac desktop on PC hardware where you can run things such as the Xcode compiler needed to build iOS and Mac executables, and where you won’t mind much if you don’t have AirDrop or Bluetooth, or if sound doesn’t work. Essentially, it’s a way to get a very fast Mac OS platform for running some obscure piece of Mac OS software that you need for a specific task, do whatever you have to do with it, and then boot back into a normal OS where everything works properly.

About computer upgrades

I’ve been thinking about computer hardware recently, since I had issues with two 2015 Macbook Pros – Biljana’s 13” had a defective SSD and a bloated battery, and my 15” had an even more bloated battery and a 256 GB SSD with only 20–30 GB of free space. Biljana’s laptop was already retired earlier this year, but I had to figure out what I wanted to do with mine, and ended up finding a very cheap upgrade path: I had a cheap but good replacement battery installed, and replaced the SSD with a standard 1 TB Samsung 970 EVO NVMe drive on an adapter. I decided to upgrade because, unlike Biljana’s 13”, which was a 2-core machine with 8 GB of RAM, mine is a 4-core i7 with 16 GB of RAM and a 15” screen that’s perfectly good for photo editing, and I had no issues with it other than the battery and the small SSD. With those fixed, I’m quite happy with it, which brings me to the main issue: is there still a need to upgrade computer hardware regularly, or has technology peaked? Right now I’m using several computers, and none of them are exactly new. My desktop machine is still a Skylake i7-6700K, my laptop is a Haswell i7-4770HQ, my phone is an iPhone 8 Plus, the tablet is an iPad Mini 4, and the machine I’m writing this on, an ultralight hybrid Asus UX370UAR, is actually the newest and uses a Kaby Lake R i5-8250U. Why am I using technology that’s basically 5 years old? Because it’s not really upgradable, in the sense that upgrades don’t make it faster. Sure, you can replace it with something newer, but you don’t gain anything other than bigger numbers on multi-core benchmarks; the actual speed and functionality are the same. I tested the new 16” Macbook Pro when I bought it for Biljana, and guess what, it felt almost identical to my 15”, which means I could replace mine with an expensive new thing and it would feel exactly the same. Sure, the touchpad is bigger, the screen is bigger and a bit better, but it doesn’t feel like a big difference.

I also came to an interesting conclusion when I plugged different machines into my desktop peripherals to see if anything was faster than my desktop, and it turned out that the CPU is the least important thing, because I have several machines with similar CPU/RAM/SSD performance, and they all felt laggy compared to my desktop when I used them for normal desktopy things such as watching YouTube, switching between many apps and resizing windows to fit the big 4K screen. Guess why that was? Because today everything is strongly GPU-accelerated, and driving a big 4K display is very speed-sensitive, partially because of the resolution, but mostly because of the physical screen size (43”), which visually magnifies all the problems, and the only two GPUs that worked fast enough not to cause visual lag were my 1080 Ti and my son’s 2070. Basically, it’s the GPU that makes all the difference, and as far as CPU power goes, the Haswell i7 in my Macbook Pro or the i5-8250U in the ultralight Zenbook are perfectly sufficient for everything I do, provided that they are equipped with enough RAM and fast storage. It’s not that I didn’t test the new 6-core machines; it’s just that I stress all the cores so rarely that it doesn’t make a difference.

However, if someone tells you that the GPU doesn’t matter if you don’t play games, and that you’re fine with integrated graphics, that’s probably true if you run a 1080p display, but on a big 4K display there’s a big difference. Integrated graphics works in a pinch, but it’s visibly stunted and creates the impression that the machine is much slower than it actually is. Even something like the AMD 270X was too slow for the 4K display, and I’m not really sure where the line between sufficient and overkill lies. I do know that the 1080 Ti and the 2070 feel equally fast and are great. I don’t know what the cheapest sufficient GPU would be, because I didn’t have many to test, but I would theorize that if something can’t run the Valley benchmark smoothly at 4K, it might be too slow for the demands of window manager acceleration as well. Interestingly, the same doesn’t apply at lower resolutions, because my 15” Macbook drives its own Retina display perfectly fine with Intel graphics, and when I plug it into a 1080p monitor, it’s blazingly fast, and yet it can’t run the Valley benchmark at those resolutions to save its life. However, at 4K, the only GPUs that are actually fast in Windows are also fast for gaming at high resolutions. Years ago my recommendation would have been to get the weakest GPU that can still run your screen at the desired resolution and color depth, because the GPU was not important unless you wanted to play games. Today, my recommendation is the complete opposite: if you want to drive a 4K display or bigger, the GPU is the most important part of your system, and you should get a strong gaming PC as your desktop machine, regardless of how many games you intend to play. It’s just that your display will require powerful GPU acceleration to run smoothly, in everything from web browsing and scrolling to window resizing. However, if you don’t run 4K or 5K displays, you can greatly relax the GPU requirements: integrated graphics, such as the Intel 620, will be perfectly snappy at 1080p, and you should only get dedicated graphics for gaming or for GPU-accelerated tasks such as video editing.

So, regarding upgrades, it’s all good news: basically, if you have anything Haswell or newer, with at least 16 GB of RAM and a fast SSD, your machine will run all normal tasks as quickly as a modern machine, provided that your GPU is capable enough for your display resolution. If you have specific tasks that require more power than that, well, then these general guidelines don’t apply to you, but all in all, unless your PC is really ancient, you will only need to upgrade when it finally dies, not before. But if your machine actually is ancient, you should definitely try the new generations, because they are awesome. I bought an 8th-gen i5 Intel NUC for testing, and that thing is absolutely awesome as a desktop machine if you’re running it at 1080p. At 4K, it’s marginal; it sucks in Linux and Mac OS, and it’s much better in Win10, but still nowhere near the brutal speed of my 1080 Ti. At this point, Win10 has superior window manager acceleration and driver optimization, and will extract the maximum from marginal GPUs.

Someone will say that the NUC is overpriced and you can get a Raspberry Pi 4 for much less money, at which point I’ll just roll my eyes. Yes, you can, but the difference in speed is so great it’s not even funny. The NUC runs NVMe and SATA drives, it has an immensely superior GPU, it has socketed RAM that can go up to 32 GB, and I’ve tested both, so I actually know. The Raspberry Pi 4 is fine for web browsing and document editing, it’s great as a console for accessing other Unix systems, or as a small home UNIX server (I actually have a 3B+ plugged into my home LAN as a server for rsyncing remote backups and hosting my e-mail database), but it absolutely sucks for anything video-related. It has some kind of GPU-accelerated video playback, but software support for it is sketchy or outright missing, so it works in some specific video modes and codecs and completely fails in others; generally, it’s rubbish for video. The NUC, on the other hand, is better at 4K than the Pi 4 is at 1080p, and that really tells you something. The NUC handles photo editing in Lightroom perfectly fine, and that’s a professional-grade task. My assessment is that its speed is identical to that of the 15” Macbook Pro Retina running Mac OS (as a Hackintosh), and the benchmarks confirm it. So, that’s one type of modern machine you can get today: it fits in your palm, it’s as quiet as a Macbook Pro, it doesn’t draw much power, it’s blazingly fast, and its only drawback is that you can’t add a dedicated GPU later on if you decide you need it; for those cases, a “normal” desktop PC would be better. So, basically, this is the best time ever to buy a PC, because they are for the most part incredibly good. On the other hand, they’ve been just as incredibly good for the last 5 years or so.

As for phones, they also peaked long ago: today they are all the same; pick your OS, pick a higher price level to avoid outright garbage, and you’re set. I can’t even force myself to think about them seriously anymore; they are like washing machines. If yours dies and it’s not economical to repair, you just go to the store and pick a new one: if you avoid the cheapest garbage, they are all the same and will work great.