About computer security

Regarding this latest ransomware attack, I’ve seen various responses online. Here are my thoughts.

First, the origin of the problem is the NSA-discovered vulnerability in Windows, apparently present in versions ranging from XP to 10, which is weird in itself considering the differences introduced first in Vista, and then in 8. This makes it unlikely that Microsoft didn’t know about it; it looks like something that was deliberately left open, as a standard back door for the NSA. Either that, or it means that they managed not to find a glaring vulnerability since 2001, which makes them incompetent. Bearing in mind that other platforms have had similar issues, that wouldn’t be unheard of, but I will make my skepticism obvious: long-undiscovered glaring flaws indicate either intent or incredible levels of negligence.

The immediate manifestation of the problem, the WannaCry ransomware worm, is a sophisticated product of the most dangerous kind: the kind that apparently doesn’t require you to click on stupid shit in order to be infected. The malware scans for reachable machines, probes for the vulnerability and, if it is found, executes code on your machine. All it takes for you to be infected is a poorly configured firewall, or an infected machine behind your firewall, combined with the existence of vulnerable systems. The malware encrypts the victim’s files, sends the decryption key to the hackers, deletes it from the local machine and posts a ransom notice on the afflicted machine demanding payment in bitcoin.

It is my opinion that the obvious explanation (a money-motivated hacker attack) is implausible, because the probability of actually collecting any money is low, given the type of attack. A more probable explanation is that this is a test by a nation-state actor, checking out the NSA exploit that had been leaked by the Shadow Brokers. The most likely purpose of such a test is forcing the vulnerable machines out into the open so that they can be patched and the vulnerability permanently removed, or, alternatively, assessing the impact and response in case of a real attack. It is also a good way of permanently removing NSA-installed malware from circulation: by encrypting the filesystem of the vulnerable machines and thus forcing a hard-drive format, it permanently disables them as hosts. Essentially, it sterilizes the world against all NSA-installed malware using this exploit, and it is much more effective than trying to advertise patches and antivirus software, since the people who are vulnerable are basically too lazy to upgrade from Windows XP, let alone install patches.
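As a practical aside: WannaCry propagates over SMB (TCP port 445), so the single most effective measure besides patching is keeping that port closed to the outside world. A minimal iptables sketch, assuming a Linux-based firewall you administer yourself:

    # Drop inbound SMB traffic (WannaCry's propagation vector)
    iptables -A INPUT -p tcp --dport 445 -j DROP
    # SMB over NetBIOS, same protocol family
    iptables -A INPUT -p tcp --dport 139 -j DROP
    iptables -A INPUT -p udp --dport 137:138 -j DROP

On a typical home router, the equivalent is simply not forwarding those ports to anything on the inside.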

As for the future, an obvious conclusion would be that this is not the only vulnerability in existence, and that our systems remain vulnerable to other, undiscovered attack vectors. What are the solutions? Some recommend installing Linux or buying a Mac, forgetting the Heartbleed bug in OpenSSL, which was as bad if not worse: all Linux and Mac machines were vulnerable. Considering how long it took Apple to do anything, and how long the bug remained undetected, I remain skeptical regarding the security of either platform. They are less common than Windows, which makes them a less tempting target, but since that is precisely why potential targets of state-actor surveillance would use them, it actually makes them more of a target, not for individual hackers, but for potentially much more dangerous people. Because hacker attacks on Linux and Mac OS are not taken seriously, the protective measures are usually weak and rely on the assumed inherent security of the UNIX-based operating systems. When reality doesn’t match the assumptions, as in the case of the Heartbleed bug, there are usually no additional layers of protection to catch the exceptions. Furthermore, one cannot exclude a low-level vulnerability installed in the device’s firmware, since firmware is proprietary and even less open to inspection than the operating systems themselves.

My recommendation, therefore, would be to assume that your system is at all times vulnerable to unauthorized access by state actors, regardless of your device type or protective measures. Against non-state actors, it is useful to implement a layered defense: a hardware firewall on the router and a software firewall on the device; share as little as possible on the network; close all open ports except those you actively need, and protect those as if they were a commercial payment system; for instance, don’t allow password authentication on SSH, and use RSA certificates instead. Use encryption on all network communications. Always use the newest OS version with all the updates installed. Use an antivirus to check everything that arrives on your computer, but assume that the antivirus won’t catch zero-day exploits, which are the really dangerous stuff.

Don’t click on stupid shit, and don’t visit sites with hacking or porn-related content unless you’re doing it from a specially protected device or a virtual machine. Have a Linux virtual machine as a sandbox for testing potentially harmful stuff, so that it can’t damage your main device, and don’t do stupid shit from a device that’s connected to your private network, so that an attack can’t spread to other connected devices. Don’t assume you’re safe because you use an obscure operating system: obscure operating systems can use very widespread components, such as OpenSSL, and if those are vulnerable, your obscurity is far less than you assume. However, a combination of several layers might be a sufficient shield. If your router shields you from one attack vector, the firewall and antivirus on your Windows host machine shield you from another (for instance, UNIX-related exploits), the Linux architecture of your virtual machine shields you from the rest (the Windows-related exploits), and your common sense does the rest, you are highly unlikely to be a victim of a conventional hacker attack.

Don’t delude yourself, though: the state actors, especially the NSA, have access to your system on a far deeper level, and you must assume that any system connected to the network is vulnerable. If you want a really secure machine, get a generic laptop, install Linux on it from a CD, never connect it to the network and store everything important on an encrypted memory card. Bear in mind, however, that the more security measures you employ, the more attention your security is likely to receive, since where such measures are employed, there must be something worth looking at. Eventually, if you really do stupid shit, you will be vulnerable to the rubber-hose method of cryptanalysis, which works every time. If you don’t believe me, ask the guys in Guantanamo.
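To make the SSH point concrete, here is a minimal sketch of the relevant /etc/ssh/sshd_config directives, assuming a stock OpenSSH server (restart the sshd service after editing):

    # /etc/ssh/sshd_config – allow key-based authentication only
    PasswordAuthentication no
    ChallengeResponseAuthentication no
    PubkeyAuthentication yes
    PermitRootLogin no

With this in place, a stolen or guessed password is useless to an attacker; only a machine holding one of the authorized private keys can log in.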

Linux failed because capitalism rules

Let me tell you why I have been gradually migrating from Linux on all the machines in my household, from the point where everything ran on Ubuntu Jaunty, to the point where only the HTPC (media player in the living room) runs Ubuntu Mate Trusty, and everything else runs either Windows 10 or Mac OS.

A year ago I bought my younger kid a new PC, because his old Thinkpad T43 was behaving unreliably. Since he didn’t move the laptop from his desk anyway, I decided to get him a desktop: a Bay Trail (J1900) motherboard with the integrated CPU. I love those CPUs, BTW. They are strong enough to run all the normal tasks one would require from a computer, such as web browsing, playing all the video formats, light gaming and editing documents; they are cheap; they use very little electricity; and the motherboards themselves come in the tiny mini-ITX format.

The CPU is efficient enough for passive cooling, although that didn’t work so well in Universe Sandbox, so I mounted a big silent case fan in front of the CPU to keep the temperatures down. Basically, this looks like an ideal general-purpose home computer, and is exactly what a huge number of people are getting their kids for doing homework. A huge number of cheap laptops run Bay Trail CPUs as well, so the installed base is vast. And to keep the cost down, one would expect a large portion of users to put Linux on these machines, since all the general-purpose applications such a machine would be expected to run work well on Linux.

Unfortunately, Intel fubared something in the CPU design; specifically, they seem to have messed up the power-state regulation, so when the CPU changes its power state, there’s a high probability of a hang. Sure enough, a microcode update was issued and quickly implemented in Windows 10. On Linux, a bug report was posted in 2015. This is what happened:

This FreeDesktop.org bug report was initially opened in January of 2015 about “full system freezes” and the original poster bisected it down to a bad commit within the i915 ValleyView code. There were more than 100 comments to this bug report without much action by Intel’s Linux graphics developers when finally in December they realized it might not be a bug in the Intel i915 DRM driver but rather a behavior change in the GPU driver that made a CPU cstates issue more pressing. The known workaround that came up in the year of bug reports is that booting a modern Linux kernel with intel_idle.max_cstate=1 will fix the system freezes. However, using that option will also cause your system’s power use to go up due to reduced power efficiency of the CPU.

In December when shifting the blame to the other part of the kernel, this Kernel.org bug report was opened and in the few months since has received more than 120 comments of the same issue occurring on many different Bay Trail systems.

As of right now and even with the many complaints about this bug on a multitude of systems and Linux 4.5 set to be released this weekend, this bug hasn’t been properly resolved yet.
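For reference, applying the intel_idle.max_cstate=1 workaround mentioned in the bug reports is a one-line change on Ubuntu-style systems (a sketch; remember that it comes at the power-consumption cost noted above):

    # /etc/default/grub – append the parameter to the kernel command line:
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash intel_idle.max_cstate=1"
    # then regenerate the GRUB configuration and reboot:
    sudo update-grub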

The quoted article was written in March 2016. It’s now May 2017, and the issue still hasn’t been resolved. Essentially, the problem with Linux is that the kernel development team apparently doesn’t have anyone competent and motivated enough to deal with this kind of problem. It’s unclear whether they are simply unable to fix it, or whether they just don’t care anymore, because there’s no ego trip in it to motivate them. Let me show you what I’m talking about. There’s a huge thread where the users reported the bug and tried to figure out solutions. One of the responses, which looks very much like it came from a Linux developer, was this:

Well done on turning this into a forum thread. I wouldn’t touch this bug with a 10-foot pole and I’m sure the Intel developers feel the same.

Essentially, TL;DR. It was too long for him to read, because brainpower.

Another thing became apparent to me: they all live in an echo chamber, where Linux is the best thing ever and the only option. Linux is the most stable OS, it’s the best OS, it’s the greatest thing ever. Except it crashes on probably a third of all modern computers deployed, while Windows, which they treat with incredible contempt, works perfectly on those same machines. Let me make this very clear: I dealt with the Linux kernel problem on the Bay Trail CPUs by first trying all the recommended patches, which all failed, then installing a BIOS update, which didn’t help, and finally installing Windows 10 on the machine, which permanently solved the problem. Not only that, it made the machine run perceptibly faster, it boots more quickly, and it is stable as a rock: not a single hang in a year.

That’s why I gradually migrated from Linux to Windows and Mac. They are just better. They are faster, more stable, and cause me zero problems. The only places where I still run Linux are the HTPC and a virtual machine on my desktop PC. Linux is so fucked up, it’s just incredible. It looks like you can only go so far on enthusiasm, without motivating developers with money. After a while, they stop caring and find something more rewarding to do, and that’s the point where Linux is right now. The parts that are maintained by people who are motivated by money work. Other parts, not so much. As a result, my estimate of the stability of desktop Linux at this time is that it is worse than Windows 98. It’s so bad, I don’t recommend it to anyone anymore, because it’s not just this one problem, it’s the whole atmosphere surrounding it. Nobody is even trying anymore; it’s a stale product that joined the army of the living dead.

Since I used Linux as my daily driver for years, this pisses me off, but there’s nothing I can do about it but hope that Apple will make Mac OS support a wider range of hardware, and make it available as a commercial product one can install on anything, like Windows. That would make desktop Linux completely obsolete, and that would be no more than it deserves, because its quality reveals its communist origins: it’s made like shit. It’s a Trabant, a Wartburg, a Yugo. Conceived on an ego trip, and built by people who can’t be bothered with work. It’s proof that you can’t build a good thing on hatred of those evil capitalists. In order to get people to make really great things, you need to have a free market that rewards the winners with money. Huge, superabundant amounts of money. Bill Gates and Steve Jobs kinds of money.

Oh yes, I almost forgot: a conclusion of my project of installing Linux on an old Mac laptop. I gave the laptop to my kid. Within a month it became so unstable, with so many different things breaking all at once (dozens of packages reporting errors, mostly revolving around Python modules of one kind or another, and apt reporting mass breakage of packages), that I gave up, backed up his data, formatted the drive and installed Mac OS Sierra on the machine. It’s slower than it should be because the machine lacks RAM (and I can’t add more because it’s soldered on), but everything works. Linux is so unreliable at the moment, it’s useless on the desktop.

Linux on a Macbook Air

What do you do with an old late-2010 Core2Duo 1.8GHz Macbook with 2GB RAM that is no longer able to run the current Mac OS quickly enough? Apple’s recommendation would be to throw it away and buy a new one, because it’s about time after 6 years and the hardware has probably worn out significantly by now. With the second part of the recommendation I have no problem – since the machine is indeed too slow for running a modern OS with all the applications that I need, I bought a 15” retinabook as a replacement. But the part where I just throw the old machine away, even though all the hardware still functions, it has a very good keyboard, monitor and touchpad, and the battery is above 80% – I don’t think so. So I tried several things, just to see what can be done.

The first thing I did was boot it from a USB drive containing Ubuntu Trusty Mate LTS 64-bit, to see if that’s actually possible and if all the hardware is correctly recognized. To my surprise, it all worked, completely out of the box and without any additional tweaking, except for one very specific thing: the Croatian keyboard layout on a Mac, which differs from the standard Croatian Latin II layout used by Windows and Linux. I tried selecting a combination of a Mac keyboard and Croatian layout in the OS, but it didn’t work. I ended up editing the /usr/share/X11/xkb/symbols/hr file to modify the basic layout:

xkb_symbols "basic" {

    name[Group1]="Croatian";

    include "rs(latin)"

    // Redefine these keys to match XFree86 Croatian layout
    key <AE01> { [         1,     exclam,   asciitilde,   dead_tilde ] };
    key <AE02> { [         2,   quotedbl,           at] };
    key <AE03> { [         3, numbersign,  asciicircum, dead_circumflex ] };
    key <AE05> { [         5,    percent,       degree, dead_abovering ] };
    key <AE07> { [         7,      apostrophe,        grave,   dead_grave ] };
    key <AE11> { [         slash,    question       ]       };
    key <AB10> { [     minus, underscore, dead_belowdot, dead_abovedot ] };
    key <AD06>  { [         y,          Y,    leftarrow,          yen ] };
    key <AB01>  { [         z,          Z, guillemotleft,        less ] };
    key <AD01>  { [         q,          Q,    backslash,  Greek_OMEGA ] };
    key <AD02>  { [         w,          W,          bar,      Lstroke ] };

}; 

Essentially, what I did was reposition z and y, put the apostrophe above 7, and put the question mark/slash to the right of 0. However, the extended right-alt functionality now works as if on a Windows keyboard, so it’s slightly confusing to have the layouts mixed. (PS: I had to repost the code because WordPress was acting smart and modified the “tags”, so I converted them into HTML entities.)
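Note that editing the symbols file in place isn’t enough by itself, since X caches compiled keymaps. On a stock Ubuntu setup, something along these lines makes the edit take effect (a sketch; paths may vary by release):

    # refresh the compiled keymap cache so X picks up the edited symbols file
    sudo dpkg-reconfigure xkb-data
    sudo rm -f /var/lib/xkb/*.xkm
    # reload the Croatian layout in the running session
    setxkbmap hr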

Other than tweaking the keyboard layout, I had to use the Nouveau driver for the Nvidia GPU, because any kind of proprietary Nvidia driver, new or legacy, freezes the screen during boot, when xorg initializes. That’s a bummer, because the proprietary driver is much faster, but since the only thing I’m going to use the GPU for is playing YouTube videos in full screen, and that works fine, I’m not much worried. Everything else seems to work fine: the wireless network, the touchpad, the sound, regulating screen brightness and sound volume with the standard Mac keys, everything.

Having ascertained that Linux works, I formatted the SSD with gparted, installed Linux, tested that everything boots properly, and copied my edit of the keyboard layout to the cloud for further use. Then I decided to test other things, wiped the SSD again, and tried to run the Apple online recovery, which supposedly installs OS X Lion over the Internet. Now that was a disaster: the service appears to work, but once you actually start, the Apple server reports that the service isn’t “currently” available. After checking online for other users’ experiences, it turned out that it has been “currently” unavailable since early 2015 if not longer, so basically their service is fubared due to zero fucks given to the maintenance of older systems.

OK, I found the USB drive containing OS X Snow Leopard that I got with the laptop, and, surprisingly, it worked great. I installed Snow Leopard on the laptop, but I couldn’t do anything with it, because most modern software refuses to install on a version that old, and Apple’s own services, such as iCloud and the App Store, don’t support it. So I just used it to test a few things, and found out that it’s as fast as I remember it from when I first bought the laptop: there’s no lag and none of the delays introduced by the newer versions, and everything works great, except that current Linux is a much more secure and up-to-date system than Snow Leopard. So I did the next experiment: I took the Time Machine drive with the current backup of the 15” retinabook running Sierra, and booted from that. It gave me two options: install clean Sierra, or do a full system recovery from the backup. I did the clean install first, and it surprised me how fast the machine was, much faster than the slow El Capitan installation that I was running before finally giving up on the machine because I had no time for this shit. Then I decided to take a look at what the full recovery would look like. It worked, but it was as slow as or slower than the full installation of El Capitan. I tried playing with it but gave up quickly: after getting used to my new machines, it’s like watching paint dry.

I decided to try Linux again, but with a slight modification: instead of running the perfectly reliable and capable, but visually slightly older-looking Mate (which is basically a green-themed fork of Gnome 2), I decided to try the Ubuntu Trusty Gnome LTS 64-bit version, which runs the more modern and sleek-looking, but potentially more buggy and sometimes less functional Gnome 3. Why did I do that? Well, because the search function in Gnome 3 is great, resembling both Spotlight and the Windows 10 search function that I got used to on the modern systems, and visually the Adwaita theme looks very sleek and modern on a Macbook, very much in tune with its minimalist design. So I loaded it up and copied back my modifications of the keyboard layout (which are actually more difficult to activate here than in Gnome 2, requiring some dpkg-reconfiguring from the shell). I made the mistake of testing whether the Nvidia drivers work here – they don’t – and I had to fix things the hard way: I booted into the root shell with networking (not so much for the networking, but because in the normal root-shell mode the drive is mounted read-only), did apt-get remove nvidia*, rebooted, and it worked. Then I installed the most recent kernel version, just to test that, and yes, the 4.2.0-42-generic kernel works just fine. The rest of the installation was just standard stuff: loading up my standard tools, PGP key and RSA certificates, chat clients and Dropbox, so that I can sync my keepass2 database containing all my account passwords in encrypted form, as well as the articles for the blog.
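For the record, the driver-removal dance is roughly the following (a sketch; the remount line is only needed if you land in the plain read-only root shell instead of the one with networking):

    # from the GRUB recovery menu, drop to a root shell
    mount -o remount,rw /              # the plain recovery shell mounts / read-only
    apt-get remove --purge 'nvidia*'   # purge the proprietary driver, fall back to Nouveau
    reboot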

So, what did I gain, and what did I lose? I lost the ability to run Lightroom, but this machine is too weak for that anyway, and I had already removed it from the position of photo-editing laptop. The second thing that doesn’t work is msecure, where I have all my current passwords stored in the original form; the keepass file is a secondary copy, so that’s not great. However, Thunderbird mail works, Skype works, Rocketchat works, Web works and LibreOffice works. The ssh/rsync connection to my servers works, all the secure stuff works, the UNIX shell functionality works. Essentially, I can use it for writing, for answering mail, for chat, for the web, and for doing stuff on my server via ssh. The battery life seems diminished from what I would expect, but it’s actually better than it was on El Capitan and Sierra, which seemed to constantly run CPU-demanding processes in the background, such as RAM compression, which of course drained the battery very quickly and made the machine emulate a female porn star, being very hot and making loud noises. 🙂

I gained speed. It’s as fast as it was running Snow Leopard when I initially bought it, which is great. Also, I have the ability to run all the current Linux software, and I don’t have to maintain the slow macports source-compiling layer in order to have all the Linux tools available on a Mac. I do realize, however, that I’m approaching this from the somewhat uncommon perspective of someone who uses a Mac as a Linux machine that just happens to run Adobe Lightroom and other commercial software; I never did get a Mac for the “simple” experience that most users crave. To me, if a machine can’t rsync backups from my server, and if I can’t go into the shell and write a 10-line script that will chew through some data, it’s not fit for serious use. I run a Linux virtual machine on my Windows desktop where I actually do all the programming and server maintenance, so having Linux on a laptop that’s supposed to be all about “simplicity of use” is not contradictory in any way – to me, simplicity of use is the ability to mount my server’s directories from Nautilus via ssh and do a simple copy and paste of files. This works better on Linux than anywhere else. Also, the Geeqie image viewer on Linux is so much better than anything on a Mac, it’s not even funny. These tools can actually make you very productive, if you know how to use them, so for some of the things I do, Linux is actually an upgrade.

However, I can’t run some very important commercial software that I use, so I can’t use Linux in my primary setup. That’s unfortunate, but it is what it is. Linux is primarily used by people who want free stuff and are unwilling to pay for software, so nobody serious bothers to write commercial software for it. Yeah, theoretically it’s supposed to be free as in freedom, not free as in free beer, but in reality, Linux is designed by communists who have serious problems with the concept of money, either because they don’t understand it, because they reject it for ideological reasons, or both. In some cases, however, Linux is an excellent way to save still-functional machines from the planned-obsolescence death they were sentenced to by their manufacturers. It’s also an excellent way of making sure you don’t have all kinds of nefarious spyware installed by the OS manufacturer, if that’s what you care about; however, since I would guess that the worst kinds of government spying are done through exploits in the basic SSL routines and certificate authorities, that might not help much.
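To make the “simplicity of use” point concrete: Nautilus mounts remote directories over SSH via GVFS, so reaching a server is a matter of typing its address into the location bar, and backups are one rsync invocation. A sketch, with hypothetical host and path names:

    # In Nautilus: press Ctrl+L and enter the server address, e.g.
    #   ssh://user@myserver.example.com/home/user
    # then copy and paste files as if they were local.

    # Pulling a backup of a server directory over SSH:
    rsync -avz user@myserver.example.com:/var/www/ ~/backup/www/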

Also, the thing about Linux is that it tries to write drivers for the generic components used in the hardware, instead of for the actual hardware implementation. This means you get a driver for the Broadcom network chip, instead of for the, I don’t know, D-Link network card. The great aspect of this is that it cuts through lots of bullshit and gets straight to the point, reducing the number of hardware drivers significantly and increasing the probability that what you have will just work. The problem is that not much is done to ensure that every single implementation of the generic components will actually work, and work optimally. In practice, this means that if your hardware happens to be close to the generic implementation, it will just work, as it happened to just work on my late-2010 Macbook Air, for the most part. However, if something isn’t really made to the generic spec, as happens to be the case with my discrete graphics, trying to use the generic drivers will plunge you headfirst from the tall cliff of optimism into the cold sea of fail.

So, do I recommend this? Well, if you’re a hacker and you know what you’re getting yourself into, hell yeah. I did it for shits and giggles, just to see if it could be done. Would I do it on a “productivity” machine, basically my main laptop/desktop that I have to depend on to do real work reliably and produce instant results when I need something? That’s trickier, and it depends on what you do. I used to have Linux on both my desktop and laptop for about 5 years, from Ubuntu Gutsy to Ubuntu Lucid. Obviously, I managed to get things done, and sometimes I was more productive than on anything else. At other times, I did nothing but fix shit that broke when I updated something. If anything, Linux forces you to keep your skills sharp, by occasionally waking you from sleep with surprise butt sex. On other occasions, you get to laugh watching Windows and Mac users struggle with something that you do with trivial ease. At one point I got tired of the constant whiplash of alternating between Dr. Jekyll and Mr. Hyde, and quarantined Linux in its safe virtualized sandbox, where it does what it’s good at, without trying to run my hardware with generic open-source drivers, or forcing me to find shitty free substitutes for some professionally made piece of software that I’d otherwise pay $200 for. Essentially, running Linux is like owning a BMW or an Alfa Romeo: it runs great when it runs, but for the most part it’s not great as a daily driver, and it’s fun if you’re a mechanic who enjoys fixing shit on your car to keep your skills sharp. I actually find it quite useful, since I maintain my Linux servers myself and this forces me to stay in touch with the Linux skill set. It’s not just an exercise in pointless masochism. 🙂

Griffin having better ideas than Apple

Exhibit A:

[Image: Griffin BreakSafe, a magnetic breakaway USB-C power cable]

Why is that thing not built into the new generation of Macbooks? If you need adapters anyway, why not also adapt from MagSafe to USB C? Provide four MagSafe Thunderbolt 3 ports, and provide adapters to USB C, USB A and Thunderbolt 2. You get elegance, you keep the brilliant MagSafe thing on all ports, and you can spin the adapters by saying that you made all ports detachable and universal, compatible with all existing port standards. I would actually find this more plausible than USB C.

About Apple, USB C and standards

I’ve been thinking about the recent, apparently insane product releases from Apple: an iPhone that doesn’t have a headphone jack, although a significant use case for an iPhone is playing music from iTunes; a Macbook that has only one port for both charging and data, and that port is basically incompatible with the rest of the IT industry unless you use adapters; and a Macbook Pro that has only those incompatible ports, has less battery capacity, and doesn’t have an SD card slot, although its supposedly main target group are the creative professionals, like photographers and videographers, who use SD cards to transfer images and video from their cameras when they are in the field and don’t have a cable with them.

To add insult to injury, all those products are more expensive than the previous, more functional generation.

I tried to think of an explanation, and I came up with several possible ones. For instance, although Apple pays formal lip service to the creative professionals, they don’t really make that much money from them. When Apple actually did make most of its money from creative professionals, somewhere in the 1990s, they were almost bankrupt, and Microsoft had to rescue them by buying $150 million of non-voting shares, and Steve Jobs was reinstated as iCEO (interim CEO, which is the likely cause of him deciding to i-prefix all the product names). They then started to market to a wider audience of young hipsters, students and wealthy douchebags (as well as those who wanted to be perceived as such), and soon they started to drown in green. Yes, they continued to make products intended for the professionals, but those brought them an ever smaller proportion of their overall earnings, and were deprioritized by the board, which is basically interested only in the bottom line. And it is only logical: if the hipsters who buy iPhones bring you 99% of your money, you will try your best to make them happy and keep them coming back for more. The 1% you earn from professional photographers and video editors is, essentially, a rounding error. You could lose them and not even notice. As a result, the Mac Pro got updated with ever decreasing frequency, and was eventually abandoned by the professional market, which is highly competitive and doesn’t have time to waste on half-a-decade-obsolete, underperforming and overpriced products.

Keeping the hipsters happy, however, is a problem, because they want “innovation”, they want “style”, they basically want the aura of specialness they will appropriate from their gadget, since their own personality is a bland facsimile of the current trends. They are not special, they are not innovative, they are not interesting and they are not cool, but they want things that are supposed to be all that, so that they can adorn themselves with those things and live in the illusion that their existence has meaning.

So, how do you make a special smartphone when every company out there already makes perfectly functional devices within the constraints of modern technology? They all have CPUs and GPUs slammed right against the wall of the thermal design, superfluous amounts of memory and storage, excellent screens… and there’s essentially nothing else you can add to such a device, unless there’s a serious breakthrough in AI and those gadgets become actually smart, in which case they will tell you what to do, instead of the other way around. So, facing the desperate need to appear innovative, and at the same time facing the constraints of modern technology, which define what you can actually do, you start “inventing” gimmicky “features”, such as the removal of the headphone jack and the USB A sockets, or a second screen on the keyboard that draws a custom row of touch-sensitive icons.

And apparently it works, as far as the corporate bottom line is concerned. The professionals voice their displeasure on YouTube, but the hipsters are apparently gobbling it all up; this stuff is selling like hot cakes. The problem is that the aura of coolness of Apple products stems from the fact that the professionals and the really cool people used them, and the hipsters wanted to emulate the cool people by appropriating their appearance, if not their essence. If the cool people migrate to something else, and it becomes a pattern for the hipsters to emulate, Apple will experience the fate of IBM. Remember PS/2? IBM decided it was the market leader and everybody would gobble up whatever they made, so they made the PS/2 series of computers with a closed, proprietary “microchannel” bus, trying to discourage third-party clones. What happened is that people said “screw you”, and IBM lost all significance in the PC market, had to close huge parts of its business, and eventually went out of the retail PC business altogether. And it’s not that the PS/2 machines were bad. Huge parts of the PC industry standard were adopted from them: the VGA graphics, the mouse and keyboard ports, the keyboard layout, the 3.5” floppy standard, plus all kinds of stuff I have probably forgotten about. None of it helped them avoid the fate of the dinosaurs, because they attempted to blackmail and corner the marketplace, and the marketplace took notice and reacted accordingly.

People like standardized equipment. They like having only one standard for the power socket, so that you can plug in any electrical appliance and it will work. The fact that the power socket could probably be redesigned to be better, smaller and cooler is irrelevant. The most important thing about it is that it is standard, and you can plug everything in everywhere. USB type A is the digital equivalent of a power socket. It replaced removable media, such as floppies and CDs, with USB thumb drives, which can be plugged into any computer. Also, keyboards, mice, printers, cameras, phones and tablets all plug into the USB socket and are universally recognized, so that everything works everywhere. Today, a device without a USB port is a device that cannot exchange massive amounts of data via thumb drives. It exists on an island, unable to function effectively in a modern IT environment. It doesn’t matter that the USB socket is too big, or that it’s not reversible. Nobody cares. What’s important is that you can count on everybody having it. Had Apple only replaced the Thunderbolt 2 sockets with USB C sockets, and kept the USB A sockets in place, it would be a non-issue. However, this has a very good chance of becoming their microchannel. Yes, people are saying that USB C is the future, and that it’s only a matter of time before it’s adopted by everyone, but I disagree. The same was said about FireWire and about Thunderbolt. Neither standard was widely adopted, because it proved easier to just make USB faster than to mess with yet another port that will not work anywhere else. There’s a reason why it’s so difficult for the Anglo-Saxon countries to migrate from Imperial units to SI. Once everybody uses a certain standard, the fact that it is universally intelligible is much more important than its elegance.

[Image] Recognize those ports? Yeah, me neither.

Yes, we once used the 5.25” and 3.5” floppy drives, and we no longer do. We once used CD and DVD drives, and we no longer do. We once used Centronics and RS-232 ports for printers and mice. We once used MFM, RLL, ESDI and SCSI hard disk controllers. We once used the ISA system bus and the AGP graphics slot. What used to be a standard no longer is. However, there are standards that are genuinely different, such as the UTP Ethernet connector, the USB connector, the headphone jack, or the Schuko power socket. USB, Ethernet, PDF, JPEG and HTML are some of the universal standards that make it possible for a person to own a Mac, because you can plug it into the same peripherals as any other computer. They make the operating-system differences unimportant, because you can exchange files, use the same keyboard and mouse, use the same printer, and plug into the same network. By removing those standard connections and ways of exchanging data with the rest of the world, a Mac becomes an isolated device, a useless curiosity, like the very old computers you can’t really use today because you can no longer connect them to anything. Imagine what would have happened if Apple had removed USB when they first introduced FireWire or Thunderbolt: “this new port is the future, you no longer need that old one.” Yeah. Do you think the Ethernet port is used because it’s elegant? It’s crap. The plastic latch is prone to failure or breakage, the connection isn’t always solid, dust can get in and create problems – it’s basically crap. You know why everybody still uses it? Because everybody uses it.