Linux: what it intended, and what it did

There’s been lots of talk about the recent development where the SJW cult apparently took over the Linux kernel development team, forcing Linus Torvalds to sign some LBGTASDFGHJKL manifesto, where meritocracy is decried as a great evil, equality of outcome is praised and white heterosexual men need to be removed in order for the world to be awesome.

To this, my answer is that communism, as usual, is eating its children, and this is nothing new. Linux was originally a communist project and a leftist cesspool, and since the SJW faction already took over the modern communist movement elsewhere, it would not have been realistic to expect Linux to remain separate from this trend.

To that, I got a reply that Linux did do some good things and is not a failure: it powers the server side and most of the mobile platforms, and there are great companies making money with Linux and supporting its development. In response, I wrote the answer I’m quoting below:

Yes, there are companies that made a huge fortune using Linux – mostly those that just sell their services implemented on top of Linux, like Google with Android, but also some involved with Linux itself. If you look at it this way, Linux created both jobs and money. However, there’s an alternative perspective: Linux, by being good enough and free, destroyed the competition. SCO, Solaris, AIX and HP-UX went the way of the Dodo. All the people working on those were presumably fired, and because the competition is Linux, there were no alternative paying jobs waiting for them. Android destroyed the possibility of anyone developing a commercially sold OS for a mobile platform, other than Apple, whose position seems to be safe for now. If Android competed fairly and the cost of development was actually charged to the customer, instead of being absorbed by Google and the open source community with the goal of turning the devices into data-gathering and ad-delivery platforms, competition could actually enter the marketplace and interesting things could happen; but this way, the only market pressure is on Apple, the only player who actually plays fairly by charging money for things that cost money.
When Linux geekboys spout their hate fountains towards Microsoft and Bill Gates, and I’ve been watching that for actual decades, their complaint is that Windows costs money and that its users are stupid because Windows is easy to use. The argument against Apple today is the same recycled thing: the devices are expensive, so the buyers are idiots and the company is greedy, and the devices are simple to use, so the users must be idiots. This looks like all the bad shades of jealousy, hatred, spite and malice blended into a very nasty combination of mortal sins; essentially, they want to destroy companies that are financially successful by sacrificing their own time and effort to provide a decent but completely free product that pushes the commercial products out of the market, because they hate that someone is rich and feel that something needs to be done about it.
Basically, Linux is a cancer that destroys potentially profitable things by competing unfairly on the market, because it pays its developers in ego trips, hatred and envy instead of money, and its goal is essentially to make everything it touches inherently unprofitable. True, some managed to profit off of that, like Google, which used a modified Linux to power its ad-delivery platform as well as its server farms, but that was done by taking power away from the customer (you’re not really the customer if you’re getting a heavily subsidised product) and by turning the former customers into a product that is sold to the real customers: those who buy ads.
So, essentially, what Linux did was provide leverage that manages to pump wealth away from the software developers and into the pockets of ad sellers, making the customers less influential and less empowered in the process.
Also, what needs to be looked into is how much of the cloud computing boom is due to Linux, because it’s easy to have a supercluster if your OS is free; try paying Oracle per CPU for a Google or Facebook farm and you’ll get a head-spinning number that would probably make the entire thing financially unfeasible. This way, it’s another lever for centralising power over the Internet and over the end users, replacing the distributed nature of the Internet itself with large corporations that, essentially, are the Internet for most people, and which, of course, are now starting to assert political and societal influence and to control what people are allowed to think and say.
And in the meantime, the Linux crowd still hates Microsoft and dreams of a world where they’ll finally show it to Bill Gates who dared to charge money for Windows.

My desktop computer

Since I already started talking about computers, I’ll tell you what I’m using.

This is my desktop PC:

I built it myself, as I always do; I optimized it for silence first and power second. Silence-wise, it’s built in a Fractal Define C case with a Seasonic FX 850 Gold PSU in hybrid mode (which means the fan is off until it is really needed), and there’s a huge CoolerMaster 612 v2 CPU cooler, massive enough that the fan doesn’t really need to spin fast unless I’m pushing it. The GPU is an Asus ROG Strix 1080ti, which is silence-optimized, so the fans don’t spin at all in normal use, and even under full load all you hear is a whisper.

The CPU is an i7-6700K with 32GB RAM, SSD drives and a HDD. In normal use, the HDD’s whisper is all I hear; the fans are tuned to work below the audible threshold. Under full load, the fans are set up to get rid of heat as quickly as possible, silence be damned, and the top of the case is a dust filter, so hot air can rise up via convection; since this is an effective method, the fans never really get that loud.

This is my desk. The monitor is an LG 43UD79-B, a 108cm 4K IPS unit, which is the reason why I had to upgrade the GPU; Lightroom was rendering previews very slowly at this resolution, and since that operation is GPU-driven, I got the overkill GPU, and once I did that, I said what the hell and got a Logitech steering wheel so I can use the machine as a racing sim. The keyboard is a Roccat Suora FX mechanical RGB, the mouse is a Logitech G602. The microphone is a Rode NT USB unit, which I use for Skype. You can see the 15″ MacBook Pro on the left, and misc gadgets and remotes on the right.

The machine runs Windows 10 as the host, and several virtual machines with different configurations; the main one is Ubuntu MATE Trusty, which I use for writing scripts and all the Unix work. The main reason why I got such a big monitor is so that I can always keep one eye on the work-related chat on the right while I do other things on the left. Also, I like the way my photos look on a really big screen, which approximates a print size of about a meter in diagonal. The entire rig is hooked to a UPS, so I don’t have to worry about losing work due to power outages or spikes, which, fortunately, happen only once or twice a year on average.

Essentially, this is a rig that “just works”, and it’s where I spend most of the day.

The era of a super-desktop PC

I read something interesting in a computer magazine, I don’t know exactly when, the late 1980s or early 1990s perhaps: that the concept of a “home computer” was going to become obsolete, not because there wouldn’t be any home computers, but because there would be too many for the term to make any sense – like, which one, the one in the microwave, in the TV, in the HVAC thermostat, in the networking router? And it actually went further, so now we have not only computerized appliances, but also computers in many shapes and user-interface paradigms: voice-controlled watches, phones, tablets, tablet-laptop hybrids, laptops, all-in-one desktops and conventional desktops, gaming consoles, and also the super-desktops, also known as either workstations or gaming PCs.

The super-desktop is an interesting category, because it’s usually called just the “PC”, the same as an ordinary unit found in businesses, the Word/Excel machine, but it’s a wholly different beast, of the kind that was known in the past as either a supercomputer or a desktop minicomputer, also called a graphical workstation. You see, when something can drive several TV-sized 4K displays, run multiple virtual machines at once with no lag, render movies, or process terabytes of other kinds of data, it’s no longer in the same category of things as a machine that is nominally the same shape and runs the same OS, but is weaker than one of its virtual machines.

So, what is a super-desktop, or a “gaming PC”, as they are euphemistically called? What is a machine that can drive an Oculus Rift VR system? The most honest description is that it is an alternative reality creation device. It creates simulated universes you can interact with and join. If you run a car racing simulation and you wear Oculus VR goggles, and especially if you have one of those seats that re-create mechanical shocks, you are essentially joining an alternate reality where you participate in a very convincing and physical activity, much more so than a dream, for instance.

So, what is the main difference between this and an ordinary computer that can play immersive games? Only quantity, but the thing is, if you increase quantity far enough, it becomes a quality of its own. If you increase the mass of an asteroid enough, it becomes a planet. If you increase the mass of a planet enough, it becomes a star. If you increase the mass of a star enough, it becomes a black hole. It’s the same thing as with the human brain – add more neurons and suddenly completely new phenomena start taking place. Have only a few, you have a worm. Add more, you have a fish. Add more, you have a frog. Add more, you have a lizard. Add more, you have a rat. Add more, you have a monkey. Add more, and you get a man, and suddenly it’s no longer just the mass-equivalent of many worm ganglia put together, it’s the phenomenon that can land robots on Mars, fly cameras past Pluto, observe the beginnings of the Universe, break matter in ways in which only supernovae do, and even know God.

A super-desktop computer is not just a PC, and a PC is not just a glorified Commodore 64. It’s a machine of such power that it can add another dimension to human experience. It can immerse you in a realistic alternate reality where you drive supercars on race tracks, fly fighter jets, or fight dragons. It can literally provide you with dynamically generated, interactive sensory input, which is a definition of an alternative reality. But there is a danger to that. Alternative reality is another name for illusion, and having such powerful illusion-creating devices at your disposal can allow you to add another layer of indirection between your consciousness and reality.

If it allows you to escape from issues that you are supposed to face and solve, it can also allow you to waste your life. There’s only one other tool at our disposal that can do that, and it’s called drugs. Drugs can allow you to escape real issues and bury yourself in a world where there is reward without the necessity of achievement. Powerful computers can become a drug-equivalent, a wish-fulfillment tool which removes the necessity of achievement from the equation. Like all powerful tools, they can really fuck your life up. And, like all powerful tools, they can allow you to do more and better things.

About computer security

Regarding this latest ransomware attack, I’ve seen various responses online. Here are my thoughts.

First, the origin of the problem is the NSA-discovered vulnerability in Windows, apparently in versions ranging from XP to 10, which is weird in itself considering the differences introduced first in Vista, and then in 8. This makes it unlikely that Microsoft didn’t know about it; it looks like something that was deliberately left open, as a standard back door for the NSA. Either that, or it means that they managed not to find a glaring vulnerability since 2001, which makes them incompetent. Bearing in mind that other platforms have had similar issues, it wouldn’t be unheard of, but I will make my skepticism obvious – long-undiscovered glaring flaws indicate either intent or incredible levels of negligence.

The immediate manifestation of the problem, the WannaCry ransomware worm, is a sophisticated product of the most dangerous kind, the one that apparently doesn’t require you to click on stupid shit in order to be infected. The malware probes your IP address, detects vulnerabilities and, if it finds any, executes code on your machine. The requirement for you to be infected is a poorly configured firewall, or an infected machine behind your firewall, combined with the existence of vulnerable systems. The malware encrypts the victim’s files, sends the decryption key to the hackers, deletes it from the local machine and posts a ransom notice demanding bitcoin payment on the afflicted machine.

It is my opinion that the obvious explanation (of it being a money-motivated hacker attack) is implausible. The reason for this is the low probability of actually collecting any money, combined with the type of attack. A more probable explanation is that this is a test by a nation-state actor, checking out the NSA exploit that had been leaked by the Shadow Brokers. The most likely purpose of such a test is forcing the vulnerable machines out into the open so that they can be patched and the vulnerability permanently removed, or, alternatively, assessing the impact and response in case of a real attack. It is also a good way of permanently removing the NSA-installed malware from circulation, by permanently disabling the vulnerable machines: encrypting their filesystem forces a hard-drive format. Essentially, it sterilizes the world against all NSA-installed malware using this exploit, and it is much more effective than trying to advertise patches and antivirus software, since the people who are vulnerable are basically too lazy to upgrade from Windows XP, let alone install patches.
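
The post above doesn’t name the exact network mechanism, so treat the specifics here as my addition: based on the public reporting of WannaCry, the worm spreads over the SMB/NetBIOS ports, and a “properly configured firewall” in this context mostly means not exposing those ports to untrusted networks. A minimal sketch, assuming a Linux router or host with a standard iptables setup:

    # Drop unsolicited inbound SMB/NetBIOS traffic from untrusted networks.
    # The port numbers are an assumption based on public WannaCry reporting,
    # not something stated in the post itself.
    iptables -A INPUT -p tcp -m multiport --dports 139,445 -j DROP
    iptables -A INPUT -p udp -m multiport --dports 137,138 -j DROP
    # On a Windows machine behind the router, the equivalent is keeping file
    # and printer sharing disabled on any network you don't trust.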

As for the future, an obvious conclusion would be that this is not the only vulnerability in existence, and that our systems remain vulnerable to other, undiscovered attack vectors. What are the solutions? Some recommend installing Linux or buying a Mac, forgetting the Heartbleed bug in OpenSSL, which was as bad, if not worse. All Linux and Mac machines were vulnerable. Considering how long it took Apple to do anything about it, and how long the bug remained undetected, I remain skeptical regarding the security of either platform. They are less common than Windows, which makes them a less tempting target, but bearing in mind that this is the exact reason why potential targets of state-actor surveillance would use them, it actually makes them more of a target – not for individual hackers, but for potentially much more dangerous people. Because hacker attacks on Linux and Mac OS are not taken seriously, the protective measures are usually weak and rely on the assumed inherent security of the UNIX-based operating systems. When reality doesn’t match the assumptions, as in the case of the Heartbleed bug, there are usually no additional layers of protection to catch the exceptions. Furthermore, one cannot exclude a low-level vulnerability installed in a device’s firmware, since firmware is proprietary and even less open to inspection than the operating systems themselves.

My recommendation, therefore, would be to assume that your system is at any point vulnerable to unauthorized access by state actors, regardless of your device type or protective measures. Against non-state actors, it is useful to implement a layered defense: a hardware firewall on the router and a software firewall on the device; limit the amount of things shared on the network to a minimum; close all open ports except those that you actively need, and protect those as if they were a commercial payment system; for instance, don’t allow password authentication on SSH, and instead use RSA keys. Use encryption on all network communications. Always use the newest OS version with all the updates installed. Use an antivirus to check everything that arrives on your computer, but assume that the antivirus won’t catch zero-day exploits, which are the really dangerous stuff.

Don’t click on stupid shit, and don’t visit sites with hacking or porn-related content unless you’re doing it from a specially protected device or a virtual machine. Have a Linux virtual machine as a sandbox for testing potentially harmful stuff, so that it can’t damage your main device. Don’t do stupid shit from a device that’s connected to your private network, so that an attack can’t spread to the other connected devices. Don’t assume you’re safe because you use an obscure operating system: obscure operating systems can use very widespread components, such as OpenSSL, and if those are vulnerable, your obscurity is worth far less than you assume.

However, a combination of several layers might be a sufficient shield. For instance, if your router shields you from one attack vector, the firewall and antivirus on your Windows host machine shield you from another (for instance the UNIX-related exploits), the Linux architecture of your virtual machine shields you from the rest (the Windows-related exploits), and your common sense does the rest, you are highly unlikely to be a victim of a conventional hacker attack. But don’t delude yourself: state actors, especially the NSA, have access to your system on a far deeper level, and you must assume that any system that is connected to the network is vulnerable. If you want a really secure machine, get a generic laptop, install Linux on it from a CD, never connect it to the network and store everything important on an encrypted memory card. However, the more security measures you employ, the more attention your security is likely to receive, since where such measures are employed, there must be something worth looking at. Eventually, if you really do stupid shit, you will be vulnerable to the rubber-hose method of cryptanalysis, which works every time. If you don’t believe me, ask the guys in Guantanamo.
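
To make the SSH part of that advice concrete, here is a minimal sketch of key-only authentication with OpenSSH. The directives are standard OpenSSH options; the file path and the service name assume a Debian/Ubuntu-style system, so adjust them for your distribution.

    # On the server, in /etc/ssh/sshd_config: refuse passwords, allow keys only.
    #   PasswordAuthentication no
    #   ChallengeResponseAuthentication no
    #   PubkeyAuthentication yes
    #   PermitRootLogin no
    # On the client, generate an RSA key pair and install the public key:
    ssh-keygen -t rsa -b 4096
    ssh-copy-id user@server
    # Reload the SSH daemon so the new settings take effect:
    sudo systemctl reload ssh

The same logic applies to any other service you leave reachable from the outside: no password-only authentication on anything exposed to the network.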

Linux failed because capitalism rules

Let me tell you why I have been gradually migrating away from Linux on all the machines in my household, from the point where everything ran on Ubuntu Jaunty, to the point where only the HTPC (the media player in the living room) runs Ubuntu MATE Trusty, and everything else runs either Windows 10 or Mac OS.

A year ago I bought my younger kid a new PC, because his old Thinkpad T43 was behaving unreliably. Since he didn’t move the laptop from his desk anyway, I decided to get him a desktop: a Bay Trail (J1900) motherboard with the integrated CPU. I love those CPUs, BTW. They are strong enough to run all the normal tasks one would require from a computer, such as web browsing, playing all the video formats, light gaming and editing documents; they are cheap; they use very little electricity; and the motherboards themselves come in the tiny mini-ITX format.

It’s efficient enough to run with passive cooling, although that didn’t work so well in Universe Sandbox, so I mounted a big, silent case fan in front of the CPU to keep the temperatures down. Basically, this looks like an ideal general-purpose home computer, and it is exactly what a huge number of people are getting their kids for doing homework. A huge number of cheap laptops also run Bay Trail CPUs, so the installed base is vast. And to keep the cost down, one would expect a large portion of users to put Linux on these machines, since all the non-specific applications such a machine would be expected to run work well on Linux.

Unfortunately, Intel fubared something with the CPU design; specifically, they seem to have messed up the power-state regulation, so when the CPU changes its power state, there’s a high probability of hanging. Sure enough, a microcode update was issued and quickly implemented in Windows 10. On Linux, a bug report was posted in 2015. This is what happened:

This FreeDesktop.org bug report was initially opened in January of 2015 about “full system freezes” and the original poster bisected it down to a bad commit within the i915 ValleyView code. There were more than 100 comments to this bug report without much action by Intel’s Linux graphics developers when finally in December they realized it might not be a bug in the Intel i915 DRM driver but rather a behavior change in the GPU driver that made a CPU cstates issue more pressing. The known workaround that came up in the year of bug reports is that booting a modern Linux kernel with intel_idle.max_cstate=1 will fix the system freezes. However, using that option will also cause your system’s power use to go up due to reduced power efficiency of the CPU.

In December, when shifting the blame to the other part of the kernel, this Kernel.org bug report was opened and in the few months since has received more than 120 comments about the same issue occurring on many different Bay Trail systems.

As of right now and even with the many complaints about this bug on a multitude of systems and Linux 4.5 set to be released this weekend, this bug hasn’t been properly resolved yet.
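
For reference, applying the intel_idle.max_cstate=1 workaround mentioned in the quote comes down to adding that parameter to the kernel command line. A minimal sketch, assuming GRUB2 on a Debian/Ubuntu-style install; as the quoted article notes, this trades the freezes for higher power consumption:

    # Edit /etc/default/grub and append the parameter, for example:
    #   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash intel_idle.max_cstate=1"
    # Then regenerate the GRUB configuration and reboot:
    sudo update-grub
    sudo reboot
    # After rebooting, confirm that the parameter is active:
    cat /proc/cmdline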

The article quoted above was written in March 2016. It’s now May 2017, and the issue still hasn’t been resolved. Essentially, the problem with Linux is that the kernel development team apparently doesn’t have anyone competent and motivated enough to deal with this kind of problem. It’s unclear whether they are simply unable to fix it, or whether they just don’t care about anything anymore, because there’s no ego trip in it to motivate them. Let me show you what I’m talking about. There’s a huge thread where the users reported the bug and tried to figure out solutions. One of the responses, which looks very much like it came from a Linux developer, was this:

Well done on turning this into a forum thread. I wouldn’t touch this bug with a 10-foot pole and I’m sure the Intel developers feel the same.

Essentially, TL;DR. It was too long for him to read, because brainpower.

Another thing became apparent to me: they all live in an echo chamber where Linux is the best thing ever and it’s the only option. Linux is the most stable OS, it’s the best OS, it’s the greatest thing ever. Except it crashes on probably a third of all modern computers deployed, while Windows, which they treat with incredible contempt, works perfectly on those same machines. Let me make this very clear: I solved the Linux kernel problem with the Bay Trail CPUs by first trying all the recommended patches for Linux (they all failed), then installing a BIOS update (which didn’t help), and finally installing Windows 10 on the machine, which permanently solved the problem. Not only that, it made the machine run perceptibly faster, it boots more quickly, and it is stable as a rock – not a single hang in a year.

That’s why I gradually migrated from Linux to Windows and Mac. They are just better. They are faster, more stable, and cause me zero problems. The only places where I still run Linux are the HTPC and a virtual machine on my desktop PC. Linux is so fucked up, it’s just incredible. It looks like you can only go so far on enthusiasm alone, without motivating developers with money. After a while, they stop caring and find something more rewarding to do, and that’s the point where Linux is right now. The parts that are maintained by people who are motivated by money work. Other parts, not so much. As a result, my estimate is that the stability of desktop Linux at this time is worse than that of Windows 98.

It’s so bad that I don’t recommend it to anyone anymore, because it’s not just this one problem, it’s the whole atmosphere surrounding it. Nobody is even trying anymore; it’s a stale product that joined the army of the living dead. Since I used Linux as my daily driver for years, this pisses me off, but there’s nothing I can do about it but hope that Apple will make Mac OS support a wider range of hardware and make it available as a commercial product one can install on anything, like Windows. That would make desktop Linux completely obsolete, which would be no more than it deserves, because its quality reveals its communist origins: it’s made like shit. It’s a Trabant, a Wartburg, a Yugo. Conceived on an ego trip, and built by people who can’t be bothered with work. It’s proof that you can’t build a good thing on hatred of those evil capitalists. In order to get people to make really great things, you need a free market that rewards the winners with money. Huge, superabundant amounts of money. Bill Gates and Steve Jobs kinds of money.

Oh yes, I almost forgot: a conclusion to my project of installing Linux on an old Mac laptop. I gave the laptop to my kid. Within a month it became so unstable, with so many different things breaking all at once (dozens of packages reporting errors, mostly revolving around Python modules of one kind or another, and apt reporting mass breakage of packages), that I gave up, backed up his data, formatted the drive and installed Mac OS Sierra on the machine. It’s slower than it should be because the machine lacks RAM (and I can’t add more because it’s soldered on), but everything works. Linux is so unreliable at the moment, it’s useless on the desktop.