On the positive side…

There are many nice things I found upon my return to the world of photographic gear, too. At some points I was genuinely admiring the newly produced gear, such as the FE 100-400mm f/4.5-5.6 GM, and many of the GM lenses in general, which draw light in such amazingly beautiful ways that in the past I saw this only in the absolute highest-end optics. If circumstances allow, I will probably end up getting some of those for myself, because they leave me wondering what I could do with something this amazing. I was also seriously impressed with the Sony A1II and A7RV cameras, and will probably get the latter for myself if the finances align. It has a viewfinder that solves the lower resolution and lower refresh rate issues of my A7II, its computer is much faster, the autofocus is brilliant, and the resolution and dynamic range are much improved.

So, the thing is, I don’t actually think that gear doesn’t matter, nor am I unable to perceive the advantages of better gear. I also think my gear is quite good, including the lenses that others would summarily dismiss as dog shit unworthy of being coupled with a decent camera, and some cameras that would likewise be dismissed as amateurish toys.

5d with a “shit” lens

You see, I believe in a thing I call “minimal technical requirements”. Every task has those, and there is usually quite a difference between what they are and what people think they are. Since I did quite a bit of testing, I discovered that certain things matter far less than one would expect, while others matter far more. For instance, a camera sensor’s ability to render colour is of supreme importance. If it can’t render colour well, I will instantly dismiss it as unsuitable for my uses, and I have actually done that multiple times, with all kinds of cameras that don’t get written about because they don’t deserve to be mentioned. Those are usually found in phones and compact cameras with small sensors, and what they render is inherently so bad, and so electronically processed in an attempt to “improve” it, that I find the result instantly repulsive.

On the other hand, sensors in some small cameras, such as the Olympus E-PL1 and Sony R1, met the minimal technical requirements for producing large prints of great colour and detail, provided you use them within certain parameters. If I used those cameras to take pictures, it wasn’t because I thought they were poor tools unsuited for the task and I wanted to make a statement about using shit tools to produce good results. No, it was because I thought they were genuinely good tools.

Sony R1. Obviously not sharp in the corners. 🙂

Are there better ones? Sure. However, that’s the thing about the minimal technical requirements. At some point, if the equipment is good enough, nobody will be able to tell what camera or lens made that B2 print at the exhibition. They will see the motif, colours and detail, the picture will speak for itself without technical flaws detracting from its beauty, and that’s all that matters. Better equipment won’t improve anything if the lesser equipment was able to meet the minimal technical requirements – and be assured that my technical requirements are quite strict. They are, however, reasonable, and grounded in real use scenarios. I have only once made a print larger than the B2 format. B2 is, for the most part, ideal for viewing from normal distances, whether at an exhibition or at home. Anything bigger forces you to increase the viewing distance, and that doesn’t actually help the experience. The second viewing scenario is the 4K monitor, and that’s the most realistic one today.
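To put a rough number on “good enough for B2”, here’s a small back-of-the-envelope sketch. It assumes B2 is 500 × 707 mm and uses the common rule of thumb that normal visual acuity resolves about one arcminute, which works out to roughly 3438 divided by the viewing distance in inches as the required pixels per inch; both figures are assumptions of the sketch, not anything measured on actual prints.

```python
# Back-of-the-envelope: how many megapixels does a B2 print actually need?
# Assumptions: B2 = 500 x 707 mm, and the usual ~1 arcminute acuity rule,
# i.e. required PPI ~ 3438 / viewing distance in inches.

MM_PER_INCH = 25.4
B2_MM = (500, 707)  # short and long side of a B2 sheet

def required_ppi(viewing_distance_cm: float) -> float:
    """PPI at which an eye with ~1 arcminute acuity stops resolving pixels."""
    return 3438 / (viewing_distance_cm / 2.54)

def megapixels_for_print(size_mm, ppi: float) -> float:
    """Megapixels needed to fill the given print size at the given PPI."""
    w_px = size_mm[0] / MM_PER_INCH * ppi
    h_px = size_mm[1] / MM_PER_INCH * ppi
    return w_px * h_px / 1e6

for distance_cm in (60, 100, 150):
    ppi = required_ppi(distance_cm)
    mp = megapixels_for_print(B2_MM, ppi)
    print(f"{distance_cm:>3} cm viewing distance: ~{ppi:.0f} PPI, ~{mp:.1f} MP")

# Prints roughly:
#  60 cm viewing distance: ~146 PPI, ~11.6 MP
# 100 cm viewing distance: ~87 PPI, ~4.2 MP
# 150 cm viewing distance: ~58 PPI, ~1.9 MP
```

By that estimate, a 12-13 MP file already covers B2 at the closest distance anyone reasonably views an exhibition print from, which is exactly what I mean by minimal technical requirements: past that point, extra resolution buys nothing the viewer can actually see.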

not sharp wide open at 35mm

But there’s a catch: image quality isn’t everything. Minimal technical requirements of image quality are only a part of it. If a camera is so difficult to use that you feel it’s struggling against you, it’s simply not a good tool. One can use such a tool regardless, but I eventually end up replacing it with a superior one as soon as possible. It’s just that my opinion of what tool is comfortable and good enough, and some forum’s opinion, might differ greatly.

For instance, some people will treat image quality as the greatest priority, and will buy the lens that makes the best possible images. I, on the other hand, like image quality very much, but if a lens is so heavy that all my pictures end up being taken with the iPhone, because I left the heavy thing at home rather than haul it on a ten kilometre uphill walk, then what exactly is the point in having that thing? Using it for special occasions that never happen? That’s why I don’t have a special occasions watch: I see it as wasteful and pointless. I have a good everyday watch I use for everything, from washing the car and mowing the lawn to dressing up for some occasion. Fuck special occasions. I don’t want a camera or a lens that’s a jewellery piece impractical for daily use, which is why I never buy those “universal” 24-70mm f/2.8 lenses that are huge unwieldy bricks, and also insanely expensive for what they are. I also think that “universal” high performance things are a waste of money, and specialized, more practical lenses are the way to go. For instance, I don’t need a wide aperture on a lens that will be used for landscape photography from f/8 to f/22. I can save money there by getting good, light and inexpensive glass for such uses, which also makes my kit light enough to actually use. However, if I’m impressed by some camera or lens and I think it will actually improve my photography, I will eventually end up buying it.

My considerations are practical rather than ideological; for instance, when digital cameras were either too expensive or horse shit, I shot film and produced digital files by scanning. When digital cameras became good and affordable enough, I switched to digital. I have no brand loyalty whatsoever – I use whatever suits my needs. I have used Minolta, Fuji, Olympus, Canon and Sony. Currently, it is my opinion that both Canon and Sony are excellent, and I would have no qualms with either. When something is convincingly better than what I’m using, I’ll switch to it in a heartbeat, but I won’t switch if the differences are minuscule or unproven. Basically, my gear choices are defined by how much money I have and what kind of work I intend to do.

I also don’t feel a need to appear “professional”, because I’m not. Professionals produce work for others. I see photography as my personal art form, together with writing, and I wouldn’t describe myself as an amateur either. If I had to describe what I do, I’d say it’s mental/emotional state photography, using mostly nature, in a high fidelity colour medium. If I say something is good enough, it doesn’t mean it’s good enough for an undemanding casual user who doesn’t know any better. It means it’s actually good enough that I would be unable to get discernibly better results with any kind of gear.

Photographic frustrations

While we’re at photography, I have to mention that I’m hugely annoyed by the fact that everywhere I look, on the forums or on YouTube, people are exaggerating things into hysteria. By that I mean the extreme and opposite “cults”: on one side you have those who think they need the most technically sophisticated equipment in order to make anything of value, and on the other you have the “lo-fi” groups, such as lomography, who intentionally screw things up as much as possible technically, and the people in those groups all support each other in the most extreme nonsense.

The truth, of course, is that both sides kind of have a point. On one hand, equipment is important, and I often found myself just staring in awe at the beautiful renderings from a high-end lens or camera that manages to get parts of the image completely crisp, only to seamlessly flow into toffee-sparkles of blur. On the other hand, photography is much more than a formulaic thing where you get the best hardware, apply the correct technical procedure, get everything sharp from corner to corner, and out comes the perfect photograph. If I had to describe my personal attitude, I’d say that for someone who sees photography primarily as a way to capture his own thoughts and feelings, and not the things in front of the lens, I’m very technical about it. 🙂 So, let me make a small exhibition of photos that combine things that would make people on the dpreview forums have a fit.

Equipment: Canon 5d, EF 35-70mm f/3.5-4.5. That’s a lens almost never seen outside of lo-fi circles, because it’s one of the first EF lenses ever made, dating from 1987, when it was sold as the kit zoom for the EOS 650 film camera, the first in the EOS lineup. It is rated so lowly that it’s not even considered worth testing and rating at all, and putting it on the 5d would be seen as a ridiculous “lomography” move. Let’s see some more pictures I’ve taken with this combo:

The macro shots were taken using extension tubes. Nothing fancy, just the cheapest stuff from eBay. The results, however, are very much not lo-fi. In fact, I could make prints from the original raw files as big as anything one could realistically print from the 13MP 5d sensor. B2, no problem. B1, possibly, though I’d have to massage them somewhat; in any case, this is all material that can go to 70-100 cm on the longer side. Mind you, I’m more interested in colour than in resolution and sharpness, but there’s plenty of both. Let’s see the next heretical combo: the Olympus E-PL1, a micro 4/3 mirrorless pocket camera, with its plasticky 14-42mm kit zoom that would be universally poorly rated:

How about using the Sony A7II with the FE 28-70mm f/3.5-5.6 kit lens, which is always trashed in the reviews as something you should immediately remove from your camera if you want the pictures to be any good:

Those pictures weren’t taken with said equipment because I wanted them to look like shit, or because I didn’t know any better. The files are all B1-print sharp. There’s a saying: “if it’s stupid but it works, it’s not stupid”. In this case, if “inferior” equipment creates results that get a green light from me regarding technical quality, maybe it’s not inferior. Maybe, just maybe, you’re just holding it wrong, to paraphrase Steve Jobs. 🙂 Or maybe people tend to lose perspective when they compare gear. For instance, if a lens renders closeups with glowy spherical aberration and ethereal softness, that’s only an “optical defect” if you’re trying to use it where those effects detract from the image. Likewise, if it’s “only” tack sharp from f/8 to f/16, and you use it for landscape photography, what’s the problem? And colours are either ignored or hard to test, but if a lens renders beautiful, crystal-clear and perfectly neutral colours, should that somehow matter less than resolution in conditions you don’t intend to use it in?

I had the misfortune of being forced to produce results in life using whatever was available, and working in conditions that would be immediately dismissed as unfit for anything, and this is not just about photography anymore. If you don’t have a hammer, use a rock. If you don’t have perfect conditions, learn how to turn imperfect ones to your advantage. For instance, I learned to meditate in conditions so terrible that I could later resist all kinds of interference. If everything tries to kill you and fails, you become indestructible. I was always annoyed by people who keep whining about their tools and conditions – they can’t do anything spiritually because they don’t have a perfect guru and don’t know the perfect technique of yoga. In reality, that usually means they are more interested in finding imaginary flaws to justify their inaction and inertia than in figuring out a way to avoid the obstacles and make things work anyway.

I had an experience at the University in early 1992 that changed my perspective on excuses forever. You see, one of the professors had a rule that you couldn’t be absent from more than five lectures in a year, or he wouldn’t allow you to take the exams, basically failing you by default. Before one lecture, a girl approached him and gave him a letter of medical excuse for her absence. He said: “Young lady, you misunderstood me. I do not care whether you were absent with or without a legitimate excuse. If you were absent from more than five lectures, you simply cannot have sufficient knowledge to take the exam. Therefore, the reason for the absence doesn’t matter in the slightest.” This clicked incredibly hard: nobody cares about your excuses for failure. You just have to find ways to succeed, because there’s no other way to avoid disaster. It’s basically like climbing a cliff; you have to find a way to do it perfectly and avoid falling, because if you fall, nobody’s going to give two shits that the cliff was slippery or the rocks were crumbly. If you failed for “valid reasons”, you failed and you’re fucked regardless. So get your shit together and figure out a way to make things work and attain success. That’s probably the reason why the whiny “demanding” people annoy me. They think excuses matter.


Photography

I have recently been getting into photography again after quite a large hiatus, mostly in order not to go crazy from following global politics and to give myself a reason to go out even in bad weather and leave the damn Internet behind.

I feel similar to what a bear must feel after waking from hibernation: oh, there’s new stuff around, and some old stuff is gone or changed, but everything is more or less the same. There are great new cameras and lenses around, and the stuff that was considered great when I last followed it all is now considered mediocre and obsolete. The gear nerds are still having an anxiety crisis over which $2500 lens is sharp enough wide open to match the 50MP sensor on their new and shiny “professional” $5000 camera, because God forbid something not be sharp enough in the corners; after all, that’s the only objective measure of photography that counts on the online forums, where everything needs to be objectively measured in order to get a pay-to-win situation. If one actually had to look at the pictures themselves, one might get the heretical thought that twenty years ago cameras and lenses were already perfectly good enough to produce beautiful pictures that could be printed as large as a normal wall would take.

It’s not all bad – that nonsense allowed me to buy some excellent used lenses very cheaply, because people who absolutely had to have the best and newest stuff are dumping yesteryear’s bright and shiny gear for a pittance, and I just scooped it all up with a Muttley snicker.

Something interesting apparently happened while I was absent; it started while I was still very much in the photo gear thing, but it developed further over the years. You see, Canon had a nasty habit of merely warming up their lenses and sensors: the next generation would be packed in a more modern and fancy case, without removing any of the optical flaws of the previous one. The price, however, tended to grow steadily. Since people could do nothing about this, they merely complained; the professional market was split between Canon and Nikon, and they both did very similar things.

However, at some point Sony, the producer of the most advanced sensors on the market, decided to enter the high-end amateur market with their A7 series of full frame cameras, accompanied by Zeiss-branded lenses that were supposed to evoke money-spending emotions in retired dentists. Then something incredible happened: professional photographers figured out that Canon lenses could be easily adapted to the Sony bodies and worked just fine, which allowed them to get rid of Canon and their stupid bullshit. As they migrated to Sony en masse, they started asking for more professional features in the bodies and for better native lenses, and so Sony, unable to believe their luck, very quickly mobilized their immense resources and made some of the best glass in the world – starting from the Minolta G heritage, but quickly exceeding it with the modern designs in their “price no object” GM series. They also made the high-end A9 and A1 series cameras, basically mobile supercomputers with incredible processing speed, designed for sports and wildlife photography, but of course primarily targeting all sorts of wealthy geeks who want to “be professionals”.

Over a decade, Sony became the new number one, and Canon found themselves in an unenviable position where they had to get their shit together instantly or Sony would eat their lunch and put them out of business. So they took the mirrorless thing seriously, created a very good series of cameras and lenses in the RF range, and obsoleted the SLR range and EF lenses. This worked well enough that the high-end market is now almost evenly split between Sony and Canon, with Nikon a distant third.

That aside, I also found out that I have to retire the Canon 5d. After I used it for more than a decade, Biljana took it over and continued to use it for another decade, but a few days ago I took it to test her new macro lens and saw that it’s quite fucked. The AF was showing very bad back focus and erratic behaviour, the light meter overexposed every third shot by seven stops or something, the screen was so washed out one couldn’t make anything out on it, and the viewfinder has accumulated so much dust over 19 years of use that it’s now really bad. All in all, the thing refuses to die outright, but at this point that’s actually worse, because all the things that have half-failed accumulate to the point where the camera feels like it’s actively resisting your efforts with its decrepit nonsense. So, it’s being relegated to my camera museum, and Biljana got a Canon RP body, which apparently works great with EF lenses.

It’s actually funny how staying out of some specialised scene and re-joining it after many years gives you perspective.

Linux (in)security

This just came out:

Basically, a 9.9/10 severity is a nightmare. RCE means people can execute code on your machine remotely, and 9.9/10 probably means root permissions. This is as bad as it gets. Even worse, the security analyst reporting it says the developers were not interested in fixing it and instead spent time explaining why their code is great and he’s stupid, which is absolutely typical for Linux people.
Canonical and Red Hat confirm the vulnerability and its severity rating.
So, when Linux people tell you Linux is better than Windows and Mac, and that everybody should switch to it, just keep in mind that an open source project was just caught with its pants down, having had a 9.9/10 severity remote code execution bug FOR A DECADE without anyone noticing until now.

Edit: It turned out it’s not super terrible. The vulnerability is in CUPS, and the machine needs to be connected to the Internet without a firewall for the attack to work, which is not a normal condition. However, the CUPS code has more holes than Emmentaler cheese, and uninstalling cups-browsed is recommended.
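If you want to check your own machine, here’s a rough sketch of that idea; it’s a heuristic, not an official detection tool. It looks for the cups-browsed binary at a couple of typical install paths (an assumption about your distribution) and then tries to bind UDP port 631, the port cups-browsed listens on: if the bind fails with “address already in use”, something is listening there. Binding a port below 1024 needs root, so run it with sudo; a permission error is reported as inconclusive.

```python
# Rough local check related to the cups-browsed issue: is the daemon
# installed, and is something already listening on UDP 631?
# This is a heuristic sketch, not a vulnerability scanner; the paths and
# the "bind test" are assumptions about a typical Linux install.
import errno
import socket
from pathlib import Path

TYPICAL_PATHS = ["/usr/sbin/cups-browsed", "/usr/bin/cups-browsed"]

def binary_present() -> bool:
    """True if cups-browsed sits at one of the usual install locations."""
    return any(Path(p).exists() for p in TYPICAL_PATHS)

def udp_631_in_use() -> str:
    """'yes' if something already holds UDP 631, 'no' if we could bind it
    ourselves, 'unknown' if we lack the privileges to try (run as root)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.bind(("0.0.0.0", 631))
        return "no"
    except OSError as e:
        if e.errno == errno.EADDRINUSE:
            return "yes"
        if e.errno in (errno.EACCES, errno.EPERM):
            return "unknown"
        raise
    finally:
        s.close()

if __name__ == "__main__":
    print("cups-browsed binary found:", binary_present())
    print("something listening on UDP 631:", udp_631_in_use())
```

Uninstalling or disabling cups-browsed through your package manager remains the actual fix; this only tells you whether there’s anything to worry about in the first place.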

Old computers

I’ve been watching some YouTube videos about restoring old computers, because I’m trying to understand the motivation behind it.

Sure, nostalgia; people doing it usually have some childhood or youth memories associated with old computers, and restoring them and running ancient software probably brings back the memories. However, I’ve seen cases where an expert in restoring ancient hardware was asked to recover actual scientific data recorded on old 8” IBM floppy disks, stored in a data format readable only by an ancient computer that no longer exists. That was an actual problem, solved by connecting an old floppy drive to an old but still reasonably modern computer running a modern OS, and communicating with the drive at a low enough level to access the files and copy them to modern storage. Similarly, people have recovered data from the Apollo era by using a restored Apollo Guidance Computer to read old core memories and copy their contents to modern storage for historical purposes. Essentially, they recovered data from various museum pieces and established what was used for what purpose. They also brought various old but historically interesting computers, such as the Xerox Alto, back to working condition, so all their software could be demonstrated in a museum. So, there’s the “computer archaeology” aspect of it that I do understand, and that’s perfectly fine. However, it’s obvious that most old computers that are restored end up being used once or twice and then moved to some shelf, because they are not really useful for anything today.

The interesting part is that there are some very old machines that are being actively used today, and they do the job so well that there is no reason to replace them with new equipment: they obviously do what they were designed to do perfectly (for instance, supervising a power plant or running a missile silo), and since modern hardware doesn’t run the old software, you can’t just replace the computer with a new, faster model that you plug into the rest of the system. No; the interfaces are different now, everything is different. You can’t just plug a modern workstation PC in place of a PDP-11. You’d need to move all the data from tape drives and 8” floppies and old hard drives first. Then you’d have to replace the printers and somehow connect to the old peripherals, for instance the sensors and solenoids. And then you’d have to rewrite all the old software and make it so reliable that it never breaks or crashes. And the only benefit of all that would be more reliable hardware, because the stuff from the 1970s is 50 years old and breaks down.

It’s no wonder that the industry solved the problem by simply making a modern replacement computer with all the old interfaces: modern hardware running an emulation of the old computer, which runs all the ancient software perfectly, so that it keeps doing what it was designed to do but without old capacitors and diodes exploding. There are examples of this approach that made their way to consumer electronics. For instance, modern HP 50G or HP 12C calculators have an ARM CPU running an emulation of the obsolete proprietary HP Voyager and Saturn processors, running all the software written for the original obsolete platform, because rewriting all the mathematical stuff in C and building it for a modern microcontroller platform would be prohibitively expensive, since there’s no money in it. However, simply using modern hardware, writing an emulator for the old platform, and running all the legacy software works perfectly fine, and nobody really cares whether it’s “optimal” or not.
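To make that approach concrete, here’s a toy sketch of what such an emulator boils down to. The instruction set below is invented for illustration; it is not the HP Saturn, the Voyager hardware, or any real legacy processor, just the fetch-decode-execute loop that these host-side emulators share.

```python
# Toy sketch of the emulation approach described above: a host program
# interpreting a made-up legacy instruction set. This is NOT any real CPU,
# just the fetch-decode-execute skeleton that such emulators share.

LOAD, ADD, STORE, JNZ, HALT = range(5)  # invented opcodes

def run(program, memory):
    """Interpret (opcode, operand) pairs until HALT, then return memory."""
    acc = 0   # accumulator of the imaginary CPU
    pc = 0    # program counter
    while True:
        op, arg = program[pc]
        pc += 1
        if op == LOAD:      # acc <- memory[arg]
            acc = memory[arg]
        elif op == ADD:     # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == STORE:   # memory[arg] <- acc
            memory[arg] = acc
        elif op == JNZ:     # jump to arg if acc != 0
            if acc != 0:
                pc = arg
        elif op == HALT:
            return memory

# A "legacy program": add memory cells 0 and 1, store the result in cell 2.
program = [(LOAD, 0), (ADD, 1), (STORE, 2), (HALT, 0)]
print(run(program, {0: 2, 1: 3, 2: 0}))   # -> {0: 2, 1: 3, 2: 5}
```

A real product wraps a loop like this around the original ROM image, the timing quirks and the old I/O interfaces, but the principle is the same: the legacy software never notices it’s running on modern silicon.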
Now that I think about it, there must be tons of legacy hardware embedded in old airplanes and similar technological marvels of their time that are still in use today, and maintaining the aging electronics must be a nightmare that can’t be solved by merely replacing it with new stuff. In all such cases, emulating the old hardware on an ARM, or building a gate-accurate FPGA replica, and just connecting all the old stuff to it to buy time until the entire machine is retired, is quite a reasonable solution to the problem. There must be a whole hidden industry that makes good money by solving the problem of “just make a new and reliable computer for it and leave everything else as it is, because it works”.

So, I can imagine perfectly well why one would keep a PDP-10, VAX-11 or IBM 360 running today, if the conversion to a modern platform is cost-prohibitive. However, that got me thinking: what’s the oldest computer I could actually use today, for any purpose?

The answer is quite interesting. For instance, if I had a free serial terminal, a VT100 or something, and a workshop with a Raspberry Pi or some other Linux server, I could connect the ancient terminal to it to display logs and issue commands. It could just sit there working for that single purpose, and perhaps be more convenient than connecting to the Linux server with my modern laptop in a very filthy environment. However, I don’t really know what I would do with a much more modern machine, such as an original IBM PC or the first Macintosh. They are merely museum pieces today, and I can’t find any practical use for them. So, what’s the next usable generation? It would need connectivity to modern hardware so I could exchange data; for instance, I could use a very old laptop as a typewriter, as long as I could pull the text I wrote out of it and use it on a modern machine later on. Ideally, it would have network connectivity and be able to save data to a shared directory. Alternatively, it should have USB so I can save things to a thumb drive. Worst case, I would use a floppy disk, and I say worst case because the 3.5” 1.44MB ones were notoriously unreliable and I used to have all kinds of problems with them. It would have to be something really interesting for it to be worth the hassle, and I’d probably have to already own it in order to bother finding a use for it. For instance, an old Compaq, Toshiba or IBM laptop running DOS, where I would use character-graphics tools, exclusively for writing text.
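Back to the VT100-on-a-Pi idea for a moment: the log-display half of it is almost trivial. The sketch below assumes the pyserial package, a USB-to-RS232 adapter showing up as /dev/ttyUSB0, and a terminal set to 9600 baud, 8N1; those are all assumptions to adjust for your own hardware. For actually issuing commands, the usual route would be a serial login console (a getty on that port), not a script like this.

```python
# Minimal sketch of the "VT100 as a log display" idea: tail a log file and
# push new lines out over a serial port. Assumes the pyserial package
# (pip install pyserial), a USB-to-RS232 adapter at /dev/ttyUSB0, and a
# terminal set to 9600 baud, 8N1; adjust to your hardware.
import time
import serial

LOG_FILE = "/var/log/syslog"   # whichever log you want mirrored
PORT = "/dev/ttyUSB0"          # adapter feeding the VT100

with serial.Serial(PORT, 9600, timeout=1) as tty, open(LOG_FILE, "r") as log:
    log.seek(0, 2)                 # start at the current end of the file
    while True:
        line = log.readline()
        if line:
            # a real VT100 wants CR+LF and understands only 7-bit ASCII
            tty.write(line.replace("\n", "\r\n").encode("ascii", "replace"))
        else:
            time.sleep(0.5)        # nothing new yet, wait a bit
```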

But what’s the oldest computer I could actually use today, for literally everything I do, only slower? The answer is easy: it’s the 15” mid-2015 MacBook Pro (i7-4770HQ CPU). That’s the oldest machine I have in use, in the sense that it is retired, but I maintain it as a “hot spare”, with an updated OS and everything, so I can take it out of a drawer and bring it to some secondary location where I want a fully functional computer already present, without having to assume I’ll have a laptop with me. When I say “fully functional”, I don’t mean just writing text, surfing the web or playing movies; I mean editing photos in Lightroom as well. The only drawback is that it doesn’t have USB-C, but my external SSD drives with the photo archive can be plugged into USB-A with a mere cable swap, so that would all work, albeit with a speed reduction compared to my modern systems. So, basically, a 4th-generation Intel CPU, released in 2014, is something I can still use for all my current workloads, but it’s significantly slower, already has port compatibility issues with modern hardware (Thunderbolt 2 with mini-DP connectors is a hassle to connect to anything today, as it needs special cables or adapters), and is retired, to be used only in emergencies or for specific use cases.

I must admit that I suffer from no nostalgia regarding old computers. Sure, I remember aspiring to get the stuff that was hot at the time, but it’s all useless junk now, and I have a very good memory and remember how limited it all was. What I use today used to be beyond everybody’s dreams back then – for instance, a display with resolution that rivals text printed on a high-res laser printer, the ability to display a photograph in quality that rivals or exceeds a photographic print, and the ability to reproduce video in full quality, exceeding what a TV of the era could do. I actually use my computer as a HiFi component for playing music to the NAD in CD quality. Today, this stuff actually does everything I always wanted it to do; back then, the computers were vehicles for fantasy rather than tools that could actually make it happen. I can take pictures with my 35mm camera in quality that exceeds everything I could do on 35mm film, and edit the raw photos on the computer, with no loss of quality and no dependence on labs, chemicals or other people who would leave fingerprints on my film. So, when I think about old computers, I can understand the nostalgia, but the biggest part, for me, is remembering what I always wanted computers to do, and feeling gratitude that it’s now a reality. The only thing that’s still a fantasy is a strong AI, but I’m afraid that an AI of the kind I would like to talk to would have very little use for humans.