Every time Apple, Intel, AMD or Nvidia launches new gadgets, I get a million fake-enthusiastic “reviews” (in fact paid ads produced by YouTubers who whore themselves out to the hardware manufacturers) in my recommended videos, and they are always layered: first comes “oh, a new gadgety thingy, how wonderful”, then “oh, it overheats, it’s underpowered, there are flaws”, and finally “why you don’t need it and should stick with last year’s model”, until they decide they’ve milked the product for all it’s worth and shift attention to the next thing. I find it boring and predictable in direct proportion to the faked enthusiasm of the “reviewers”, who try very hard to convince us that we live in some parallel universe where good smartphones and computers are something new and unheard of, while the truth of the matter is that the entire consumer electronics industry peaked quite a while ago and we’re seeing very small incremental improvements.

I recently ran an experiment: I took several pieces of “obsolete hardware” out of boxes and drawers – a 6-year-old CPU and motherboard with integrated graphics, an old 120 GB SSD, a 4-year-old Android phone and so on, because someone in the family always has an old but perfectly functional device they upgraded from – and guess what, it’s all fine. I turned the PC into a local staging server where I test for service and dependency compatibility before I deploy things on the web, and I turned the old Android phone into a backup device I can switch to in emergencies.
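A check like that doesn’t have to be anything fancy. Here’s a minimal sketch of the kind of smoke test such a box can run – the service names, ports and package names below are placeholders, not my actual stack:

```python
# Minimal pre-deploy smoke test for a home staging server.
# The services and packages listed here are placeholders; swap in
# whatever your own deployment actually depends on.
import importlib.metadata
import socket

# Backing services the app expects, as (host, port) pairs.
SERVICES = {
    "postgres": ("127.0.0.1", 5432),
    "nginx": ("127.0.0.1", 80),
}

# Python packages the app expects to find installed.
PACKAGES = ["requests", "flask"]


def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if something is listening on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def main() -> None:
    # Check that each backing service is reachable.
    for name, (host, port) in SERVICES.items():
        status = "up" if port_open(host, port) else "DOWN"
        print(f"{name:10s} {host}:{port} -> {status}")
    # Check that each required package is installed, and print its version.
    for pkg in PACKAGES:
        try:
            print(f"{pkg:10s} -> {importlib.metadata.version(pkg)}")
        except importlib.metadata.PackageNotFoundError:
            print(f"{pkg:10s} -> MISSING")


if __name__ == "__main__":
    main()
```

If the services answer and the versions match what the production host expects, the old box has done its job – none of this needs hardware any newer than the “obsolete” parts it runs on.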
The way I see it, a piece of equipment goes through several evolutionary phases. At first it’s promising but flawed, every new iteration brings a great improvement, and you upgrade as soon as the new model is released. Then it reaches maturity, where it’s good enough for what you need; new iterations are better in this or that way, but not enough to warrant an upgrade. The third phase is when the manufacturers introduce changes in design or control layout, but the functionality of the device stays the same, or is even reduced to save on manufacturing costs. After that point, all further “improvements” basically amount to finding out what can be removed or made cheaper, or to introducing intentional design flaws that make the device break down more quickly and force you to buy a replacement.
I remember times when a 6-month-old computer or digital camera was considered obsolete, because things were progressing that quickly. Now we are at the point where my “new” camera is 7 years old, my “old” camera is 17 years old, both are still in use, and their picture quality is excellent. My mid-2015 15” MacBook Pro is still perfectly functional, and I could use it as my primary computer with only a slight reduction in speed compared to the new one I use. That’s a 7-year-old computer, and it’s fine.
That logic doesn’t hold forever, though. I would hardly use a Pentium II-233 today, or one of the early smartphones; those are junk, better recycled for raw materials than used. I also wouldn’t say that there have been no improvements in the last 7 years. Still, I recently replaced my younger son’s perfectly good Xiaomi Mi 8 with an 11T Pro and joked that he now gets the typical iPhone user experience: you buy the new, expensive one with better specs, migrate your stuff to it, everything works exactly the same, and you feel like a fool for wasting money on replacing a perfectly good thing. That’s where we are with computers, too; the last upgrade cycle I did was particularly meaningless, because I replaced stuff that worked fine with stuff that also worked fine, albeit noticeably faster in 5% of cases.
There’s a reason why my most recent tech purchases were battery-powered lawn mowers: I can actually do things with them that I couldn’t before. With computers and phones, well, it’s nice that they have a shiny new design and color scheme and all, but I’ll pass.