The advantage of laptops (and of tablets) is their portability (although they’re still as heavy as sin), but that perk is threatened by the Damocles’ sword of their fragility, and by the looming penumbra of the corresponding physical limits to their capacity. Their useful life seems to be about five years, and even if you can coax more time out of them, the changes to the software that’s used on websites eventually exceed the ability of their memories to cope.
This is because most modern software “improvements” rely upon “enhanced” graphics, which swallow bigger mouthfuls of memory. The bias towards graphics at the expense of genuine utility is a consequence of software developers’ catering to video game players. We now have a generation of users who cut their teeth on Game Boys (in some cases, literally) and who seem to need ever more in-your-face visual input to arouse their jaded sensibilities.
But many of the world’s busiest computer users have no fondness for graphics frippery. Those lovingly crafted 3D swooping and flashing special effects just make us qUeAsY. We like bright colors and sharp resolution that doesn’t pixelate badly under high magnification (especially if we’re Indie Authors who do our own cover art), but what we like best is not to have our computers’ memories sucked into a black hole every time we boot up or go online.
This is less of a problem with desktops and towers, because there’s room inside them to plug in more chips. But eventually planned obsolescence catches up with all operating systems, because their manufacturers stop supporting them. Computing was so much easier years ago, before they came up with the supposedly “intuitive” bells and whistles. The system wasn’t broken until they decided to “fix” it. Honestly, I’ve owned toasters that had more genuine intuition.
This leads to a machine that thinks it’s HAL or the MCP and tries to impose its notions on your work. When I was editing Irish Firebrands, I wasted more time than I care to count ferreting out and undoing the mistakes made by arrogant artificial intelligence that went behind my back to change the specifications I’d made for my manuscript.
This was not GIGO: I saved identical formatting requirements on the laptop and the desktop, and worked only on a file that I stored on a large-capacity external drive. But invariably, when I opened the document on the laptop, the computer changed my settings back to its default preferences. I finally ended up playing both ends against the middle, but to do that, I had to go deeper and make “permanent” changes to the templates for as long as I worked on the book. “Compatibility mode,” thy name is mud.
Then the time came when I had to accept that my old tower CPU was not only creaking and groaning with arthritis, but had also begun to show signs of Parkinson’s and Alzheimer’s diseases. After nine years and six months of faithful service, it was time to put the poor old thing out to pasture, before it suffered thunderous apoplexy and descended into full-blown dementia.
Now, when I want to visit the old desktop’s cerebrum-hard drive, it hums happily in an external case, and I’m the one who’s creaking and groaning with arthritis – and at risk for apoplexy, as I continue to wrestle with a new operating system that I still detest, many months after the switch. I discovered that the old rule that new operating systems would still run old programs had also gone the way of all flesh, and I had not only to update my most-used software, but also to replace a perfectly good printer, because the new OS turned out to be too dumb to decipher how to install it.
Meanwhile, the lobotomized tower with the old CPU’s brainstem-motherboard disappeared into the space-time continuum that is my resident son’s suite of rooms – whence it may someday emerge, a threat to life as we know it: a zombie computer that will make Frankenstein’s Monster blench.