While moving from one nest to another (we’re lemmings here; RP it a bit), I realized I still have all the computers I ever bought or assembled, except for those that literally broke beyond any hope of repair.
Some are no longer used daily, but they all work, and being at a point in life where everything and anything in the nest needs to have a purpose or a function led me to think about what actually renders a computer useless or truly obsolete.
I was made even more aware of this as I’m in the market to assemble a new machine, and I’m seeing used ones - 3 or 4 years old - being sold at what can be considered store price, with specs capable of running newly released games.
Meanwhile, I’m looking at two LGA 775 motherboards I have and wondering how hard I can push them before they spontaneously combust, just to get any use out of them, even if only as a typewriter.
So, per the title, what makes a computer obsolete or simply unusable to you?
Addition
So I felt it necessary to update the post and list the main reasons that have surfaced for rendering a machine obsolete/unusable:
- energy consumption: overall, and consumption vs. computational power
- no practical use
Linux rule!
- space taken up
13 year old kids on Twitter.
IMO a computer is obsolete when it can no longer run any desired programs. My laptop, for example, has outlived my much beefier desktop, since the laptop is basically just used for web stuff while my desktop is used for gaming, development, and the like. Gaming in particular has seen a significant increase in requirements over the years, so a gaming PC might be rendered obsolete much faster than something used for the web. My old gaming PC that was rendered obsolete got repurposed as a server; it works well for its new purpose and will probably live for a couple of years still.
So there isn’t any concrete limit at which you can say a computer has become obsolete. It is more of a subjective assessment of whether the computer can fulfill its tasks to a satisfactory degree.
I’d say that, for me, a computer gets moved down the chain: from a daily driver, to something I use more sporadically, and then on to becoming a server of some kind hosting lightweight stuff on my LAN. And then eventually it becomes a question of whether it’s worth the electricity bill having such inefficient old hardware running 24/7.
At the physical level: capacitors age and blow up, batteries stop charging.
At the efficiency level: when the work you want to do uses more energy on an older platform than on a newer platform.
At the convenience level: when the newer device is so convenient you never use the old device, telephone versus desktop as an example for most people.
At the reliability level: if you’re constantly replacing things on a unit, to the point where it becomes your part-time job.
The longest-used devices tend to be embedded industrial devices. They have a job, they keep doing that job, and until they break they’re going to keep doing it forever. That’s application-specific computing.
Most home users are general computer users, so they have a mix of different requirements, support and use cases. I for one still use a 10-year-old laptop. And it’s totally fine.
I have a 12-year-old MacBook Pro at home with some Linux installed that runs perfectly.
Still, I have absolutely no use for it. There isn’t much it can be used for other than browsing the web, and for that I have lighter devices with a much better screen, so I prefer those anytime.
Media server? NAS? Use it to run your sprinkler system?
There are a lot of good suggestions in the replies here, aren’t there?
I was going to say that I’ve been doing a lot of self-hosting and home automation recently, and it’s had me doing things like spending a lot of time finding out if I can run Linux on an old Apple TV, to make it yet another home server running containers. I went through a phase where I was considering disassembling old laptops to re-use their LCD panels as mounted control access points around the house.
However, the LCD thing never went anywhere, because I’m not handy with a soldering iron, but also because I’ve found that those laptops are usually newer than the ones people in my family tend to have (me being in software and having cycled through laptops frequently), and I’ve been re-installing friendlier Linuxes on them and giving them away to friends and family.
I wonder about the other devices, though. Many are certainly not low-power, and what’s the impact of me continuing to use them? Headless, most are certainly capable of running at least one containerized service, but a newer ARM or RISC-V board will almost certainly sip less power. What’s the environmental trade-off?
I have, though, only one tower. I built it in 1993 and have simply upgraded it with new motherboards and components over time. Its main feature turned out to be its usefulness as a RAID5 container, again upgraded with increasingly larger HDDs over the decades, until the point where I started predominantly using docked laptops. After one move, I simply never set it up again. That one is a power-hungry monster, and I feel bad about having it powered on 24/7. But I still keep it because, sentiment.
Anecdotes aside, my answer to your question is: most computers can run Linux, and therefore most computers could find a use in self-hosting. For me it’s become more a question of whether I have, or can find, a use for it. Often, a conversation with family results in finding a use; setting up a self-hosted media server for mom, maybe. If not, it becomes e-waste, and I feel bad for a bit. But my devices have tended to be small form-factor, like a Vera or an Apple TV; it sounds like yours are larger, and maybe the form factor makes them less desirable to reuse.
Oh boy… I ask myself this a lot. I frequent the Puppy Linux community and dudes are out there slinging 32-bit computers with sub-1GB RAM all the time… much like others have echoed, the answer seems to be when the computer dies.
For me, it’s the hardware failure. If it’s damaged enough to be uncomfortable to use, it’s done. Similarly, if it can’t run a modern browser decently.
I just ditched a >10-year-old laptop that I used as a server. The display was off most of the time, and the battery offered some energy backup. In its last months I couldn’t even use the power button; I had to take the mobo battery out and connect it without the battery in order to turn it on. The touchpad wasn’t working either. The OS hard drive was failing, but that was replaced. I’m sure the thing works fine, but I can’t find the right flex cables to connect the power button and touchpad to the mobo. Guess it’s going to the trash soon.
Pretty much the software you run on it and the support behind it. And, for now, energy consumption, but I can imagine that 100 years from now that won’t be a factor anymore.
But that probably falls under “no practical use”.
I mean, with the proper software, you can still automate your house with a Commodore 64, or browse the web with an Amiga.
Imma need to see the Commodore 64 smart house now.
I moved to a laptop for my main system for portability, and I’m really enjoying the reduction in my power bill compared to my previous Threadripper 1950X build.
When you start up a brand-new game you just got on Steam and, on minimum settings, it grinds down to 10 FPS in the first 5 minutes or refuses to even start? That’s when you know it’s time to put the old girl to rest.
Sounds partly like a cooling issue
Power usage is a massive one for me. I go by £1/W/year for the running cost of always-on devices. (I think it’s more like £3/W/year now!)
If a new 20 W server can do the same work as the 100 W server, and will cost me less over 2 years including the purchase price, then the old server is obsolete.
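To make that comparison concrete, here’s a rough sketch of the maths (the wattages, the £120 purchase price, and the flat £1/W/year rate below are just placeholder assumptions, not real figures):

```python
# Rough total-cost-of-ownership comparison for always-on machines.
# All numbers are illustrative assumptions, not measurements.

def total_cost(watts, purchase_price, years, rate_per_watt_year=1.0):
    """Purchase price plus electricity, using a flat GBP-per-watt-per-year rule of thumb."""
    return purchase_price + watts * rate_per_watt_year * years

YEARS = 2
old = total_cost(watts=100, purchase_price=0, years=YEARS)    # already owned
new = total_cost(watts=20, purchase_price=120, years=YEARS)   # hypothetical price

print(f"Old 100 W server over {YEARS} years: £{old:.0f}")
print(f"New 20 W server over {YEARS} years:  £{new:.0f}")
print("Old server is obsolete." if new < old else "Keep the old server.")
```

At £1/W/year the old box costs £200 in electricity alone over two years, so anything that does the same job for less than that, purchase price included, wins; at £3/W/year the gap only widens.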
It can’t run Doom. But seriously, I question the “practical use” bit, not because it’s wrong but because it’s so completely situational. If you want it for a business, you probably want to beat AWS prices, but if you are just goofing off, a replica of the Zuse Z1 is actually a substantial upgrade from an old XP desktop, just because of the huge cool factor. If you have some sort of basic but fairly practical personal need, the cutoff for usable will be somewhere in between.
In your situation, I’d figure out how many you want, and then keep the n best ones by your reckoning.
Shoutout to !retrocomputing@lemmy.sdf.org
The weird thing is that we’re currently at a point where even very old machines are perfectly usable if you’re not playing modern games.
My main computer is an i5-4670 (or something like that); it’s almost 10 years old, but for my Firefox/VS Code/Docker workload it’s pretty much as good as my M1 MacBook. Sure, some tasks take a second longer, but not annoyingly long.
This comment you’re reading brought to you by a laptop from Obama’s first term.
I tend to follow a ‘cascade’ type of upgrade pattern: one machine gets a new part, and the replaced piece gets put into another where possible. At some point, though, upgrading one part means I also need to swap several others to support it. So it kind of becomes a ‘Ship of Theseus’ situation in many cases.
The real question of when something becomes useless, though, comes down to a combination of security (can I run a modern, supported OS in a reasonably performant fashion?) and whether the function it served can be handled by some other existing system (via virtual machines/containers, usually), making it entirely redundant.
It’s all very arbitrary and depends on the definition of computer for the individual.
Ultimately it does, I think, come down to practicality. Can I still use this thing to get what I need to do done, and can I still do it securely?
The security part can be more or less important depending on computer, as well. If you’re a Mac person, your machine may be obsolete as soon as Apple decides to stop giving you security updates. If you’re a Linux person, you can probably maintain a secure system easily on 10-15 year old hardware.
When it no longer reliably functions - older hardware still has a lot of uses; just dump Lubuntu on it and you have a functional desktop that you can play older games on and use open-source productivity suites with. However, once parts start to fail that you can no longer replace (those old laptop HDDs, for example), it becomes obsolete to you.