Assuming our simulation is not designed to auto-scale (and our Admins don’t know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?
The server shuts down. The admin adds a few more sticks of RAM and powers it on again.
The day is reset and we wake up again from the morning of that day where there was a RAM shortage.
They take some users offline to free up some memory for everyone else
Have you not played Dwarf Fortress? The frame rate goes way down, a situation imperceptible to the dorfs. Then eventually the operator of the machine loses interest, or a pandemic makes the pop count drop, or a combo of those.
Edit: You should read some Greg Egan if you’re into this question.
Data in memory will be offloaded to swap space. I doubt we’d notice any fluctuations since we’re part of the simulation, but externally it could slow to a crawl and basically be useless. They might shut it down, hopefully just to refactor. But again we probably wouldn’t notice any downtime, even if it’s permanent.
deleted by creator
That would be the most pleasant way to go :)
A landscape full of Arcos and waves of boom and bust?
12 meteors, 8 volcanoes and 10 tornadoes incoming you say?
I imagine it shows itself where processes get dropped, whether it’s walking into a room and forgetting what you were doing, losing train of thought mid sentence, or even passing out when you laid down to watch something.
A semi related but enlightening (thought) experiment.
There is a theory that our universe isn’t actually 3D but is instead a projection/simulation on the 2D surface of a black hole (aka the big bang). If this were the case, the practical differences would be almost nonexistent. The exception is the Planck length, the smallest length that is meaningful. If our universe is 3D, we are extremely far from being able to measure effects anywhere close to the Planck length. If it is 2D, however, that length appears FAR bigger. It wouldn’t be that far below what our current gravitational-wave detectors can see.
The effects of this would be similar to a simulation running near its limit. It would be the equivalent of floating-point rounding errors.
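For anyone who hasn’t run into them, here’s the classic example of the kind of rounding error being alluded to, in Python (any IEEE-754 language behaves the same way):

```python
# 0.1 and 0.2 have no exact binary representation in IEEE-754 doubles,
# so their sum lands a hair off from 0.3.
a = 0.1 + 0.2
print(a)                     # 0.30000000000000004
print(a == 0.3)              # False
print(abs(a - 0.3) < 1e-9)   # True -- the error is tiny but real
```

The error is far below anything you’d notice day to day, which is exactly the point: a universe-level "rounding error" would only show up at the very edge of measurement.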
@aCosmicWave we all just start moving more slowly.
Fortunately I can report that if anything, we're having RAM added, because everything keeps speeding up as I get older.
Probably a mass removal of the poor - oh wait that’s already happening.
Simply put.
We wouldn’t notice anything.
Our perception of the world would be based only on the compute cycles and not on any external time-frame.
The machine could run at a Million Billion hertz or at one clock-cycle per century and your perception of time inside the machine would be the same.
Same with low ram, we would have no indication if we were constantly being paged out to a hard drive and written back to ram as required.
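The point above can be sketched in a few lines of Python (toy code, all names made up): the simulated world can only observe its own tick counter, so host-side slowdowns, from paging or anything else, are invisible from inside.

```python
import time

def run_simulation(ticks, host_delay=0.0):
    """Advance a toy simulation by `ticks` steps.
    `host_delay` models a slow or overloaded host machine;
    the simulated world only ever sees `internal_time`."""
    internal_time = 0
    for _ in range(ticks):
        internal_time += 1       # the only clock the sim can observe
        time.sleep(host_delay)   # wall-clock stall, invisible inside
    return internal_time

# Same internal result whether the host is fast or crawling:
fast = run_simulation(5, host_delay=0.0)
slow = run_simulation(5, host_delay=0.01)
print(fast == slow)  # True -- 5 ticks is 5 ticks, from the inside
```

From outside, the second run takes measurably longer; from inside, the two runs are indistinguishable.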
Greg Egan gave a great explanation of this in the opening chapter of his novel Permutation City.
Clearly wrong.
Running out of RAM happens all the time. We see something, store it, and that something also gets stored in RAM. But if that second copy gets reaped by the OOM killer, the universe reprocesses it.
Since it’s already in our copy, it causes weird issues. We call it déjà vu!
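The déjà-vu gag maps surprisingly well onto a cache with eviction. A toy sketch in Python (every name here is made up for illustration): when "RAM" fills up, the oldest memory gets evicted, and re-encountering it forces a reprocess that still feels familiar.

```python
from collections import OrderedDict

class WorldCache:
    """Toy LRU cache: when 'RAM' is full the oldest entry is evicted,
    and re-encountering an evicted entry is our 'déjà vu'."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.ram = OrderedDict()
        self.evicted = set()
        self.deja_vu_events = 0

    def perceive(self, thing):
        if thing in self.ram:
            self.ram.move_to_end(thing)   # still resident, nothing odd
            return "familiar"
        if thing in self.evicted:
            self.deja_vu_events += 1      # reprocessed after being reaped
        if len(self.ram) >= self.capacity:
            old, _ = self.ram.popitem(last=False)  # evict oldest
            self.evicted.add(old)
        self.ram[thing] = True
        return "processed"

w = WorldCache(capacity=2)
w.perceive("door"); w.perceive("keys"); w.perceive("cat")  # "door" evicted
w.perceive("door")                                         # déjà vu!
print(w.deja_vu_events)  # 1
```

Same input, processed twice, with a leftover trace of the first pass: that’s the joke, in code.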
I get it now.
Ever walk into a room and forget why you went in there? That’s garbage collection
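For what it’s worth, the joke describes a real mechanism. A quick Python demo (hypothetical names) of an object vanishing once nothing holds a strong reference to it:

```python
import gc
import weakref

class Errand:
    """A reason for walking into a room."""
    pass

reason_for_entering_room = Errand()
memory = weakref.ref(reason_for_entering_room)  # non-owning reference

print(memory() is not None)  # True -- we still know why we're here

del reason_for_entering_room  # last strong reference dropped
gc.collect()                  # garbage collection runs...

print(memory() is None)  # True -- why did I walk in here again?
```

The weak reference is the feeling that there *was* a reason; the object itself has been collected.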
Given the vastness of space and time, and the number of people who die having yet to learn anything (babies), I’d imagine we’re a system with 32 GB of RAM only consuming a few hundred megabytes.
Besides, I’d imagine that any intelligence capable of constructing and running such a complex simulation would have the ability to scale their system as needed. Using our existing technology, they probably use hot swappable components so that if there is a hardware failure or the need to “download more ram” 🤣 then they can just remove and insert new components on the fly and we’d be none the wiser.
Of course, we being part of the simulation, I’d also wager that unless the creators of said simulation are truly evil and sadistic, we’ll never know, because it’s just not part of the programming. And if it were, we’d probably have figured it out by now (beyond guessing and thought experiments). But rest assured, it is fun to think about, in a creepy and existential way.
If we are a simulation, what is the end goal of our creators? Could we be the roadmap for creating a new world in their real life? Maybe they are studying their own history and trying to figure out how their race came into being and evolved over time. Or maybe we are part of a crude video game keeping little Suzie occupied until dinner time. Better yet, maybe Susan is learning about simulations at university and we are part of her post-doctoral thesis.
Why even bother with hot swapping? Just shut down the simulation and turn it back on when you’re done upgrading. No one in the simulation would be able to tell that anything happened.
Continuing with the thought experiment, if you shut it down completely, you’d lose valuable information that was stored in the other ram modules. It’s also reasonable to suggest that resetting the state of such a complex simulation would be more complex (maybe even impossible) and detrimental to the simulation.
Of course another thought just occurred to me: maybe we’re not a computer simulation, but an organic simulation (as in a Petri dish in a lab). Then there would be no reason for ram or hot swappable modules, or any machine parts whatsoever.
It would mean that space is as finite as the Petri dish, but since we’re so small we’d never know it because to us it would be so vast and impossible to reach the edges.
The number of people is 1. It’s me. You guys are all NPCs
All that shit you forgot? All that “forgotten” history? There you go.