Moore's law predicts that compared to 1980, computers in 2040 would be a BILLION times faster (2^30, i.e. 30 doublings at one every two years).
Likewise, compared to 1994 computers, the ones rolling out now are roughly a MILLION times faster (2^20, i.e. 20 doublings at the faster 18-month rate that's often quoted).
A cheap Raspberry Pi could easily handle the computational workload of a room full of equipment from 1984.
What would have taken a million years to calculate in 1984 would theoretically take 131 hours today and 29 seconds in 2044...
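For what it's worth, the arithmetic sketches out like this (a rough Python sketch, assuming a steady 18-month doubling period; the exact figures shift quite a bit depending on which doubling period you pick, so treat the numbers above as order-of-magnitude):

```python
# Back-of-envelope check of the doubling arithmetic above.
# ASSUMPTION: performance doubles every 18 months (one common
# reading of Moore's law); the real cadence varied over the decades.

DOUBLING_PERIOD_YEARS = 1.5  # assumed 18-month doubling

def speedup(years_elapsed, period=DOUBLING_PERIOD_YEARS):
    """How many times faster a machine is after `years_elapsed` of doublings."""
    return 2 ** (years_elapsed / period)

# A job that took a million years of compute in 1984:
million_years_in_hours = 1_000_000 * 365.25 * 24

for target_year in (2024, 2044):
    s = speedup(target_year - 1984)
    hours = million_years_in_hours / s
    print(f"{target_year}: {s:.3g}x faster -> {hours:.3g} hours "
          f"({hours * 3600:.0f} seconds)")
```

At 18-month doublings, 1984 to 2044 is 40 doublings, or about a trillion-fold speedup, which is where the ~29-second figure comes from.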
Yes, yes, single-threaded execution hit a wall, but now we just build a ton more cores and keep increasing computational throughput per watt.
We've moved massive calculations onto GPUs, so in terms of raw computational capability the trend holds up.
I mean, check this out: https://en.wikipedia.org/wiki/FLOPS
The geometric growth is real. Moore's law was just one way to explain it.