I buy David's premise of the universality of computing, but I fear there may be physical limits to binary computing, i.e. the amount of matter in the universe and the time before its heat death. Let's say I am modeling the interactions of a few cubic meters of hydrogen:
5.4199274e+25 particles. I need to know each one's location (3 variables), orientation (3 more), speed, temperature, etc., so maybe only 10 things about each H2 pair.
Because this sort of thing is very sensitive to initial conditions, and small errors in calculation lead to big differences later, a "double" (64 bits) just is not enough. Easy: I will use 512 bits (Moore's law and all) ... 5120 bits per pair * number of particles = 2.7750028e+29 bits,
so only about 2.8e+17 terabits, or 3.5e+16 terabytes, of storage is needed for one step.
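Here's that arithmetic as a quick Python sketch (the particle count, the 10 variables per pair, and the 512 bits of precision are just the assumptions above):

    # Back-of-envelope storage estimate for one simulation step.
    particles = 5.4199274e25   # H2 pairs in a couple of cubic meters
    vars_per_pair = 10         # location (3), orientation (3), speed, temperature, ...
    bits_per_var = 512         # generous precision to tame chaotic error growth

    total_bits = particles * vars_per_pair * bits_per_var
    print(f"{total_bits:.7e} bits")              # 2.7750028e+29
    print(f"{total_bits / 1e12:.1e} terabits")   # 2.8e+17
    print(f"{total_bits / 8e12:.1e} terabytes")  # 3.5e+16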
Seagate promises 60-terabyte 3.5-inch hard drives within 10 years, so 5.8e+14 of these Seagate drives should do the trick.
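And the drive count, taking Seagate at their word on the 60 TB:

    terabytes = 2.7750028e29 / 8e12   # total storage needed, in TB
    drives = terabytes / 60           # promised 60 TB Seagate drives
    print(f"{drives:.1e} drives")     # 5.8e+14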
That is only 3.22233e+10 miles of hard drives, or 346.65 astronomical units (Pluto, at its closest, is about 30 from the Sun). Of course, placing the hard drives in a straight line might be a mistake, so we will stack them in a 2.40822-mile cube. Just for one instant of data we are at 58,215,044,860:1 in digital storage, volume-wise.
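The geometry depends on what you assume for a drive's size; the figures above work out if each drive takes roughly 3.5 inches end to end and roughly 100 cm^3 of volume, so those are the assumptions in this sketch:

    drives = 2.7750028e29 / 8e12 / 60        # ~5.78e14 drives, as above

    # End to end at 3.5 inches apiece:
    line_miles = drives * 3.5 / 63360        # 63,360 inches per mile
    line_au = line_miles / 9.296e7           # ~9.296e7 miles per AU
    print(f"{line_miles:.3e} miles = {line_au:.1f} AU")   # ~3.19e+10 miles, ~343.5 AU

    # Stacked into a cube at ~100 cm^3 (1e-4 m^3) per drive:
    cube_m3 = drives * 1e-4
    edge_miles = cube_m3 ** (1 / 3) / 1609.344
    print(f"{edge_miles:.2f}-mile cube, {cube_m3:.3e} m^3")  # ~2.40 miles, ~5.78e+10 m^3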
But hell, with an Earth-sized hard drive we could describe 107,963,558,147 cubic meters of hydrogen (a sphere about 5.9 km across). Wait, do we have to do some computing as well?
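For the record, the sphere math (just the standard volume formula applied to that cubic-meter figure):

    import math

    hydrogen_m3 = 107_963_558_147   # volume an Earth-sized drive could describe
    radius = (3 * hydrogen_m3 / (4 * math.pi)) ** (1 / 3)
    print(f"diameter = {2 * radius / 1000:.1f} km")   # 5.9 km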
Sorry about the arithmetic rant.