Assume that our Universe is described by a Wolfram Model. We want to estimate the number of updates per second. In order to do that, we propose the following experiment.
Assume that we can observe an aperiodic physical system involving elementary particles (in a lab or via astronomical methods) whose descriptive complexity we can estimate at any given time (a periodic system with a sufficiently large period would also work for this experiment). Suppose that during a time interval of dt seconds the system increases its descriptive complexity by dK bits (in general, the exact value is uncomputable, but we only want an estimate). Then, using the theorem from my previous post, logarithm of time = complexity, we can express the number of updates per second as the quotient
updates per second = 2^(dK) / dt.
Notice that, in a time interval dt, our aperiodic system increases its descriptive complexity by the same amount as the whole Universe. This claim follows from the fact that, in both cases, the only information needed to reconstruct the state at time dt from the initial conditions is precisely the descriptive complexity of the product of dt times the number of updates per second. Therefore, the formula above also provides an estimate of the number of updates per second of the Universe.
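As a purely illustrative sketch, the estimation above can be written as a one-line computation. The numerical values below are made up for the sake of the example; only the formula updates per second = 2^(dK) / dt comes from the post.

```python
def updates_per_second(dK_bits: float, dt_seconds: float) -> float:
    """Estimate the update rate from an observed increase of dK_bits
    in descriptive complexity over dt_seconds, via 2**dK / dt."""
    if dt_seconds <= 0:
        raise ValueError("dt must be positive")
    return 2.0 ** dK_bits / dt_seconds

# Hypothetical numbers: a system whose description grows by 10 bits
# over one microsecond.
rate = updates_per_second(10, 1e-6)
print(rate)  # 2**10 / 1e-6 = 1.024e9 updates per second
```

Note how strongly the estimate depends on dK: because of the exponential, an error of a single bit in the complexity estimate changes the inferred rate by a factor of two.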
Before ending this post, I would like to point out that the method described here does not yield an exact calculation, even if the premises are verified. It is an estimation method, and it is not clear how to bound the measurement error of the approach we propose.
A subject related to the present discussion, though not exactly the same, is the physical limit of computation. In this direction, I recommend Seth Lloyd's paper Ultimate physical limits to computation.