There are many definitions of entropy. The common intuition is that the entropy of a macrostate is proportional to the logarithm of the number of microstates consistent with the macroscopic properties of that macrostate. Consider an object (macrostate) whose microstates are spatial hypergraphs. The entropy of this object, with respect to the Wolfram Model that describes it, is then proportional to the logarithm of the number of spatial hypergraphs consistent with the object's macroscopic properties. For example, in a Wolfram Model of String Theory, the entropy of a given string (macrostate) is proportional to the logarithm of the number of spatial hypergraphs consistent with the properties of this string.
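This counting definition can be made concrete with a minimal sketch. Assume (hypothetically) that microstates are represented as small hypergraphs given by sets of hyperedges, and that a macrostate is specified by the value of some macroscopic property function; the `entropy`, `macro_property`, and toy hypergraph data below are illustrative inventions, not part of any actual Wolfram Model implementation, and the proportionality constant is taken as 1.

```python
import math

def entropy(microstates, macro_property, macrostate_value):
    # Count the microstates (here: hypergraphs encoded as frozensets of
    # hyperedges) whose macroscopic property matches the given macrostate.
    count = sum(1 for m in microstates if macro_property(m) == macrostate_value)
    # Boltzmann-style entropy: the logarithm of that count
    # (proportionality constant set to 1 for illustration).
    return math.log(count)

# Toy ensemble: hypergraphs over vertices 1..3, with the number of
# hyperedges serving as the macroscopic property.
hypergraphs = [
    frozenset({(1, 2)}),
    frozenset({(2, 3)}),
    frozenset({(1, 3)}),
    frozenset({(1, 2), (2, 3)}),
]

# Three single-edge hypergraphs are consistent with the macrostate
# "one hyperedge", so the entropy is log(3).
print(entropy(hypergraphs, len, 1))
```

The same function applies unchanged to any microstate representation, since only the property function inspects the hypergraphs themselves.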