# Comparing common dimension estimators for hypergraphs

Posted 3 months ago

by Rauan Kaldybaev

The Wolfram Model is a theory that models the universe as a discrete hypergraph (a generalization of a graph in which an edge can connect more than two vertices). The hypergraph evolves in discrete time steps according to certain rules, which play the role of laws of physics. One of the key assumptions of the theory is that, as time increases, the discrete hypergraph starts to exhibit the properties of continuous space; in particular, the effective dimension of the hypergraph should approach 3. The question of dimension is thus very important in the model. There are many ways to estimate the dimension of a hypergraph, but the most common are the Myrheim-Meyer, Wolfram-Hausdorff, and midpoint scaling estimators. This computational essay compares the estimators by their speed of convergence and numerical robustness, using wm1268, wm2224, wm3255, and wm1811 from the Registry of Notable Universes. It appears that the Wolfram-Hausdorff dimension estimator is the best of the three, converging fastest and being the most robust.

In[]:= Show[wm1811[200]["FinalStatePlot"], ImageSize -> Large]
Out[]=
Getting started
Source code for the dimension estimators

This subsection contains the source code for the dimension estimators. Jonathan Gorard was kind enough to share the code for the Myrheim-Meyer and Wolfram-Hausdorff dimension estimators; the code for the midpoint scaling estimator I wrote myself.
Myrheim-Meyer dimension estimator
Wolfram-Hausdorff dimension estimator
Midpoint scaling estimator
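The source cells above are collapsed in this excerpt. For readers without the notebook, here is a minimal sketch of what a midpoint scaling estimator could look like in the Wolfram Language. This is my reconstruction of the general causal-set "midpoint scaling" idea, not the actual code used in this essay: the endpoint selection and interval-counting details are assumptions.

```mathematica
(* Sketch (assumption): estimate dimension from how the cardinality of a
   causal interval scales when the interval is halved at its midpoint.
   If |I(a, b)| ~ t^d and m is a midpoint, then |I(a, m)| ~ (t/2)^d,
   so d ~ Log2[|I(a, b)| / |I(a, m)|]. *)
midpointScalingEstimator[g_Graph] :=
 Module[{a, b, path, mid, whole, half},
  (* pick a source vertex and a vertex far from it (heuristic choice) *)
  a = First[Pick[VertexList[g], VertexInDegree[g], 0]];
  b = First[MaximalBy[VertexOutComponent[g, a], GraphDistance[g, a, #] &]];
  path = FindShortestPath[g, a, b];
  mid = path[[Ceiling[Length[path]/2]]];
  (* causal interval I(x, y) = out-cone of x intersected with in-cone of y *)
  whole = Length@Intersection[VertexOutComponent[g, a], VertexInComponent[g, b]];
  half = Length@Intersection[VertexOutComponent[g, a], VertexInComponent[g, mid]];
  N[Log2[whole/half]]]
```

A real implementation would likely average over many endpoint pairs rather than use a single geodesic.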
Shortcuts

To compare the three dimension estimators, we can define functions that take in a WolframModelEvolutionObject and return an estimate of its dimension:

In[]:= ClearAll[mmDim, whDim, msDim];
mmDim[wm_] := MyrheimMeyerDimensionEstimator@wm["CausalGraph"];
whDim[wm_] := WolframHausdorffDimensionEstimator[ResourceFunction["HypergraphToGraph"][wm["FinalState"]], "Dimension"];
msDim[wm_] := midpointScalingEstimator@wm["CausalGraph"];
Preparing the universes

Although most hypergraphs don't resemble any n-dimensional space, the four universes used here do:

In[]:= ClearAll[wm2224, wm1268, wm3255, wm1811];
wm2224[n_] := ResourceFunction["WolframModel"][{{{1, 2}, {2, 3}} -> {{4, 2}, {2, 4}, {4, 3}, {1, 4}}}, {{1, 1}, {1, 1}}, n];
wm1268[n_] := ResourceFunction["WolframModel"][{{{1, 1, 2}, {3, 4, 2}} -> {{4, 4, 2}, {1, 5, 2}, {1, 5, 3}}}, {{1, 1, 1}, {1, 1, 1}}, n];
wm3255[n_] := ResourceFunction["WolframModel"][{{{1, 2, 1}, {1, 3, 4}} -> {{4, 5, 4}, {5, 4, 2}, {3, 2, 5}}}, {{1, 1, 1}, {1, 1, 1}}, n];
wm1811[n_] := ResourceFunction["WolframModel"][{{{1, 1, 2}, {1, 3, 4}} -> {{4, 4, 3}, {2, 5, 3}, {2, 5, 3}}}, {{1, 1, 1}, {1, 1, 1}}, n];
Comparing point estimates

To compare the algorithms for dimension estimation, the most obvious thing to do is to compare the values they output for selected inputs. This allows us to draw conclusions about the characteristics and convergence speed of each algorithm. Based on the computational experiments below, the Wolfram-Hausdorff and midpoint scaling estimators converge faster than Myrheim-Meyer. Although the three algorithms generally agree on the dimension of a particular object, in some cases they give drastically different answers because their working principles are very different. A good example is wm2224, which is basically a chain of vertices forming a circle. The Wolfram-Hausdorff estimator looks at the structure of the hypergraph and decides that the universe is one-dimensional. Myrheim-Meyer, however, looks at the structure of the causal graph, which is a tree, and concludes that the universe has no definite dimension. In this sense the Wolfram-Hausdorff estimator can be preferable because it operates on more concrete objects. The midpoint scaling estimator also looks at the structure of the causal graph, but, remarkably, it still manages to give the correct answer of 1.
wm1268

First, run the dimension estimators on wm1268:

In[]:= Show[{ListLinePlot[Join @ Transpose @ Table[Block[{wm = wm1268[n]}, {{n, mmDim[wm]}, {n, whDim[wm]}, {n, msDim[wm]}}], {n, 1, 40}], PlotLegends -> {"Myrheim-Meyer", "Wolfram-Hausdorff", "Midpoint scaling"}, AxesLabel -> {"time", "dim"}, PlotLabel -> "wm1268", PlotRange -> All], Plot[1, {n, 1, 40}, PlotStyle -> {Thin, Dashed, Black}]}]
Out[]=
Here, the vertical axis is the dimension estimate and the horizontal axis is the number of discrete evolution time steps. Although the estimators give wild results for small n, they all appear to converge to the correct value of 1 as n grows. To study the convergence, compute the errors 1 - estimate:

In[]:= mmDimErrData1268 = Table[{n, 1 - mmDim[wm1268[n]]}, {n, 25, 205, 20}];
whDimErrData1268 = Table[{n, 1 - whDim[wm1268[n]]}, {n, 25, 205, 20}];
msDimErrData1268 = Table[{n, 1 - msDim[wm1268[n]]}, {n, 25, 105, 20}];

In[]:= ListLinePlot[{mmDimErrData1268, whDimErrData1268, msDimErrData1268}, PlotLegends -> {"Myrheim-Meyer", "Wolfram-Hausdorff", "Midpoint scaling"}, AxesLabel -> {"time", "error"}, PlotLabel -> "wm1268", PlotRange -> All]
Out[]=
Here, the plot of the error for the midpoint scaling estimator stops at 105 because of its slow implementation. While the Myrheim-Meyer and Wolfram-Hausdorff estimators converge at approximately the same speed, the midpoint scaling estimator actually seems to be doing significantly better. Also, the error of the Myrheim-Meyer and Wolfram-Hausdorff estimators is remarkably well modeled by a function of the form a/n:

In[]:= Block[{model = Normal @ LinearModelFit[Join[mmDimErrData1268, whDimErrData1268], 1/n, n, IncludeConstantBasis -> False]}, Legended[Show[{ListLinePlot[{mmDimErrData1268, whDimErrData1268}, AxesLabel -> {"time", "error"}, PlotLabel -> "wm1268", PlotRange -> All], Plot[model, {n, Min @ mmDimErrData1268[[All, 1]], Max @ mmDimErrData1268[[All, 1]]}, PlotStyle -> {Thick, Dashed, Red}, PlotRange -> All]}], LineLegend[{ColorData[97][1], ColorData[97][2], Red}, {"Myrheim-Meyer", "Wolfram-Hausdorff", Row[{"Model: ", model}]}]]]
Out[]=
If the trend continues, the error should drop to the order of 10^-4 for large n. Here is how the universe looks at several time steps:

In[]:= wm1268[#]["FinalStatePlot"] & /@ {1, 50, 100, 200, 300}
Out[]=
wm2224

Run the dimension estimators on wm2224:

In[]:= ListLinePlot[Join @ Transpose @ Table[Block[{wm = wm2224[n]}, {{n, mmDim[wm]}, {n, whDim[wm]}, {n, msDim[wm]}}], {n, 1, 6}], PlotLegends -> {"Myrheim-Meyer", "Wolfram-Hausdorff", "Midpoint scaling"}, AxesLabel -> {"time", "dim"}, PlotLabel -> "wm2224"]
Out[]=
This is a very peculiar universe. The Wolfram-Hausdorff and midpoint scaling estimators predict the dimension to be 1, whereas the dimension estimate given by Myrheim-Meyer appears to grow linearly with time. The difference is due to the fact that the Wolfram-Hausdorff dimension estimator looks at the structure of the actual hypergraph, while the Myrheim-Meyer estimator looks at the associated causal graph, which, roughly speaking, records the order in which the vertices were created. To see this, consider the figures below:

In[]:= wm2224[#]["FinalStatePlot"] & /@ {1, 2, 3, 4, 5}
Out[]=

In[]:= wm2224[#]["CausalGraph"] & /@ {1, 2, 3, 4, 5}
Out[]=

While the state plot is a one-dimensional circle, the causal graph is a binary tree. If we denote by N_t the number of vertices in the causal graph after t steps, then for a d-dimensional universe we expect N_t ~ t^d, so Log[N_t] = d Log[t] + const. Hence Log[N_{t+1}/N_t] = d (Log[t+1] - Log[t]) ≈ d/t, which gives the estimate d ≈ t Log[N_{t+1}/N_t]. For a binary tree, N_t ≈ 2^t, so this estimate grows linearly with time: d ≈ Log[2] t. This matches the actual Myrheim-Meyer estimates very well:

In[]:= Legended[Show[{ListLinePlot[Table[{n, mmDim[wm2224[n]]}, {n, 1, 11}], AxesLabel -> {"time", "dim"}, PlotLabel -> "wm2224"], Plot[Log[2] t, {t, 1, 11}, PlotStyle -> {ColorData[97][2]}]}, PlotRange -> All], LineLegend[{ColorData[97][1], ColorData[97][2]}, {"Actual MM estimate", "Log[2] t"}]]
Out[]=
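As a quick sanity check (this snippet is mine, not from the original notebook), the d ≈ Log[2] t growth can be reproduced on an idealized complete binary tree, whose vertex count after t steps is exactly 2^(t+1) - 1:

```mathematica
(* For a complete binary tree of depth t, N_t = 2^(t+1) - 1, so the
   finite-difference estimate d ≈ t Log[N_{t+1}/N_t] tends to t Log[2]. *)
nt[t_] := 2^(t + 1) - 1;
Table[{t, N[t Log[nt[t + 1]/nt[t]]], N[t Log[2]]}, {t, 2, 10}]
```

The second and third columns agree increasingly well as t grows, mirroring the linear trend in the Myrheim-Meyer estimates for wm2224.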
Interestingly, although the midpoint scaling estimator, just like Myrheim-Meyer, looks at the structure of the causal graph, it correctly predicts the dimension of wm2224 to be about 1. This is because the midpoint scaling estimator works by computing the sizes of causal intervals; when it computes the causal interval between the tree's root and one of its leaves, it does not look at the various branches of the tree and only sees a single one-dimensional line.
wm3255

Next in line is wm3255. Again, the dimension is 1. The Wolfram-Hausdorff estimator outputs exactly 1 from the very beginning, the midpoint scaling estimator also gives approximately 1, and Myrheim-Meyer gives an estimate that gradually approaches 1:

In[]:= Show[{ListLinePlot[Join @ Transpose @ Table[Block[{wm = wm3255[n]}, {{n, mmDim[wm]}, {n, whDim[wm]}, {n, msDim[wm]}}], {n, 1, 40}], PlotLegends -> {"Myrheim-Meyer", "Wolfram-Hausdorff", "Midpoint scaling"}, AxesLabel -> {"time", "dim"}, PlotLabel -> "wm3255", PlotRange -> All], Plot[1, {n, 1, 40}, PlotStyle -> {Thin, Dashed, Black}]}]
Out[]=
This might be somewhat surprising given the state plot, which at first sight appears to be a two-dimensional grid:

In[]:= wm3255[1000]["FinalStatePlot"]
Out[]=
wm1811

Running the algorithms on wm1811, we get a similar result:

In[]:= Show[{ListLinePlot[Join @ Transpose @ Table[Block[{wm = wm1811[n]}, {{n, mmDim[wm]}, {n, whDim[wm]}, {n, msDim[wm]}}], {n, 1, 40}], PlotLegends -> {"Myrheim-Meyer", "Wolfram-Hausdorff", "Midpoint scaling"}, AxesLabel -> {"time", "dim"}, PlotLabel -> "wm1811", PlotRange -> All], Plot[1, {n, 1, 40}, PlotStyle -> {Thin, Dashed, Black}]}]
Out[]=
Most of the time, the Wolfram-Hausdorff estimator outputs exactly 1, and the midpoint scaling estimator gives a result close to 1. The Myrheim-Meyer estimator again gives an estimate that approaches 1 as time increases. This is somewhat surprising considering that the state plot looks like a cone, so we might expect it to be two-dimensional:

In[]:= wm1811[600]["FinalStatePlot"]
Out[]=
Comparing numerical robustness

Consider a Wolfram Model object with a large number of vertices and edges that has a certain definite dimension. If we add one more vertex, we expect the dimension of the overall object not to change significantly. The same must be true of dimension estimators: if we add an extra vertex to the graph, their output should not change much. In other words, we want the dimension estimators to be robust.

We can test the numerical robustness of the algorithms using exactly this method: first compute an estimate for the dimension of a given graph, then add an extra vertex and estimate the dimension of the resulting graph. Doing this for every possible point where the new vertex can be inserted, we obtain a distribution of the differences in dimension estimates. The more robust the algorithm, the closer to zero this distribution will be, so we can use the root mean square of the difference distribution to measure robustness. Based on this analysis, the most robust estimator appears to be Wolfram-Hausdorff, followed closely by midpoint scaling; Myrheim-Meyer seems to be the least robust of the three.

Here is what the distribution of differences looks like for the three estimators for wm1268 after 60 time steps (see the source code at the bottom of the section):

In[]:= Histogram[#, 7, PlotLabel -> Row[{"RMS = ", NumberForm[RootMeanSquare[#], 3]}]] & /@ ({mmDiffDist[#], whDiffDist[#], msDiffDist[#]} & @ wm1268[60])
Out[]=

The histograms correspond to the Myrheim-Meyer, Wolfram-Hausdorff, and midpoint scaling estimators, respectively. For this particular example, the midpoint scaling estimator appears to be the most robust, followed by Wolfram-Hausdorff and then the Myrheim-Meyer estimator, which again seems to be the least robust of the three algorithms.
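The mmDiffDist, whDiffDist, and msDiffDist helpers used above are defined at the bottom of the section, which is not included in this excerpt. As a rough guide to what such a function might look like, here is a sketch of the Myrheim-Meyer variant, assuming that "adding a vertex" means subdividing an edge of the causal graph; both the name collision and the perturbation scheme are my assumptions, not the author's actual code:

```mathematica
(* Sketch (assumption): perturb the causal graph by subdividing each
   directed edge in turn, and record how much the dimension estimate
   changes relative to the unperturbed graph. *)
subdivideEdge[g_, DirectedEdge[a_, b_], v_] :=
 EdgeAdd[EdgeDelete[VertexAdd[g, v], DirectedEdge[a, b]],
  {DirectedEdge[a, v], DirectedEdge[v, b]}]

mmDiffDist[wm_] := Module[{g = wm["CausalGraph"], base},
  base = MyrheimMeyerDimensionEstimator[g];
  Table[
   MyrheimMeyerDimensionEstimator[subdivideEdge[g, e, Unique["v"]]] - base,
   {e, EdgeList[g]}]]
```

The Wolfram-Hausdorff and midpoint scaling variants would presumably follow the same pattern with the corresponding estimator and graph.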
The same pattern holds for wm3255 and wm1811, each evolved for 30 time steps:

In[]:= Histogram[#, 7, PlotLabel -> Row[{"RMS = ", NumberForm[RootMeanSquare[#], 3]}]] & /@ ({mmDiffDist[#], whDiffDist[#], msDiffDist[#]} & @ wm3255[30])