Hi Tuomas, thanks for your reply, and sorry it took me so long to get back to you. I tried running your new parameter (the singularity scan), but unfortunately it is still in the queue, as I am sure it is very computationally expensive. On the other hand, the 3D videos you mentioned have finally finished rendering, and they are amazing. It seems the rendering does cut off after roughly 5000 steps, but that is more than enough iterations to glimpse some very interesting properties.
In any case, I decided to work through "Model 24528" manually so I could compare with what you have on your Gigabrain website and hopefully give some constructive feedback. I am attaching a PDF of a printed Excel table that I used to organize the results, but I can provide the Wolfram notebook I used if there is interest. This model produces nested BHs, and my thinking was that if this model can be tackled, then others should be a lot easier.
I created an Excel table where the rows are the minimum number of steps and the columns the maximum. I followed your example of comparing 2 edges at a time, and some of the boundary edge comparisons turned out to be unnecessary. I noticed that some edge combinations were missing from your singularity scan; mine were basically 3 vs. 5-20, 4 vs. 6-20, 5 vs. 7-20, etc., with the final comparison being 18 vs. 20.
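In case it is useful, here is a minimal sketch of how those combinations could be enumerated programmatically, assuming the min/max values run exactly as listed above (min from 3 to 18, max from min + 2 up to 20):

(* Enumerate the (min, max) step combinations used in the table rows and columns. *)
pairs = Flatten[Table[{min, max}, {min, 3, 18}, {max, min + 2, 20}], 1];
Length[pairs] (* number of separate comparisons this implies *)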
I used:
ResourceFunction[ "WolframModel"][{{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}}, Automatic, 20, {"EdgeCountList"}]
to get the total edge count for each generation, and it seems to differ slightly from your results. It is probable that I missed some adjustment, but either way our edge counts differ only by a few for each generation.
ResourceFunction["CausalConnectionGraph"][Function[{layer}, WolframModel[{{{1, 2, 3}, {2, 4, 5}} ->{{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}}}, Automatic, layer, "CausalGraph"]], 3, 5, VertexLabels -> Automatic]
is what I used for each possible combination (I had to run it many, many times), and it gave me a vertex output in graphics form that was very useful once I got the hang of it. Unfortunately, I had to transcribe the graphics output into the Excel table by hand, as I was not sure what the precise criteria would end up being.
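Here is a sketch of how those repeated runs could be automated, reusing the call above together with the pairs list from earlier; I am assuming CausalConnectionGraph accepts the same min/max arguments for every pair:

(* Run the same CausalConnectionGraph call for every (min, max) pair and *)
(* collect the results in an association keyed by the pair.              *)
rule = {{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}};
scan = AssociationMap[
   Function[{pair},
    ResourceFunction["CausalConnectionGraph"][
     Function[{layer},
      ResourceFunction["WolframModel"][{rule}, Automatic, layer, "CausalGraph"]],
     First[pair], Last[pair], VertexLabels -> Automatic]],
   pairs];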
The legend is:
(x,y) means edges x and y are causally invariant and converge to the same future.
x,y means those edges start separate and do not converge by the end. This is the divergent-edge case, but it does not contain any singularities, as those originated in earlier steps.
x->y means that the future of x coincides with the future of y but not vice versa. This seems to mean that a singularity originated at that point.
Once I filled in the table, I created additional columns (a sketch of how the last two could be computed follows this list):
Singularity Count: the number of x->y events in a given min-max sample. I made sure to ignore repeated singularities; I think your singularity scan unfortunately does not ignore duplicates.
Singularity Total: the total count of singularities for each min-max row. Useful for seeing what the maximum number was at earlier steps. For this model, at step 20 the number of unique singularities was 18.
Divergent Edges: the number of both x->y and x,y events in a given min-max sample. This was harder to count manually, and unlike the singularities it partially depends on the min-max counting scheme. Still, as long as the min-max scheme is kept consistent, the divergent edge counts should stay consistent as well.
Divergent Edges Total: the total count of divergent edges for each min-max row. Useful for seeing what the maximum number was at earlier steps. For this model, at step 20 the number of unique divergent edges was 139.
Edges Total: the total number of edges for a given generation.
Causality: the parameter 1 - (Divergent Edges Total / Edges Total), with 1 being fully causally invariant and 0 being fully divergent.
Causality %: causality expressed as a percentage.
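As a minimal sketch of those last two columns (pure arithmetic; edgesTotalAtStep20 is just a placeholder for the corresponding value in the attached table):

(* Causality and Causality % as defined above. *)
causality[divergentTotal_, edgesTotal_] := 1. - divergentTotal/edgesTotal
causalityPercent[divergentTotal_, edgesTotal_] := 100 causality[divergentTotal, edgesTotal]
causalityPercent[139, edgesTotalAtStep20] (* should reproduce the ~77.9% figure below *)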
At step 20, “Model 24528” appears to be 77.9% causal. Interestingly, looking at causality at every step shows that it starts out 95% causal, drops to 77% at step 18, and then slowly rises over the last 2 steps.
Here, more useful parameters begin to emerge. The derivative of causality can provide an overall trend of the direction the system is taking, and any departure from that trend can indicate some type of phase transition, though in this instance I believe the dip at step 18 is less interesting. Anything happening at 1000+ steps would be another matter.
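A sketch of that idea, where causalityList is just a placeholder for the per-step causality column pulled out of the table:

(* Step-to-step change in causality, i.e. a discrete derivative; departures *)
(* from the overall trend would be candidate phase transitions.             *)
trend = Differences[causalityList];
ListLinePlot[{causalityList, trend}, PlotLegends -> {"causality", "step-to-step change"}]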
I plan on doing other models next, but I suspect most will start at a value near 100% and slowly converge toward it. One thing I was not able to do, due to the uniqueness of the selected model, is to look at the lifetimes of the singularities (since in this case they persist).
I am not sure if you could make your singularity scan produce a table as well and keep the output of the vertex comparison non-graphical. Duplicate results would have to be ignored during counting. If you are willing to provide more details on how you run the singularity scan, I can offer some feedback. Given that there is some overlap in the computation, perhaps intermediate results could be stored and reused by any future singularity scans with a higher number of steps. Furthermore, a higher number of steps could be added to a single output category one step at a time. In other words, if someone wants to see a singularity scan at 100 steps but the highest computed so far is only 20 steps, then there is only one visible category for it, and it gets updated gradually from 20 to 100 steps (instead of displaying multiple results of essentially the same thing).
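A rough sketch of that incremental idea in Wolfram's self-memoizing style; newComparisonsAt is purely a placeholder for whatever your scan computes when one more step becomes available, not something from your code:

(* Each level caches itself, so a deeper scan reuses everything already   *)
(* computed for lower step counts, and duplicates are dropped on the way. *)
singularityScan[0] = {};
singularityScan[n_Integer?Positive] := singularityScan[n] =
  DeleteDuplicates[Join[singularityScan[n - 1], newComparisonsAt[n]]]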
I believe this discussion will lead to even more interesting ideas, but I will save those for next time.
Attachments: