
Black Hole Discussion (based on “[WSS20] Hunting for Black Holes” post)

Posted 4 years ago

Hey guys,

Please make sure to glance at the https://community.wolfram.com/groups/-/m/t/2029731 project first. It looks for singularities in WMs, especially ones that persist for at least 20 steps. It defines some very useful functions that detect the presence of singularities, filter WMs on that criterion, and examine the dimensionality of the system. Its conclusions reflect disappointment at not finding the change in dimension that, they assume, a Schwarzschild-type BH would produce.

Before I dive into the physics, let me add that I ran their 21 surviving BH models for a greater number of steps, as summarized in the attached picture (I aimed for 50 steps, but some proved too computationally intensive, while others I was able to run for hundreds of iterations).

Four more models lost their singularities (bringing the total down to 17). Here are the questions we can still answer by looking further into this:

• Which of the remaining models' BHs survive after 100, 200, 300, etc. steps?
• Can models reacquire singularities after losing them?
• If so, we need to map the durations of BH lifetimes and their frequency of occurrence.
• Write a new function that can identify the number of singularities in a given system, as well as whether any of them are nested (a BH inside another BH); a rough sketch follows this list.
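
To make the last item concrete, here is a minimal sketch of how such a counter could start. It assumes (and this is only an assumption) that the CausalConnectionGraph resource function used later in this thread returns a Graph in which a one-directional edge marks a candidate singularity; causal and singularityCount are hypothetical names, and rule is the "Model 24528" rule discussed further down. Detecting nesting would additionally require comparing the singularities' causal cones, which is left out here.

    rule = {{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}};
    (* causal graph of the model evolved to a given layer *)
    causal[layer_] := ResourceFunction["WolframModel"][rule, Automatic, layer, "CausalGraph"];
    (* count one-directional connections x -> y, i.e. candidate singularities *)
    singularityCount[min_, max_] := With[
      {g = ResourceFunction["CausalConnectionGraph"][causal, min, max]},
      Count[EdgeList[g], DirectedEdge[x_, y_] /; ! EdgeQ[g, DirectedEdge[y, x]]]];
    singularityCount[3, 5]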

Now for the physics… Given the tiny number of steps that can be run on these models, we are probably looking at vacuum fluctuations at a very small scale. That makes it unlikely we would observe any BHs forming via gravitational collapse (not enough steps).

What are these singularities, then? To me, they look like topological BHs that have nothing to do with gravitational collapse and whose stability depends on the rewriting rules alone. Now imagine that our expanding universe forms these sub-Planck BHs, which leech some of the spacetime into pocket universes. WMs show that nothing special happens in those regions and that they expand the same as everywhere else.

Our own vacuum could carry a specific signature of these topological BHs. Their average density and duration could not only affect our cosmological constant but also make them a dark matter candidate. Moreover, one could try to match one of the WMs to our own universe based on these criteria.

Sooner or later, certain interesting WMs will need to be placed on the server cloud, with a large number of steps computed and stored, so they can be explored by the community. There is much more to discuss here, but this is probably a good start.

Legend: WM = Wolfram Model | BH = Black Hole | sub-Planck = a reference to Stephen Wolfram's belief that these “atoms of space” are much smaller than the Planck scale

[Attached image: summary table of the extended runs]

POSTED BY: Anton Spektorov
8 Replies

And thanks again for the feedback. I made a "Singularity scan" that scans for singularities in a given range. The range can be entered in the analysis as

5_20, where the first integer is the minimum scan generation and the second is the maximum scan generation.

That outputs all the combinations within that range as:

Connected edges, which is the Length of the EdgeList from CausalConnectionGraph.

Edges per generation, which is the Length of the EdgeList from LayeredCausalGraph.

The edges per generation can be used to select the combinations that are feasible to analyse, i.e. those where the number of generations is not so high that the computation takes too long or uses too much memory (a sketch of such a scan is below).
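
For reference, here is a minimal sketch of what such a scan might look like in Wolfram Language. The "5_20" range-string parsing and the name singularityScan are my assumptions, and rule stands in for whichever model is being scanned:

    rule = {{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}};
    causal[layer_] := ResourceFunction["WolframModel"][rule, Automatic, layer, "CausalGraph"];
    singularityScan[range_String] := Module[{min, max},
      {min, max} = ToExpression /@ StringSplit[range, "_"];
      <|
        (* edges per generation: Length of the EdgeList from LayeredCausalGraph *)
        "EdgesPerGeneration" -> Table[
          Length@EdgeList@ResourceFunction["WolframModel"][rule, Automatic, g, "LayeredCausalGraph"],
          {g, min, max}],
        (* connected edges: Length of the EdgeList from CausalConnectionGraph,
           for every generation pair within the range *)
        "ConnectedEdges" -> Flatten@Table[
          {g1, g2} -> Length@EdgeList@ResourceFunction["CausalConnectionGraph"][causal, g1, g2],
          {g1, min, max - 1}, {g2, g1 + 1, max}]
      |>];
    singularityScan["5_20"]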

I will also look at the causality parameter next.

Thanks and have a great week.

Tuomas

POSTED BY: Tuomas Sorakivi

Thank you, Anton, for the thorough answer.

At that time there were many 3D video generations taking time in the queue, which is why it took so long for the generations to show up. Now all of the Wolfram models from the registry have a 3D video generated as well, and the queue is not that full at the moment.

Generations above 35 have so many edges in the graph that the analysis image and SVG file are not produced; this is something I will fix in the following versions. The error you saw occurred because the image file was not found by the analysis stage.

Your point about distributed computation is very interesting. As your post suggested, I've made changes to add a new action called "Singularity scan". It scans the range and stores the number of edges for each generation and the number of causally connected edges for each pair of compared generations. As you said, every generation has to be compared, and I built the function toward that.

And the analysis stage can be run only on generations that are feasible to compare, i.e. those whose edge counts do not exceed a certain limit.

After that, I will look at the calculation of the "causality" parameter, as you mentioned.

This weekend I will do more testing; in a few days I'll have the scan part ready and will write more about it.

Thank you Tuomas

POSTED BY: Tuomas Sorakivi

Thank you, Anton, for the very interesting post. I made a service that searches for singularity points in given generations.

[Image: singularity example from model 3975]

[Image: example singularity analysis]

You can test finding a singularity point by going to

gigabrain.io Search model

And searching for a model by its numeric ID. Then click on the Videos button. There you can choose "Analyse singularity" from the list and enter the generations to examine into the input box in the format:

5_20, where the first integer is the minimum generation and the second is the maximum generation.

Then click on the "Generate video" button to launch the singularity analysis as well. The analysis takes some time and automatically updates the findings on the screen below.

Some generations take a while to analyse, so choose the generation numbers carefully. If the update does not show, email info@gigabrain.io, or use lower generations or generations close to each other, like 6_7.

And as you mentioned, the results are also stored in the cloud with larger step counts. I was planning a method to examine the results, and maybe this could be a topic for more discussion.

Thank you for the discussion.

POSTED BY: Tuomas Sorakivi

Thank you, Anton, for the excellent post.

As you mentioned, there was a difference between the numbers in the PDF and those on gigabrain.

I did some testing on the difference, and it looks like the reason is a difference in the Wolfram model property used.

Your example uses the property EdgeCountList, which the documentation describes as "the list of edge counts for each complete generation". It is the same as the number of edges in the FinalState.

ResourceFunction["WolframModel"][{{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}}, Automatic, 20, {"EdgeCountList"}]

I used the edge count from the LayeredCausalGraph, calculated as the Length of its EdgeList.

And the difference is that the LayeredCausalGraph has 626 edges while the FinalState has 630 (a sketch for reproducing the comparison follows).
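
For anyone who wants to reproduce the comparison, a quick sketch showing the two counts side by side (only calls that already appear in this thread are used):

    rule = {{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}};
    (* last entry of EdgeCountList = the edge count of the FinalState *)
    finalStateEdges = Last@ResourceFunction["WolframModel"][rule, Automatic, 20, "EdgeCountList"];
    (* edge count of the layered causal graph *)
    causalEdges = Length@EdgeList@ResourceFunction["WolframModel"][rule, Automatic, 20, "LayeredCausalGraph"];
    {finalStateEdges, causalEdges}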

We could continue this further by email. Send mail to tuomas@gigabrain.io and we can look at this in more detail.

Thank you very much,

Tuomas

POSTED BY: Tuomas Sorakivi

Hi Tuomas, thanks for your reply, and sorry it took me so long to reply back. I played with your new singularity scan, but unfortunately it is still in the queue, as I am sure it is very computationally expensive. On the other hand, I have seen the 3D videos you mentioned that finally finished rendering, and they are amazing. It seems the rendering cuts off after around 5000 steps, but that is more than enough iterations to glimpse some very interesting properties.

In any case, I decided to do "Model 24528" manually so I could compare with what you have on your Gigabrain website and hopefully give some constructive feedback. I am attaching a PDF of a printed Excel table that I used to organize the results, but I can provide the Wolfram notebook I used if there is interest. This model produces nested BHs, and my thinking was that if this model can be tackled, the others should be a lot easier.

I created an Excel table where rows were the minimum number of steps and columns the maximum. I followed your example of comparing two generations at a time, and some of the boundary edge comparisons were not needed. I noticed that some edge combinations were missing from your singularity scan; mine were basically 3 vs. 5-20, 4 vs. 6-20, 5 vs. 7-20, etc., with the final comparison being 18-20.

I used:

ResourceFunction["WolframModel"][{{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}}, Automatic, 20, {"EdgeCountList"}]

to get the total edge number for each generation; it seems to differ slightly from your results. It is probable that I missed some adjustment, but either way our edge counts differ by only a few per generation.

ResourceFunction["CausalConnectionGraph"][Function[{layer}, ResourceFunction["WolframModel"][{{{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}}}, Automatic, layer, "CausalGraph"]], 3, 5, VertexLabels -> Automatic]

is what I used for each possible combination (I had to run it many, many times), and it gave me a vertex output in graphical form that was very useful once I got the hang of it. Unfortunately, I had to manually transcribe the picture output into the Excel table, as I was not sure of the precise criteria that would follow. A sketch of how these repeated runs could be scripted is below.
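
A rough sketch: pairs follows the 3 vs. 5-20, …, 18-20 scheme above, and graphs is a hypothetical cache of the results for later counting:

    rule = {{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}};
    causal[layer_] := ResourceFunction["WolframModel"][rule, Automatic, layer, "CausalGraph"];
    (* every (min, max) pair used in the table *)
    pairs = Flatten[Table[{m, M}, {m, 3, 18}, {M, m + 2, 20}], 1];
    graphs = Association[
      (# -> ResourceFunction["CausalConnectionGraph"][causal, First[#], Last[#]]) & /@ pairs];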

The legend is as follows (a code sketch of the three cases appears after the list):

(x,y) means edges x and y are causally invariant and converge to the same future.

x,y means those edges start out separate and do not converge by the end. This is an example of divergent edges, but it does not indicate any singularities, as those would have originated in earlier steps.

x->y means that the future of x coincides with the future of y but not vice versa. This seems to mean that a singularity originated at that point.
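
A sketch of this legend as code, under the unverified assumption that CausalConnectionGraph returns a Graph in which DirectedEdge[x, y] encodes "the future of x contains the future of y"; classify is a hypothetical helper:

    classify[g_Graph, x_, y_] := Which[
      EdgeQ[g, DirectedEdge[x, y]] && EdgeQ[g, DirectedEdge[y, x]], "(x,y)", (* converge to the same future *)
      EdgeQ[g, DirectedEdge[x, y]], "x->y", (* one-way: a singularity originated here *)
      EdgeQ[g, DirectedEdge[y, x]], "y->x",
      True, "x,y"] (* divergent, no connection *)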

Once I had filled in the table, I created more columns:

Singularity Count: the number of x->y events in a given min-max sample. I made sure to ignore repeated singularities; I believe your singularity scan unfortunately does not ignore duplicates.

Singularity Total: the total count of singularities for each min-max row. Useful for seeing what the maximum number was at earlier steps. For this model, the number of unique singularities at step 20 was 18.

Divergent Edges: the number of both x->y and x,y type events in a given min-max sample. This was harder to count manually, and unlike singularities it partially depends on the format of the min-max counting. Still, as long as the min-max format is consistent, the divergent edge counts should be as well.

Divergent Edges Total: the total count of divergent edges for each min-max row. Useful for seeing what the maximum number was at earlier steps. For this model, the number of unique divergent edges at step 20 was 139.

Edges Total: the total number of edges for a given generation.

Causality: the parameter 1 - (Divergent Edges Total / Edges Total), with 1 being fully causally invariant and 0 fully divergent (a small numeric check follows this list).

Causality %: the same value expressed as a percentage.
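
As a quick numeric check of the Causality definition: plugging in the step-20 numbers from this thread (139 divergent edges, 630 total edges) reproduces the 77.9% figure quoted below.

    causality[divergentTotal_, edgesTotal_] := N[1 - divergentTotal/edgesTotal];
    causality[139, 630]  (* 0.779365, i.e. ~77.9% *)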

At step 20, “Model 24528” appears to be 77.9% causal. Interestingly, tracking causality at every step shows that it starts out 95% causal and drops to 77% at step 18 before slowly rising over the last 2 steps.

Here, more useful parameters begin to emerge. The derivative of causality can reveal the overall trend of the direction the system is taking, and any departure from that trend can indicate some type of phase transition, though in this instance I believe the minimum at step 18 is less interesting. Anything happening at 1000+ steps would be another matter.
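
To illustrate the derivative idea, finite differences over the per-step causality values are enough. The list below is made of placeholders, except for the 95% start and the 77.9% step-20 value mentioned in this post:

    (* illustrative values only: 0.95 and 0.779 are from this post, the rest are made up *)
    causalityByStep = {0.95, 0.92, 0.88, 0.83, 0.79, 0.77, 0.772, 0.779};
    Differences[causalityByStep]  (* the sign change near the end marks the step-18 minimum *)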

I plan on doing other models next, but I suspect most will start at a value near 100% and slowly converge to it. One thing I was not able to do, due to the uniqueness of the selected model, is look at the lifetimes of the singularities (since they persist in this case).

I am not sure if you could turn your singularity scan into a table as well and keep the output of the vertex comparison non-graphical. Duplicate results would have to be ignored during counting. If you are willing to provide more details on how you run the singularity scan, I can offer some feedback. Given that there is some overlap in the computation, perhaps intermediate results could be stored and reused by any future singularity scans with a higher number of steps (a sketch of this caching idea follows). Furthermore, higher step counts could be accumulated into a single output category one step at a time. In other words, if someone wants to see a singularity scan at 100 steps but the highest computed so far is only 20, then there is only one visible category for it, and it gets updated gradually from 20 to 100 steps (instead of displaying multiple results of essentially the same thing).
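
The caching idea might be as simple as memoizing each pairwise comparison so that a deeper scan reuses everything already computed. A sketch, with connectionGraph as a hypothetical wrapper and causal as in the earlier snippets:

    (* the first call computes and stores the result; later calls reuse it *)
    connectionGraph[g1_, g2_] := connectionGraph[g1, g2] =
      ResourceFunction["CausalConnectionGraph"][causal, g1, g2];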

I believe this discussion will lead to even more interesting ideas, but I will save them for next time.

Attachments:
POSTED BY: Anton Spektorov

Interesting experiment. Please develop your ideas and publish some results in a physics journal. Most of my publications are based on experiments done on a personal computer.
