
Black Hole Discussion (based on “[WSS20] Hunting for Black Holes” post)

Posted 3 years ago

Hey guys,

Please take a look at the project at https://community.wolfram.com/groups/-/m/t/2029731 first. It looks for singularities in WMs, especially ones that persist for at least 20 steps. It introduces some very useful functions that detect the presence of singularities, filter WMs on that criterion, and examine the dimensionality of the system. The authors' conclusions reflect disappointment at not finding the change in dimensionality they assume a Schwarzschild-type BH would produce.

Before I dive into the physics, let me add that I ran their 21 surviving BH models for a greater number of steps, as summarized in the attached picture (I aimed for 50 steps, but some proved too computationally intensive, while others I was able to run for hundreds of iterations).

Four more models lost their singularities (bringing the total down to 17). Here are the questions we can still answer by looking further into this:

• In which of the remaining models do the BHs survive after 100, 200, 300, etc. steps?
• Can models reacquire singularities after losing them?
• If so, we need to map the durations of BH lifetimes and their frequency of occurrence.
• Write a new function that can identify the number of singularities in a given system, as well as whether any of them are nested (a BH inside another BH).

Now for the physics… Given the tiny number of steps that can be run on these models, we are probably looking at vacuum fluctuations on a very small scale. That makes it unlikely that we will observe any BHs forming via gravitational collapse (not enough steps).

What are these singularities, then? To me, they look like topological BHs that have nothing to do with gravitational collapse and whose stability depends on the rewriting rules alone. Now imagine that our expanding universe forms these sub-Planck BHs that leech some of the spacetime into pocket universes. WMs show that nothing special happens in those regions and that they expand the same as everywhere else.

Our own vacuum could carry a specific signature of these topological BHs. Their average density and duration could not only affect our cosmological constant but also make them a dark matter candidate. Moreover, one could try to match one of the WMs to our own universe based on these criteria.

Sooner or later, certain interesting WMs will need to be placed on a server cloud, with a large number of steps computed and stored, to be explored by the community. There is much more to discuss here, but this is probably a good start.

Legend: WM = Wolfram Model | BH = Black Hole | sub-Planck = reference to Stephen's belief that these "atoms of space" are much smaller than the Planck scale


POSTED BY: Anton Spektorov
8 Replies

Thank you Anton for the excellent post.

As you mentioned, there was a difference in the numbers between the PDF and Gigabrain's results.

I did some testing on the difference, and it looks like the reason is a difference in the Wolfram model output used.

The example uses the property EdgeCountList, which the documentation describes as "the list of edge counts for each complete generation"; its final entry is the same as the number of edges in the FinalState.

ResourceFunction[ "WolframModel"][{{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}}, Automatic, 20, {"EdgeCountList"}]

I used the edge count from the LayeredCausalGraph by calculating the Length of the EdgeList.

And the difference is that the LayeredCausalGraph has 626 edges while the FinalState has 630 edges.
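For reference, here is a minimal sketch (using the same rule and 20 steps as above) of how the two counts can be computed side by side; the 630 and 626 quoted in this thread are just what we observed, so treat them as thread-specific figures rather than guaranteed outputs:

rule = {{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}};
(* last entry of EdgeCountList = edge count of the final-state hypergraph *)
finalStateEdges = Last[ResourceFunction["WolframModel"][rule, Automatic, 20, "EdgeCountList"]];
(* edge count of the layered causal graph *)
causalGraphEdges = Length[EdgeList[
   ResourceFunction["WolframModel"][rule, Automatic, 20, "LayeredCausalGraph"]]];
{finalStateEdges, causalGraphEdges}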

We could continue this by email. If you send mail to tuomas@gigabrain.io, we can look at this in more detail.

Thank you very much,

Tuomas

POSTED BY: Tuomas Sorakivi

Thank you, Anton, for the thorough answer.

At that time there were many 3D video generations taking time in the queue, which is why it took so long for the generations to show up. Now all of the Wolfram models from the registry also have a 3D video generated, and the queue is not that full at the moment.

Generations above 35 have so many edges in the graph that the analysis image and SVG file are not produced; this is something I will fix in upcoming versions. The error you saw was because the image file from the analysis stage was not found.

Your point about distributed computation is very interesting. Following your post, I've made changes and added a new action called "Singularity scan". It scans the range and stores the number of edges in each generation and the number of causally connected edges for each compared pair of generations. As you said, every generation has to be compared, and I built the function with that in mind.

The analysis stage can only be run on generations that are possible to compare, i.e., that do not have an edge count above a certain limit.

After that I will look at calculating the "causality" parameter you mentioned.

This weekend I will do more testing; in a few days I'll have the scan part ready and I'll write more about it.

Thank you Tuomas

POSTED BY: Tuomas Sorakivi

Hi Tuomas, thanks for your reply and sorry it took me so long to get back to you. I played with your new parameter (singularity scan), but unfortunately it is still in the queue, as I am sure it is very computationally expensive. On the other hand, the 3D videos you mentioned have finally finished rendering and they are amazing. It seems the rendering cuts off after around 5000 steps, but that is more than enough iterations to glimpse some very interesting properties.

In any case, I decided to do "Model 24528" manually so I could compare with what you have on your Gigabrain website and hopefully give some constructive feedback. I am attaching a PDF of the printed Excel table I used to organize the results, but I can provide the Wolfram notebook I used if there is interest. This model produces nested BHs, and my thinking was that if this model can be tackled, then others should be a lot easier.

I created an Excel table where the rows were the minimum number of steps and the columns the maximum. I followed your example of using 2 edges at a time, and some of the boundary edge comparisons were not needed. I noticed that some edge combinations were missing in your singularity scan; mine were basically 3 to 5-20, 4 to 6-20, 5 to 7-20, etc., with the final comparison being 18-20.

I used:

ResourceFunction[ "WolframModel"][{{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}}, Automatic, 20, {"EdgeCountList"}]

to get a total edge count for each generation, and it seems to differ slightly from your results. It is probable that I missed some adjustment, but either way our edge counts only differ by a few for each generation.

ResourceFunction["CausalConnectionGraph"][Function[{layer}, WolframModel[{{{1, 2, 3}, {2, 4, 5}} ->{{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}}}, Automatic, layer, "CausalGraph"]], 3, 5, VertexLabels -> Automatic]

This is what I used for each possible combination (I had to run it many, many times), and it gave me a vertex output in graphic form that was very useful once I got the hang of it. Unfortunately, I had to manually transcribe the picture output into the Excel table, as I was not sure of the precise criteria that would follow.

Legend is:

(x,y) means edges x and y are causally invariant and converge to the same future

x,y means those edges start separate and do not converge by the end. This is an example of divergent edges, but it does not indicate any singularities, as those originated in earlier steps

x->y means that future of x coincides with future of y but not vise versa. This seems to mean that singularity originated at that point

Once I filled the table, I created more columns:

Singularity Count: the number of x->y events in a given min-max sample. I made sure to ignore repeated singularities; I think your singularity scan unfortunately does not ignore duplicates.

Singularity Total: the total count of singularities for each min-max row. Useful for seeing what the maximum number was at earlier steps. For this model, at step 20 the number of unique singularities was 18.

Divergent Edges: the number of both x->y and x,y type events in a given min-max sample. This was harder to count manually, and unlike singularities it will partially depend on the format of the min-max counting. Still, as long as the min-max format is consistent, the divergent edge counts should remain consistent as well.

Divergent Edges Total: the total count of divergent edges for each min-max row. Useful for seeing what the maximum number was at earlier steps. For this model, at step 20 the number of unique divergent edges was 139.

Edges Total: the total number of edges for a given generation.

Causality: the parameter 1 - (Divergent Edges Total / Edges Total), with 1 being fully causally invariant and 0 being fully divergent (a small worked example follows below).

Causality %: causality in % format.
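As a quick check of that formula, here is a minimal sketch using the step-20 figures from this post (139 unique divergent edges); the total of 630 edges is taken from the FinalState count quoted earlier in the thread, so it is an assumption here:

divergentEdgesTotal = 139;  (* unique divergent edges at step 20, from the table *)
edgesTotal = 630;           (* assumed total edge count at step 20, per the thread *)
causality = 1. - divergentEdgesTotal/edgesTotal  (* ≈ 0.779, i.e. roughly 77.9% causal *)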

At step 20 “Model 24528” appears to be 77.9% causal. Interestingly, looking at causality at every step shows that it starts out 95% causal and drops to 77% at step 18 before slowly rising for the last 2 steps.

Here, more useful parameters begin to emerge. The derivative of causality can give the overall trend of the direction the system is taking. Any deviation from that trend can indicate some type of phase transition, though in this instance I believe the minimum at step 18 is less interesting. Anything happening at 1000+ steps would be another matter.
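A minimal sketch of that trend check, using purely illustrative causality values rather than actual measurements from the table:

causalityPerStep = {0.95, 0.93, 0.90, 0.86, 0.82, 0.80, 0.78, 0.779, 0.78, 0.781};  (* illustrative only *)
trend = Differences[causalityPerStep];  (* discrete "derivative" of causality per step *)
ListLinePlot[trend, AxesLabel -> {"step", "change in causality"}]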

I plan to do other models next, but I suspect most will start at a value near 100% and slowly converge back to it. One thing I was not able to do, due to the uniqueness of the selected model, is look at the lifetimes of the singularities (since they persist in this case).

I am not sure if you could make your singularity scan into a table as well and keep the output of the vertex comparison non-graphic. Duplicate results would have to be ignored during counting. If you are willing to provide more details on how you run the singularity scan, I can offer some feedback. Given that there is some overlap in the computation, perhaps intermediate results could be stored and reused by any future singularity scans with a higher number of steps. Furthermore, a higher number of steps could be made to add to a single output category one step at a time. In other words, if someone wants to see a singularity scan at 100 steps but the highest computed so far was only 20 steps, then there is only one visible entry for it, and it gets updated gradually from 20 to 100 steps (instead of displaying multiple results of essentially the same thing).
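On the table and deduplication point, here is a minimal sketch of what a non-graphic, tabulated output could look like; the event list and the min/max/event column names are purely hypothetical placeholders, not output from the actual scan:

(* hypothetical scan output: each entry is {min, max, x -> y} for one detected x->y event *)
events = {{3, 10, 12 -> 15}, {3, 12, 12 -> 15}, {4, 12, 20 -> 22}};
uniqueSingularities = DeleteDuplicatesBy[events, Last];  (* count each x->y pair only once *)
Dataset[AssociationThread[{"min", "max", "event"} -> #] & /@ uniqueSingularities]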

I believe this discussion will lead to even more interesting ideas, but I will save them for next time.

Attachments:
POSTED BY: Anton Spektorov

Thank you, Anton, for the very interesting post. I made a service for searching for singularity points in certain generations.

Singularity example from model 3975

Example singularity analysis

You can test finding a singularity point by going to

gigabrain.io Search model

and searching for a model by its numeric id. Then click the Videos button. There you can choose Analyse singularity from the list and enter the generations to examine into the input box in the format:

5_20 where the first integer is the minimum generation and the second is the maximum generation.

Then click the "Generate video" button to also launch the singularity analysis. The analysis takes some time and automatically updates the findings on the screen below.

Some generations take time to analyse, so choose the generation numbers carefully. If the update does not show, email info@gigabrain.io, or use lower generations or generations close to each other, like 6_7.

And as you mentioned, the results with larger step counts are also stored in the cloud. I was planning a method to examine the results, and maybe this could be a topic for more discussion.

Thank you for the discussion.

POSTED BY: Tuomas Sorakivi

Hi Tuomas, appreciate the feedback!

I did test it on Gigabrain this past week, and it took me some time to figure out how the requests are processed and where they end up showing. I tried it on 2 WMs (24528 and 3975), and it does take a while (days) for a higher number of steps; however, if it saves the results to the cloud, it should not matter in the long run. Unfortunately, generations above 35 seem to give an error…

This XML file does not appear to have any style information associated with it. The document tree is shown below.
<Error>
Code>AccessDenied</Code Message>Access Denied</Message RequestId>JQEG9VE85B4NMT8A</RequestId HostId>kZeWl/i9YBPWU4n0G9GlJ+ODqY83BsuyAxKQZ7F6UHq+kptCjH1yA5RTVeoPjW99m9tsjS0IyuI=</HostId /Error>

Hopefully this part can be fixed, as all the interesting stuff will be happening at later generations. One idea/question I've had is the ability to upload a computation directly to Gigabrain for a particular model. Say I ran a search for singularities over 100 generations (5_100) all day on my computer and it finally gave me the results; perhaps those results could be uploaded directly instead of having to repeat the computation. I can see that some sort of authenticity check might be a good idea, making sure the data matches the model number and computation type.

As far as a general method of examining these singularities goes, it might end up being a little tricky. Causal invariance, so often mentioned in the Wolfram Physics Project, is basically the claim that any divergence in the causal graph is always temporary. In terms of these singularities, I think the same claim means that none of them will persist for very long.

One could come up with a new parameter called "causality" that describes how causally invariant a model is at a certain number of generations. A value of 1 would indicate that every generation is causally invariant. Any singularities would then lower that value depending on how long they persist. The lowered value could be the ratio of Black Hole regions to the total number (or rather the inverse of that). So, say model 24528 was run for 7 generations and has a total of 18 steps (partial generations). At the 5th generation a singularity forms at step 7. At the 7th generation we count the total number of steps (18 in our case) and also the steps that end up inside that singularity (7). Our "causality" parameter would be 1 - 7/18 ≈ 0.61, indicating the overall tendency of the causal structure to recombine its steps. If Stephen and Jonathan are right, then after a large enough number of generations that value will approach 1. Nested singularities present an added difficulty here and perhaps should be treated as an extra anti-causal hit to the value.

Unfortunately, all of this means that a rigorous analysis of the causal graph is required. A graph of 100 generations would have to be run not only as 1_100, 2_100, 3_100, …, 99_100 but also at lower numbers of steps to determine the duration of singularities. That is a lot of computation! It would almost be easier to examine it visually with some sort of AI algorithm. It would also have been nice if singularity regions could be highlighted even if the singularity itself went away at some later generation.
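To get a feel for the amount of computation involved, here is a minimal sketch that just enumerates the (min, max) comparisons implied above; the per-pair analysis itself would use CausalConnectionGraph as elsewhere in this thread, and the 20-generation cap is only to keep the example small:

maxGen = 20;  (* the example above uses 100 *)
pairs = Flatten[Table[{min, max}, {max, 2, maxGen}, {min, 1, max - 1}], 1];
Length[pairs]  (* maxGen (maxGen - 1)/2 comparisons: 190 here, 4950 for 100 generations *)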

The most interesting part about all of this is that macroscopic causality appears to emerge from microscopic randomness (randomness in the sense that we cannot pick and choose which update will be applied to our foliation of space). The underlying rule will ultimately determine whether a causal structure emerges at all and, if it does, at what scale (number of generations). Personally, I am hoping for a very tiny but persistent percentage of singularities at very high generations, pointing to some interesting physics.

POSTED BY: Anton Spektorov

And thanks again for the feedback. I made a Singularity scan that scans for singularities in a certain range. The range can be input in the analysis as

5_20 where the first integer is the minimum scan generation and the second is the maximum scan generation.

It outputs all the combinations within that range as:

Connected edges, which is the Length of the EdgeList from CausalConnectionGraph.

Edges per generation, which is the Length of the EdgeList from LayeredCausalGraph.

The Edges per generation could be used to filter the combinations that are possible to analyse, i.e., where the number of generations is not so high that the computation takes too long or too much memory.
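A minimal sketch of what one pass of such a scan might look like, mirroring the CausalConnectionGraph call used earlier in the thread; the small 5_8 range and the helper name connectedEdges are only for illustration:

rule = {{{1, 2, 3}, {2, 4, 5}} -> {{6, 1, 7}, {6, 5, 8}, {7, 5, 3}, {2, 4, 6}}};
connectedEdges[min_, max_] := Length[EdgeList[
   ResourceFunction["CausalConnectionGraph"][
    Function[{layer},
     ResourceFunction["WolframModel"][rule, Automatic, layer, "CausalGraph"]],
    min, max]]];
(* collect the connected-edge count for every (min, max) pair in the 5_8 range *)
scan = Association@Flatten@Table[
    {min, max} -> connectedEdges[min, max], {max, 6, 8}, {min, 5, max - 1}];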

I will also look at the causality parameter next.

Thanks and have a great week.

Tuomas

POSTED BY: Tuomas Sorakivi

Interesting experiment. Please develop your ideas and publish some results in a physics journal. Most of my publications are based on experiments carried out on a personal computer.

I was originally confused about how exactly the BH code worked, to the point where I could not customize it for my purposes.

However, shortly after making this post I discovered that the "CausalConnectionGraph" function is robust enough to give the needed information. The problem I am facing now is trying to use the "WolframModelEvolutionObject" feature together with the "CausalConnectionGraph" function to save time on computation. Any help here would be appreciated; I can't seem to find any examples where those two features are used together.

Another thing I found out is that the step number at which the comparison between events starts can matter a lot when looking for singularities. While steps preceding step 5 are not necessarily important, later ones can give rise to BHs that only start as such at a certain iteration and not before. That means a simple search for steps 5 to X is simply not a good way to do it; one must essentially compare from every step below the maximum number computed.

I played with a few WMs this way and found that rule 24528, for example, not only has nested BHs but appears to be a never-ending process. Instead of turtles all the way down, it could be Black Holes all the way down.

Wolfram mentioned early on that causal invariance basically depends on multiple paths eventually reconverging. Well, by that definition, WMs that have persistent BHs are not causally invariant, and theories such as GR would not work in that case. I am not sure that is absolutely true, as you can have a mostly causally invariant system in the continuum limit while still persistently generating small BH patches that never resolve themselves. In fact, the concepts of computational reducibility/irreducibility could be directly related to this.

Lee Smolin has a theory of cosmological evolution that contains nested BHs, but I do not think this is the same thing, unless every nested BH somehow experiences a big bang as well. Dark matter, on the other hand, could be made of these tiny pinpricks of curved space, able to clump together with others of their kind.

I will be more than happy to develop this further, though I could use some help with the code, and I was also hoping for some sort of discussion on the subject.

POSTED BY: Anton Spektorov