Very nice! I like non-NN uses of the neural network functions :)
I made a simplified version of this concept:
Set some constants regarding the number of iterations, the frame bounds, and the resolution:
dims = {1000, 1000};
bounds = {{-2, 1}, {-1.5, 1.5}};
iterations = 200;
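For reference, this is the classic escape-time test that the networks below implement (inSet here is a hypothetical helper, not part of the network code), with escape radius 2:

inSet[c_] := Abs[NestWhile[#^2 + c &, 0. + 0. I, Abs[#] <= 2 &, 1, iterations]] <= 2

For example, inSet[-1] gives True while inSet[1] gives False.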
Create a network that runs one iteration of the map z -> z^2 + c, tracking real and imaginary parts separately via (x + I y)^2 = (x^2 - y^2) + 2 x y I; the Clip at the end keeps diverging values from overflowing:
stepnet=NetGraph[
<|
"re"->PartLayer[1],
"im"->PartLayer[2],
"sqre"->ThreadingLayer[#1^2-#2^2&],
"sqim"->ThreadingLayer[2*#1*#2&],
"catenate"->CatenateLayer[],
"reshape"->ReshapeLayer[Prepend[dims,2]],
"c"->ConstantPlusLayer["Biases"->{
ConstantArray[N@Range[dims[[2]]]/dims[[2]]*(bounds[[1,2]]-bounds[[1,1]])+bounds[[1,1]],dims[[1]]],
Transpose@ConstantArray[N@Range[dims[[1]]]/dims[[1]]*(bounds[[2,2]]-bounds[[2,1]])+bounds[[2,1]],dims[[2]]]}],
"clip"->ElementwiseLayer[Clip[#,{-10000,10000}]&]
|>,{
NetPort["Input"]->{"re","im"}->{"sqre","sqim"}->"catenate"->"reshape"->"c"->"clip"->NetPort["Output"]
},"Input"->Prepend[dims,2]]
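As a quick sanity check (illustrative only, not in the original): since 0^2 + c = c, one application of stepnet to the zero array should return the c grid itself:

z1 = stepnet[ConstantArray[0., Prepend[dims, 2]]];
Dimensions[z1]

which gives {2, 1000, 1000}: the real parts of c in the first slice and the imaginary parts in the second.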
Then create a network that applies the single-step network repeatedly and computes the squared norm at each resulting point:
net=NetGraph[<|
"init"->ConstantArrayLayer["Array"->ConstantArray[0,Prepend[dims,2]]],
"steps"->NetNestOperator[stepnet,iterations],
"re"->PartLayer[1],
"im"->PartLayer[2],
"normsq"->ThreadingLayer[#1^2+#2^2&]
|>,
{"init"->"steps"->{"re","im"}->"normsq"->NetPort["Output"]}
]
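The assembled net takes no input (the initial z = 0 grid comes from the ConstantArrayLayer), so net[] directly yields a dims-sized array of squared magnitudes. A quick check (illustrative only):

out = net[];
Dimensions[out]

should give {1000, 1000}.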
Finally, evaluate the network: UnitStep[2^2 - net[]] is 1 exactly where the squared norm stayed at or below 4 (escape radius 2) after all iterations, so points treated as members of the set come out white:
Image@UnitStep[2^2 - net[]]