Hi,
I am reading another of your books, titled "Neural network and deep learning with Mathematica", and have found it very useful, especially Chapters 4 and 5. Mathematica has good documentation for individual layers, but I found it difficult to put the various layers and functions together: specifying inputs and outputs for an entire network as well as for individual layers, choosing loss functions for discrete and continuous values, using NetPort, and so on. Chapters 4 and 5 provide good examples of how to put all the commands together.
In the future, please provide more examples of other layers such as FunctionLayer, RecurrentLayer, AttentionLayer, NetMapOperator, etc.
Thanks for your great work.