Here is an example of what works. There is some bookkeeping up front to define the sparse initialization array, which may distract from the main point, but I don't have time right now to produce a cleaner example. Note that I don't need to freeze a whole layer, only part of one: what I actually need is a non-linear convolution, so only certain entries of the convolution kernel should be frozen while the rest remain trainable. Also, is NetArray the only way to add user-specified learnable parameters?
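For context, the partial-freezing idea itself is framework-agnostic: it amounts to masking the gradient so that the frozen kernel entries never receive updates. Here is a minimal NumPy sketch of that mechanism (the kernel values, mask, and `sgd_step` helper are all hypothetical illustrations, not the Wolfram code above):

```python
import numpy as np

# Hypothetical 3x3 convolution kernel; entries where the mask is
# False are "frozen" and must never change during training.
kernel = np.array([[0.1, 0.2, 0.3],
                   [0.4, 0.5, 0.6],
                   [0.7, 0.8, 0.9]])
trainable_mask = np.array([[True,  False, True],
                           [False, True,  False],
                           [True,  False, True]])

def sgd_step(kernel, grad, mask, lr=0.1):
    # Zero the gradient on frozen entries so they never update;
    # trainable entries take an ordinary gradient-descent step.
    return kernel - lr * (grad * mask)

# One update with a dummy all-ones gradient: frozen entries are
# untouched, trainable entries each move by lr * grad.
grad = np.ones_like(kernel)
updated = sgd_step(kernel, grad, trainable_mask)
```

The same effect can be achieved in most frameworks either by multiplying the gradient by such a mask or, where supported, by setting per-parameter learning-rate multipliers to zero on the frozen entries.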