I have a few GPU/CPU-loaded Linux boxes in my lab, used largely for rendering. I want to compare the new 11.1 neural network functionality to some TensorFlow work we do here, and of course, since I have all that iron back there, I want to use it.
I don't want to have to actually walk all that way, and besides, they are headless (or head-impaired).
I haven't tried using remote kernels for anything other than Parallel stuff in quite a while (like, 1990 or so, with a Cray). I've noticed that the 'Basic' settings in Evaluation -> Kernel Configuration Options don't work, so I had to get a connection going manually; thanks to this SE answer I was able to cobble something together that works.
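For reference, the manual route amounts to having the local side listen on a TCPIP link and launching the remote kernel over ssh to connect back. Roughly (user, host, and port here are illustrative, and the flags are my reading of the WSTP command-line options, so double-check against the SE answer):

```
Local kernel configuration (Advanced Options):
  -LinkMode Listen -LinkProtocol TCPIP

Remote side, started over ssh:
  wolfram -wstp -linkmode Connect -linkprotocol TCPIP -linkname "12345@my-desktop"
```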
Except -
Things that contain Dynamic content aren't happy. To wit, here's an example, first evaluated on the local machine, then evaluated on the remote -
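A minimal reproduction looks something like this (the actual net doesn't matter; the layers here are just a stand-in — any NetChain with its interactive summary panel triggers it):

```
(* any net whose output formatting uses a Dynamic summary panel *)
net = NetChain[{LinearLayer[64], Ramp, LinearLayer[10], SoftmaxLayer[]}]

(* on the local kernel: renders the usual interactive NetChain panel *)
(* on the remote kernel: comes back as a red-box GridBox error instead *)
```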
The resulting NetChain isn't formatted when it comes back from the remote kernel. The error -
A GridBox with an invalid first argument was encountered. The first argument to GridBox must be a rectangular matrix (nested list).
makes a little bit of sense once you go spelunking in the returned expression -
DynamicBox[GridBox[{{
NeuralNetworks`Private`NetChain`MouseClickBoxes[...
suggests to me that some of what comes back (note the NeuralNetworks`Private` context) doesn't get faithfully interpreted on the local side when it returns from the remote kernel.
This wouldn't be a problem, except that I'd like to run the training remotely and watch its progress on the front end. The diagnosis is further reinforced by the fact that, if I cut and paste the 'red-box' output from the remote-kernel notebook into a local-kernel notebook, it renders properly.
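In the meantime, one workaround I'm considering (a sketch only, untested on my setup — `net` and `trainingData` stand in for the real problem) is to sidestep the Dynamic panel entirely by asking NetTrain for textual progress:

```
(* train on the remote kernel without Dynamic output *)
trained = NetTrain[net, trainingData,
   TrainingProgressReporting -> "Print"]  (* plain printed progress instead of the Dynamic panel *)
```

That gives up the live panel, but printed progress lines should survive the trip from the remote kernel intact.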
Any thoughts on how to get the local front end to jibe with the output of the remote kernel?
(NB - there is a somewhat vague response to my post on SE that says "this has been broken since 11.1", which, if true, maybe explains it. But I'm posting here to see if there's any official feedback.)