Hi -- As a stopgap until 11.1, when I can use my 1080 GPU, I've switched to a machine with a 970, but I'm having a different problem there. I can train a simple (stripped-down parameters) network on the CPU, but when I set the GPU as the target, training starts to run, then the screen flashes, it stops, and most of the previously evaluated results revert to undefined.
For reference, I have successfully run this same code targeting a Quadro 5000M GPU, where it worked perfectly.
I hope it's something simple that I need to do differently, but I don't know what to try next. I tried uninstalling and reinstalling the display driver, but that didn't seem to matter. I'm currently using the latest driver from Nvidia/EVGA for my card (an FTW edition, at all factory settings).
I've attached a notebook that includes a call to SystemInformation, along with both the CPU and GPU training calls.
Thanks for any thoughts! -- David