
Memory leak in neural network functionality?

Posted 2 years ago

Hello, has anyone figured out how to actually delete a neural network from GPU memory without restarting Wolfram Desktop? There appears to be a pretty severe memory leak associated with the neural network functionality in WL.

For example,

$HistoryLength = 0
MemoryInUse[]
(* 68363752 *)

net = NetModel[
  "Multi-scale Context Aggregation Net Trained on Cityscapes Data"]
mask = net[ImageResize[CurrentImage[], {128, 128}],
  TargetDevice -> "GPU"]
MemoryInUse[]
(* 746866704 *)

ClearAll[mask, net]
ClearSystemCache[]
MemoryInUse[]
(* 746566120 *)

[Attached screenshot: GPU memory consumption]
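
One workaround that might be worth trying (an untested sketch, not an official fix, and it assumes NetModel and GPU evaluation behave normally inside parallel subkernels): do the GPU evaluation in a subkernel and close that kernel afterwards. The subkernel is a separate OS process, so the driver should reclaim whatever GPU memory it allocated once the process exits.

img = ImageResize[CurrentImage[], {128, 128}];
kernel = First@LaunchKernels[1];
DistributeDefinitions[img];  (* make img available in the subkernel *)
mask = ParallelEvaluate[
   NetModel["Multi-scale Context Aggregation Net Trained on Cityscapes Data"][
    img, TargetDevice -> "GPU"],
   kernel];
CloseKernels[kernel]  (* subkernel exits and its GPU allocation goes with it *)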

POSTED BY: Alec Graves
2 Replies
Posted 2 years ago

This is weird:

NeuralNetworks`MemoryUsageInfoString[]
"S:16,992M K:805M B:10 GPU1:558M GPU2:557M"

NeuralNetworks`MemoryUsageInfo[]
<|"System" -> 16987279360, "Kernel" -> 805609552, "Buckets" -> 10, 
 "GPU1" -> -6441992192, "GPU2" -> -6442254336|>
POSTED BY: Updating Name
Posted 2 years ago

Using NeuralNetworks`ClearCache[] appears to free some CPU memory, but it does nothing about the gigabytes of data sitting in my valuable GPU memory.

This becomes a major problem when you are training, quickly testing different architectures, and restarting the training process:

[Attached screenshot: Task Manager showing GPU memory exhausted]

cudaMalloc retry failed: out of memory
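
Until there is a proper fix, one blunt fallback (a sketch, not an official recommendation) is to restart just the kernel between training runs rather than all of Wolfram Desktop: the kernel process owns the CUDA allocations, so when it exits the driver reclaims that memory. You can also watch usage from the driver's side with nvidia-smi:

RunProcess[{"nvidia-smi", "--query-gpu=memory.used", "--format=csv"},
  "StandardOutput"]  (* GPU memory in use, as the driver sees it *)

Quit[]  (* or Evaluation > Quit Kernel; the exiting kernel releases its GPU memory *)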

POSTED BY: Alec Graves