Does running NetModel[_, TargetDevice->"GPU"] have garbage collection?

If I use net = NetModel["some model"] and then apply the model as net[{File[path to image1], File[path to image2], ...}, TargetDevice -> "GPU"] a few times on different sets of images, I eventually get the error below about exhausting GPU memory. From that point on, any new run of net (even on a single image) throws the same error. I suspect I need some way to purge GPU memory, which I had assumed NetModel would handle.

If I instead run net[{File[path to image1], File[path to image2], ... all files}, TargetDevice -> "GPU"] as one call over the full list, the error never appears, possibly because the framework manages BatchSize intelligently and does not leave stale allocations on the GPU. I ran into this because I was manually chopping my data into batches so I could estimate how long the full list of images would take to encode. I tried playing with BatchSize as an option for NetModel, but it made no difference.

NetChain::gpumemex: Computation aborted: GPU memory exhausted. Try to specify a smaller BatchSize.
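
For concreteness, here is roughly what the two workflows look like; the model name, image directory, and batch size of 64 are placeholders rather than my actual values:

    (* rough sketch of the two ways I am calling the net *)
    net = NetModel["some model"];
    imageFiles = File /@ FileNames["*.jpg", "path/to/images"];

    (* manual batching, done only to time the work: after a few of these
       calls the NetChain::gpumemex error appears *)
    batches = Partition[imageFiles, UpTo[64]];
    timings = Table[
      First@AbsoluteTiming[net[batch, TargetDevice -> "GPU"]],
      {batch, batches}];

    (* a single call over every file never throws the error for me *)
    allResults = net[imageFiles, TargetDevice -> "GPU"];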

POSTED BY: Kyle Keane