Mathematica v12.3: CUDA GPU still not working

I am using Mathematica v12.3 on an Aurora-Ryzen with a GeForce RTX 3090 and v11.3 of the NVIDIA CUDA toolkit. All the latest paclets/drivers are installed.

There are multiple reports of Mathematica not working properly with NVIDIA CUDA (cf. https://community.wolfram.com/groups/-/m/t/2141352). Unfortunately, these appear to have fallen on deaf ears at WR. This is a pity, as it means there is no option but to use Python or Matlab, where deep learning applications work flawlessly.

This deficiency in GPU computation, together with the absence of any functionality for reinforcement learning, leaves the Wolfram Language trailing its competitors in data science and machine learning, an important field in which, I would argue, it should be at the forefront.

Mathematica appears able to process graphics data with NetTrain using the GPU, as reported in the prior post. However, when processing numerical data with NetTrain, or building an anomaly detector with AnomalyDetection, it reverts to using the CPU despite TargetDevice -> "GPU", as in the attached notebooks, for instance.

These notebooks can be used to test your own Mathematica/CUDA GPU set-up, so that you can report back any issues.
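Before opening the notebooks, it may help to confirm that Mathematica can see the card at all. A minimal sketch using the standard CUDALink introspection functions (note that CUDALink and the neural-net framework use separate GPU back ends, so this only checks the CUDALink side):

Needs["CUDALink`"]

(* True if CUDALink can initialize a supported CUDA device *)
CUDAQ[]

(* Name, compute capability, and memory of each detected device *)
CUDAInformation[]

A False from CUDAQ[] suggests a problem at the CUDALink/driver level, though NetTrain's GPU path could still behave differently.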

POSTED BY: Jonathan Kinlay
10 Replies

About problems with other GPUs: Briefly looking through the previous thread, all I can see about neural net functionality besides the 3090 problem is macOS support (NVIDIA/Apple's fault, not ours) and a complaint about a faulty 12.2 update, which we fixed a few days later with another update. I'm not going to comment on CUDALink because I'm not involved with it. I consider the GPU support on the ML side pretty solid: we've been successfully using NetTrain for our own internal projects on a variety of GPU models and machines (including AWS instances) for years. If you or any other user still have problems, please contact tech support.

About numerical vs image data: There is absolutely no difference between them from the neural net perspective. Images are immediately turned into numerical data by NetEncoder["Image"] and fed to the network as such. I ran your example on CPU vs GPU on my laptop (Dell XPS 15, GTX 1650M), and the GPU is actually showing an improvement:

t = AbsoluteTime[];
NetTrain[net, TrainingData, BatchSize -> 10000, TargetDevice -> "CPU"];
Print[AbsoluteTime[] - t];

24.876159

t = AbsoluteTime[];
NetTrain[net, TrainingData, BatchSize -> 10000, TargetDevice -> "GPU"];
Print[AbsoluteTime[] - t];

15.667683

With a larger net, the improvement is massive (don't set a large BatchSize here, or memory usage will blow up):

TrainingData = 
  RandomReal[1, {10000, 4}] -> RandomReal[1, {10000, 4}];
net = NetChain[{500, Ramp, 500, Ramp, 500, Ramp, 4}];

t = AbsoluteTime[];
NetTrain[net, TrainingData, MaxTrainingRounds -> 5, 
  TargetDevice -> "CPU"];
Print[AbsoluteTime[] - t];

7.083551

t = AbsoluteTime[];
NetTrain[net, TrainingData, MaxTrainingRounds -> 5, 
  TargetDevice -> "GPU"];
Print[AbsoluteTime[] - t];

0.654267

Do you get similar results for CPU vs GPU timings (especially with the second example)?

It's an option at the top of the page, when you create a new post. But I don't see where it shows up after posting.

This is running on Windows 10.

I no longer see a definitive list of supported GPU cards anywhere on the Wolfram web site.

This GPU should be supported because (1) it is supported by NVIDIA, and (2) it is supported by MMA, assuming we can take the results of CUDAInformation[] and InstallCUDA[] at face value (I have also noted that GPU functionality works seamlessly with this card in Matlab).

If that is not the case, i.e. if this particular GPU is not supported by Mathematica for some reason, then that is an additional issue: either or both of the CUDAInformation and InstallCUDA functions should report an issue if the specific card is not supported. Alternatively, we need a new function, CUDACompatibleQ[], to check for compatibility issues.
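A hypothetical CUDACompatibleQ could be little more than a wrapper over the existing CUDALink introspection functions. A sketch (the function name and message are my own invention, not anything WR currently provides):

Needs["CUDALink`"]

CUDACompatibleQ::nogpu = "No supported CUDA GPU was detected; \
TargetDevice -> \"GPU\" will not be able to use the card.";

(* Report the incompatibility explicitly instead of silently
   falling back to the CPU *)
CUDACompatibleQ[] :=
 Module[{ok = TrueQ[CUDAQ[]]},
  If[! ok, Message[CUDACompatibleQ::nogpu]];
  ok]

This only surfaces what CUDAQ[] already knows; a real implementation would presumably also check the card against whatever compatibility list the neural-net framework uses internally.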

But again, from previous discussions, this is by no means the only GPU card experiencing difficulties with V12. See the previous post for details.

The purpose of posting here, rather than simply going to customer support, is that it gives other MMA users the opportunity to test their own configurations and publish the results. Hopefully that way users will get more traction with WR to focus resources on the problem and deal with it.

POSTED BY: Jonathan Kinlay
POSTED BY: Ahmed Elbanna

Not off-topic at all - thanks for the heads-up!

POSTED BY: Jonathan Kinlay

What is your question?

POSTED BY: Sander Huisman

It was posted under "Share an idea", not "Ask a question".

Still, I suppose the obvious question would be: "When can we expect WR to remedy the ongoing issues with GPU functionality, which have existed since v12.0?"

And also: "When can we expect some reinforcement learning capability to be forthcoming?"

POSTED BY: Jonathan Kinlay
POSTED BY: Sander Huisman