# User Portlet

Matteo Salvarezza
Discussions
We haven't added any features to ImageAugmentationLayer, but I don't see any reason why you shouldn't be able to achieve what you want by using the "generator" syntax of NetTrain, as I mentioned in my earlier reply. Your generator function should grab...
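The generator syntax mentioned above can be sketched as follows. This is a minimal illustration, not the poster's actual code: `trainingData` and `augment` are hypothetical names, and the only augmentation shown is a random horizontal flip.

```
(* Sketch: 'trainingData' is assumed to be a list of
   <|"Input" -> image, "Output" -> label|> associations. *)
augment[img_] := If[RandomReal[] > 0.5, ImageReflect[img, Left], img];

(* NetTrain calls the generator with an association containing
   keys such as "BatchSize" and "Round"; it must return one batch. *)
generator[info_] := Map[
  <|"Input" -> augment[#["Input"]], "Output" -> #["Output"]|> &,
  RandomSample[trainingData, info["BatchSize"]]
];

trained = NetTrain[net, generator, BatchSize -> 32, MaxTrainingRounds -> 5]
```

Because the generator is re-invoked for every batch, each round sees freshly augmented copies of the data rather than a fixed augmented set.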
About problems with other GPUs: briefly going through the previous thread, all I can see about neural net functionality besides the 3090 problem is macOS support (NVIDIA/Apple's fault, not ours) and a complaint about a faulty 12.2 update, which we...
Nice work! Just a small comment on this part: > Second limitation: During training Mathematica is not keeping track > of the statistics of the intermediate values (input and output of > layers). > > [...] > > So, to get those statistics I...
Great! As a developer in the Wolfram ML team, it's always gratifying to see people doing interesting stuff with what we provide. There are a couple of comments I'd like to make about this: first, you evaluate the final performance using the...
> But you also cut out a number of convolution layers, they go up to > conv5_3. Yes, that's because deeper convolutional layers start to behave like the final linear layers: too much information is discarded. Just try chopping the net to one of...
Just a comment on Christopher's answer. There is an efficient workflow for transfer learning in general that is worth mentioning: instead of keeping the pre-trained part of the net active during training, use it to extract features from the...
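The feature-extraction workflow described above can be sketched roughly as follows. This is a hedged illustration, not the post's exact code: the VGG-16 model, the `"fc7"` cut point, and the names `images`, `labels`, and `classes` are all assumptions for the example.

```
(* Keep the pre-trained net up to a deep feature layer ("fc7" in VGG-16). *)
featureNet = NetTake[
  NetModel["VGG-16 Trained on ImageNet Competition Data"], "fc7"];

(* Extract features ONCE, outside the training loop, so the large
   frozen net is never re-evaluated during training. *)
features = featureNet /@ images;

(* Train only a small classifier head on the cached features. *)
classifier = NetChain[
  {LinearLayer[], SoftmaxLayer[]},
  "Output" -> NetDecoder[{"Class", classes}]];

trained = NetTrain[classifier, features -> labels]
```

The saving comes from evaluating the expensive pre-trained portion exactly once per example instead of once per example per training round.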