Neural Networks calling NetTrain with All

Posted 6 years ago

I initialized a simple neural network with a single DotPlusLayer:

resource = ResourceObject["MNIST"];
trainingData = ResourceData[resource, "TrainingData"];
testData = ResourceData[resource, "TestData"];

(* Neural Network creation *)
lenet = NetChain[
   {FlattenLayer[], 10, SoftmaxLayer[]},
   "Output" -> NetDecoder[{"Class", Range[0, 9]}],
   "Input" -> NetEncoder[{"Image", {28, 28}, "Grayscale"}]
   ];

(* Neural Network initialization *)

lenet = NetInitialize[lenet]

then trained it (successfully) with:

result = NetTrain[lenet, trainingData, ValidationSet -> testData]
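As a side note, the trained net returned here can be spot-checked directly. A minimal sketch, assuming `testData` is a list of `image -> label` rules, which is the form `ResourceData` returns for MNIST:

```mathematica
(* sketch: compare predictions against true labels on a few test examples *)
sample = testData[[1 ;; 5]];
predicted = result /@ Keys[sample];  (* apply the trained net to each image *)
actual = Values[sample];
Transpose[{predicted, actual}]       (* pairs of {predicted, actual} digits *)
```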

When I tried to obtain a NetTrainResultsObject[…] by adding the All specifier, training stopped working, even after removing all additional options (ValidationSet):

result = NetTrain[lenet, trainingData, All]

This returned an error about the loss port, which did not occur before adding All. That seems odd to me, since All should only change the return value, if I am not mistaken.

Entire notebook in attachments.

POSTED BY: Simon Blank
Posted 2 years ago

The definition of All in NetTrain has changed since then (in older versions, the third argument was interpreted as a loss specification, which is why you saw a loss-port error). Nowadays, NetTrain[net, data, All] returns a NetTrainResultsObject that exposes all the properties NetTrain can produce, at no extra computational cost. It gives you more insight into the training process, such as the evolution of weights and biases during training, and it also provides a number of ready-made visualizations of the training run.
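For example, a short sketch of querying the results object (property names taken from the current NetTrainResultsObject documentation):

```mathematica
(* train and return the full results object instead of just the net *)
trained = NetTrain[lenet, trainingData, All, ValidationSet -> testData];

trained["TrainedNet"]          (* the trained network itself *)
trained["LossEvolutionPlot"]   (* loss curve over the training run *)
trained["Properties"]          (* list of every property the object supports *)
```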

POSTED BY: Test Account