Add a SoftmaxLayer to trainable network with vector input 14x14 pixel pics?

Posted 3 months ago
Dear all,

I have a working trainable network. The input is a vector of 14 x 14 pixel images; the output is a vector with three elements. Now I would like the output vector to be normalized, so I added a SoftmaxLayer[]. However, NetTrain now complains: "expected a vector of 1 to 3 indices."

Here's the working network:

...more layers...
layer10 = LinearLayer[{3}];
layer11 = ElementwiseLayer[Tanh];
Net = NetChain[{layer1, layer2, layer3, layer4, layer5, layer6, layer7, layer8, layer9, layer10, layer11}];

Here's the code that leads to the error from NetTrain:

...more layers...
    layer10 = LinearLayer[{3}];
    layer11 = ElementwiseLayer[Tanh];
    layer12 = FlattenLayer[];
    layer13 = SoftmaxLayer[];
    Net = NetChain[{layer1, layer2, layer3, layer4, layer5, layer6,
      layer7, layer8, layer9, layer10, layer11, layer12, layer13}];
    TrainedNet = NetTrain[Net, <|"Input" -> Images, "Output" -> Labels|>,
      MaxTrainingRounds -> 256, Method -> "ADAM", BatchSize -> 128,
      ValidationSet -> Scaled[0.2]]

The error is:

    NetTrain::notiintvec: Expected a vector of indices between 1 and 3.

It seems that a FlattenLayer[] is required, because the ElementwiseLayer with Tanh outputs a tensor rather than a flat vector. According to the documentation examples, though, it should work without the flattening. Can anyone point me to where I have gone wrong?
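For reference, here is a minimal, self-contained sketch of just the softmax part of the setup, with hypothetical toy data in place of my real Images and Labels. Note the labels: when SoftmaxLayer[] is the final layer, NetTrain's default cross-entropy loss expects integer class indices (1 to 3) rather than 3-element vectors, which may be what the "vector of indices" message is about:

    (* Minimal sketch: a tiny chain ending in SoftmaxLayer *)
    net = NetChain[{
        FlattenLayer[],      (* 14x14 image -> length-196 vector *)
        LinearLayer[3],
        SoftmaxLayer[]
      },
      "Input" -> {14, 14}];

    (* Toy data (hypothetical), standing in for Images and Labels *)
    images = RandomReal[1, {32, 14, 14}];
    labels = RandomInteger[{1, 3}, 32];  (* class indices, not one-hot vectors *)

    trained = NetTrain[net, images -> labels, MaxTrainingRounds -> 5]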
