How would I supply this kind of training data? I have n output classes, but instead of associating a definite class with each training input, I would like to associate an n-dimensional probability vector over the classes with each input, or alternatively repeat each input several times so that sampling the probability distribution gives definite class answers. The first option would be preferable.
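For illustration, here is a minimal sketch of what the two data formats could look like; the feature vectors, class counts, and probabilities below are hypothetical placeholders, not data from this question:

(* Hypothetical 3-feature, 2-class problem *)
(* Option 1: each input is paired with a probability vector over the classes *)
probData = {
  {0.1, 0.2, 0.3} -> {0.8, 0.2},
  {0.5, 0.1, 0.9} -> {0.3, 0.7}
};

(* Option 2: each input is repeated, paired with definite class labels
   sampled from its probability distribution *)
sampledData = {
  {0.1, 0.2, 0.3} -> 1, {0.1, 0.2, 0.3} -> 1, {0.1, 0.2, 0.3} -> 2,
  {0.5, 0.1, 0.9} -> 2, {0.5, 0.1, 0.9} -> 2, {0.5, 0.1, 0.9} -> 1
};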
Let's just look at the first option.
The second option seems to work: NetTrain doesn't complain about the same input appearing with different outputs in the training set. But what about the first option?
I thought this might work, but there is something wrong with the syntax. Help?
Try moving the LossFunction specification into the NetTrain call:
famNet = NetChain[{linfam, SoftmaxLayer[]}];
famTrained = NetTrain[famNet, trainingdatafam,
  LossFunction -> CrossEntropyLossLayer["Probabilities"],
  ValidationSet -> None];
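For completeness, a hedged, self-contained sketch of the same approach: with the "Probabilities" loss specification, each target in trainingdatafam is itself a probability vector over the classes. Here linfam is stood in for by an explicit LinearLayer, and the sizes and data are assumptions for illustration, not the poster's actual setup:

(* Hypothetical stand-in: 2 input features, 3 output classes *)
linfam = LinearLayer[3];
famNet = NetChain[{linfam, SoftmaxLayer[]}, "Input" -> 2];

(* Each training target is a probability vector over the 3 classes *)
trainingdatafam = {
  {1.0, 0.0} -> {0.7, 0.2, 0.1},
  {0.0, 1.0} -> {0.1, 0.8, 0.1}
};

famTrained = NetTrain[famNet, trainingdatafam,
  LossFunction -> CrossEntropyLossLayer["Probabilities"],
  ValidationSet -> None];

famTrained[{1.0, 0.0}]  (* returns the predicted length-3 probability vector *)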
Yes, it works, thanks. And sorry I hadn't posted the working solution earlier; it was Giulio or someone else who alerted me to it.