Message Boards

SoftmaxLayer with one-hot vectors

Posted 1 month ago

The quantum physics problem I am looking at is a rotated qubit: a qubit rotated about the y-axis and measured in the z-basis (yielding spin-up or spin-down outcomes). The task is to model the quantum state of the qubit, analytically determine the measurement outcome probabilities, generate measurement outcomes for training at particular rotation angles, and then use a separate set of test measurement data to infer the most probable rotation angle. I discretize the rotation angles over $[0, \pi]$ and generate measurement results for each discrete angle as training data; I plan to generate another set of measurement results for testing later. My training data associates elements of the form $$\{1, 1, 1, 0, 1, 1, 0, 1, 0, 1\} \to \{0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0\},$$ where an entry on the left is 1 for spin up and 0 for spin down, and the list on the right is a one-hot vector identifying one of the distinct discrete rotation angles from $[0, \pi]$.
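For concreteness, the setup above can be sketched in Wolfram Language. This is a minimal illustration, not the poster's attached code: the names (`nAngles`, `nShots`, `oneHot`) and the number of repetitions per angle are my own assumptions, and I use the standard result that a y-rotation by $\theta$ gives spin-up probability $\cos^2(\theta/2)$ in the z-basis.

```mathematica
(* Illustrative sketch (not the original attached code): synthetic training
   data for a y-rotated qubit. Spin-up probability after rotation by theta
   is Cos[theta/2]^2. nAngles, nShots, and 50 examples per angle are
   assumptions made for this example. *)
nAngles = 20;   (* number of discrete rotation angles in [0, Pi] *)
nShots  = 10;   (* measurement outcomes per training example *)
angles  = Subdivide[0., Pi, nAngles - 1];

oneHot[i_, n_] := ReplacePart[ConstantArray[0, n], i -> 1];

trainingData = Flatten@Table[
    With[{p = Cos[angles[[i]]/2]^2},
      (* each example: a list of 0/1 outcomes -> one-hot angle label *)
      RandomVariate[BernoulliDistribution[p], nShots] -> oneHot[i, nAngles]],
    {i, nAngles}, {rep, 50}];
```

Each element of `trainingData` then has exactly the shape described above: a length-10 list of 0/1 measurement outcomes mapped to a length-20 one-hot vector.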

Query: I am interested in using SoftmaxLayer[], but I get an error when including it in my NetTrain[] call. The error states: "Batch #1 will be skipped, because one or more inputs provided to port "Output" was invalid: input is not an integer between 1 and 20. This batch will be ignored in subsequent training rounds. More information can be obtained via the "SkippedTrainingData" property."

The code runs if I remove the SoftmaxLayer[] from the NetTrain[] command. Can anyone advise on how to resolve this problem? Many thanks for any assistance. The code in question is attached:

POSTED BY: Byron Alexander
2 Replies

You need to set the LossFunction option in NetTrain manually. By default, when the final layer is a SoftmaxLayer, CrossEntropyLossLayer["Index"] is used, which expects integer class labels rather than one-hot vectors; that is why you are getting errors about the outputs not being integers between 1 and 20.

You want to set:

    NetTrain[net, trainingData, LossFunction -> CrossEntropyLossLayer["Probabilities"]]

Explore the CrossEntropyLossLayer documentation for more details.
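To make the fix above concrete, here is a minimal end-to-end sketch. The layer sizes and the net architecture are illustrative assumptions (the original attached code is not shown), and `trainingData` is assumed to be a list of rules mapping length-10 lists of 0/1 outcomes to length-20 one-hot vectors, as described in the question.

```mathematica
(* Hedged sketch: a small classifier ending in SoftmaxLayer, trained with
   the "Probabilities" cross-entropy loss so that one-hot target vectors
   are accepted. The hidden-layer size (64) is an arbitrary assumption. *)
net = NetChain[
   {LinearLayer[64], Ramp, LinearLayer[20], SoftmaxLayer[]},
   "Input" -> 10];

trained = NetTrain[net, trainingData,
   LossFunction -> CrossEntropyLossLayer["Probabilities"]];
```

Alternatively, you could keep the default "Index" loss by replacing each one-hot target with the (1-based) position of its 1, e.g. via `First@FirstPosition[vec, 1]`, so that the targets are integers between 1 and 20 as the error message demands.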

POSTED BY: Joshua Schrier

Thanks for your response. Yes, I discovered this recently. If you have a chance, please have a look at my more recent post. Thanks for your time; keep well.

POSTED BY: Byron Alexander
