Per your interest in predicting probabilities: you can compute the classification probabilities by using "net3" and "trainedNet3", which use the SoftmaxLayer and encode the output as a string (e.g., "1", not 1.0).
{finalNet3[{9987, 13}, "Probabilities"],
 finalNet3[{5062, 4938}, "Probabilities"],
 finalNet3[{0, 10000}, "Probabilities"]}
These three samples, the 1st, 25th, and 50th in the data, are all classified correctly: each one is assigned the highest probability for its true class (a programmatic check of the top class is sketched after this list).
- {9987, 13} -> "1": {9987, 13} gets the highest probability for class "1"
- {5062, 4938} -> "25": {5062, 4938} gets the highest probability for class "25"
- {0, 10000} -> "50": {0, 10000} gets the highest probability for class "50"
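If you want to read off the top class programmatically rather than scanning the full association, here is a minimal sketch, assuming finalNet3 uses a "Class" decoder so the standard "TopProbabilities" property is available:

(* the single most probable class and its probability for each sample *)
finalNet3[#, "TopProbabilities" -> 1] & /@ {{9987, 13}, {5062, 4938}, {0, 10000}}

(* equivalently, pick the label with the largest probability from the association *)
First @ Keys @ TakeLargest[finalNet3[{9987, 13}, "Probabilities"], 1]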
I am not sure what you mean by "the output in training has inverted commas " ", that is {{9988, 12} -> "1"} instead of {{9988, 12} -> 1}"; I think you mean that the output is a string, not a number.
You can compute the probabilities for each class by using the SoftmaxLayer, but this setup requires the output to be a class label (a string), not a number, which is why I called it a classification.
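For reference, here is a minimal sketch of how such a classification net could be set up; the layer sizes, label set, and training pairs below are placeholders for illustration, not the actual net3:

(* sketch: a two-input classifier whose outputs are the string labels "1".."50" *)
labels = ToString /@ Range[50];
net3sketch = NetChain[
   {LinearLayer[64], Ramp, LinearLayer[50], SoftmaxLayer[]},
   "Input" -> 2,
   "Output" -> NetDecoder[{"Class", labels}]];

(* training pairs map numeric inputs to string labels, e.g. {9987, 13} -> "1" *)
trainedSketch = NetTrain[net3sketch, {{9987, 13} -> "1", {5062, 4938} -> "25", {0, 10000} -> "50"}];

(* with a "Class" decoder, the "Probabilities" property returns label -> probability *)
trainedSketch[{9987, 13}, "Probabilities"]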
By the way, you can check and plot the sensitivity to each input by varying the 1st input while holding the 2nd input constant, and vice versa:
Table[{x1, finalNet3[{x1, 5000}]}, {x1, 4000, 6000, 100}]
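To turn such a table into a plot, here is a sketch assuming finalNet3 returns the predicted class as a numeric string (e.g. "25"), which is converted back to a number with ToExpression; the scan range 4000-6000 and the fixed value 5000 are just illustrative:

(* sensitivity to the 1st input, 2nd input held at 5000 *)
ListLinePlot[
  Table[{x1, ToExpression @ finalNet3[{x1, 5000}]}, {x1, 4000, 6000, 100}],
  AxesLabel -> {"x1", "predicted class"}]

(* sensitivity to the 2nd input, 1st input held at 5000 *)
ListLinePlot[
  Table[{x2, ToExpression @ finalNet3[{5000, x2}]}, {x2, 4000, 6000, 100}],
  AxesLabel -> {"x2", "predicted class"}]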
Hope this helps.