Once your revised network works, please ensure that the output
layer can have more than one output, and please use a softmax
activation on it. Below is sample code you may use.
private double[] softmax(double[] inputs) {
    double[] output = new double[inputs.length];
    double max = inputs[0];
    // Find the max for numerical stability (prevents overflow in Math.exp)
    for (int i = 1; i < inputs.length; i++) {
        if (inputs[i] > max) max = inputs[i];
    }
    // Compute the exponentials and their sum
    double sum = 0;
    for (int i = 0; i < inputs.length; i++) {
        output[i] = Math.exp(inputs[i] - max);
        sum += output[i];
    }
    // Normalize so the outputs sum to 1
    for (int i = 0; i < inputs.length; i++) {
        output[i] /= sum;
    }
    return output;
}
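If it helps, below is a minimal, self-contained sketch of how the softmax might sit on top of the output layer's weighted sums. The class name, weight layout, and example values are illustrative only, not taken from your code, so adapt them to your own structures; the softmax routine is repeated here just so the sketch compiles on its own.

public class SoftmaxOutputDemo {

    // Same routine as above, repeated so this standalone sketch compiles
    private static double[] softmax(double[] inputs) {
        double[] output = new double[inputs.length];
        double max = inputs[0];
        for (int i = 1; i < inputs.length; i++) {
            if (inputs[i] > max) max = inputs[i];
        }
        double sum = 0;
        for (int i = 0; i < inputs.length; i++) {
            output[i] = Math.exp(inputs[i] - max);
            sum += output[i];
        }
        for (int i = 0; i < inputs.length; i++) {
            output[i] /= sum;
        }
        return output;
    }

    public static void main(String[] args) {
        // Hypothetical hidden-layer activations and output-layer weights/biases
        double[] hidden = {0.5, 0.8, 0.1};
        double[][] weights = {           // weights[j][i]: output neuron j, hidden unit i
            {0.2, -0.4, 0.7},
            {0.9,  0.1, -0.3},
            {-0.5, 0.6, 0.2}
        };
        double[] biases = {0.1, -0.2, 0.05};

        // Net input (weighted sum plus bias) for each output neuron
        double[] netInputs = new double[weights.length];
        for (int j = 0; j < weights.length; j++) {
            double sum = biases[j];
            for (int i = 0; i < hidden.length; i++) {
                sum += weights[j][i] * hidden[i];
            }
            netInputs[j] = sum;
        }

        // Softmax turns the net inputs into class probabilities that sum to 1
        double[] probs = softmax(netInputs);
        System.out.println(java.util.Arrays.toString(probs));
    }
}

Because the returned probabilities sum to 1, the predicted class is simply the index of the largest entry in the array.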