public static class Neurons.Softmax extends Neurons.Output
Nested classes/interfaces inherited from class Neurons:
Neurons.ExpRectifier, Neurons.ExpRectifierDropout, Neurons.Input, Neurons.Linear, Neurons.Maxout, Neurons.MaxoutDropout, Neurons.Output, Neurons.Rectifier, Neurons.RectifierDropout, Neurons.Softmax, Neurons.Tanh, Neurons.TanhDropout

| Constructor and Description |
|---|
| Softmax(int units) |
| Modifier and Type | Method and Description |
|---|---|
| protected void | fprop(long seed, boolean training, int n). Forward propagation. |
| protected void | setOutputLayerGradient(double target, int mb, int n). Part of backpropagation for classification. Updates every weight as w += -rate * dE/dw, computing dE/dw via the chain rule: dE/dw = dE/dy * dy/dnet * dnet/dw, where net = sum(xi*wi) + b and y is the activation function. |
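The chain-rule update above can be made concrete with a short sketch. For a softmax output layer trained with cross-entropy loss, the product dE/dy * dy/dnet collapses to the well-known form dE/dnet_i = y_i - t_i (with t the one-hot target), and since net_i = sum_j(x_j * w_ij) + b_i, we get dE/dw_ij = (y_i - t_i) * x_j. The class and field names below (`OutputGradientSketch`, `updateWeights`, `rate`) are illustrative assumptions, not H2O's actual internals:

```java
// Hedged sketch of w += -rate * dE/dw for a softmax output layer with
// cross-entropy loss. Names are illustrative, not H2O's internals.
public class OutputGradientSketch {
    // w[i][j]: weight from input j to output unit i; b[i]: bias of unit i;
    // x: layer input; y: softmax output; target: actual class label (integer).
    static void updateWeights(double[][] w, double[] b, double[] x,
                              double[] y, int target, double rate) {
        for (int i = 0; i < w.length; i++) {
            // dE/dnet_i = y_i - t_i for softmax + cross-entropy
            double dEdnet = y[i] - (i == target ? 1.0 : 0.0);
            for (int j = 0; j < x.length; j++)
                w[i][j] += -rate * dEdnet * x[j];  // dnet_i/dw_ij = x_j
            b[i] += -rate * dEdnet;                // dnet_i/db_i = 1
        }
    }
}
```

Note that units predicting the correct class too weakly (y_i < t_i) get their incoming weights pushed up, and all other units get pushed down, exactly as the sign of y_i - t_i dictates.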
Methods inherited from class Neurons:
bprop, autoEncoderGradient, bpropOutputLayer, init, momentum, momentum, rate, toString

Method Detail:

protected void fprop(long seed, boolean training, int n)
Forward propagation.
Overrides: fprop in class Neurons

protected void setOutputLayerGradient(double target, int mb, int n)
Part of backpropagation for classification.
Overrides: setOutputLayerGradient in class Neurons
Parameters: target - actual class label (integer)
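The forward pass documented above (net = sum(xi*wi) + b followed by the softmax activation) can be sketched as follows. The class name `SoftmaxFpropSketch` and its signature are illustrative assumptions; H2O's actual `fprop(long seed, boolean training, int n)` operates on the layer's internal state rather than taking arrays directly:

```java
// Hedged sketch of forward propagation for a softmax output layer:
// net_i = sum_j(x_j * w_ij) + b_i, then y = softmax(net). Illustrative only.
public class SoftmaxFpropSketch {
    static double[] fprop(double[][] w, double[] b, double[] x) {
        int units = w.length;
        double[] net = new double[units];
        for (int i = 0; i < units; i++) {
            net[i] = b[i];
            for (int j = 0; j < x.length; j++)
                net[i] += w[i][j] * x[j];
        }
        // Numerically stable softmax: shift by the max before exponentiating
        double max = Double.NEGATIVE_INFINITY;
        for (double v : net) max = Math.max(max, v);
        double sum = 0.0;
        double[] y = new double[units];
        for (int i = 0; i < units; i++) {
            y[i] = Math.exp(net[i] - max);
            sum += y[i];
        }
        for (int i = 0; i < units; i++) y[i] /= sum;
        return y;
    }
}
```

The max-shift before exponentiation does not change the result (it cancels in the ratio) but prevents overflow for large net values, a standard precaution in softmax implementations.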