Name | Description | Type | Package | Framework |
Backpropagation | This class implements a backpropagation training algorithm for feedforward neural networks. | Class | org.encog.neural.networks.training.propagation.back | HeatonResearch |
GradientWorker | Worker class for the multithreaded training of flat networks. | Class | org.encog.neural.networks.training.propagation | HeatonResearch |
ManhattanPropagation | Addresses a problem with backpropagation: the magnitude of the partial derivative may be calculated too large or too small. The Manhattan update rule uses only the sign of the gradient, applying a fixed-size update to each weight. | Class | org.encog.neural.networks.training.propagation.manhattan | HeatonResearch |
PersistTrainingContinuation | Persists the training continuation data. | Class | org.encog.neural.networks.training.propagation | HeatonResearch |
Propagation | Implements basic functionality shared by the propagation training methods. | Class | org.encog.neural.networks.training.propagation | HeatonResearch |
QuickPropagation | QPROP is an efficient training method that is based on Newton's method. | Class | org.encog.neural.networks.training.propagation.quick | HeatonResearch |
ResilientPropagation | Addresses a problem with the backpropagation algorithm: the magnitude of the partial derivative is usually too large or too small. RPROP uses only the sign of the gradient, with a per-weight update value that adapts during training. | Class | org.encog.neural.networks.training.propagation.resilient | HeatonResearch |
RPROPConst | Constants used for Resilient Propagation (RPROP) training. | Class | org.encog.neural.networks.training.propagation.resilient | HeatonResearch |
RPROPType | Allows the type of RPROP to be defined. | Class | org.encog.neural.networks.training.propagation.resilient | HeatonResearch |
ScaledConjugateGradient | A training class that uses the scaled conjugate gradient (SCG) method. | Class | org.encog.neural.networks.training.propagation.scg | HeatonResearch |
TrainingContinuation | Allows training to be continued. | Class | org.encog.neural.networks.training.propagation | HeatonResearch |
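
The classes above all refine the same core idea: compute the partial derivative of the error with respect to each weight, then apply some update rule (a learning rate for Backpropagation, a fixed step for ManhattanPropagation, an adaptive per-weight step for ResilientPropagation). The following framework-free sketch illustrates that gradient update for a single sigmoid output unit trained on the AND function; it is an illustration of the technique only, not Encog source code, and the class and method names are hypothetical. A full feedforward implementation propagates the same deltas back through every hidden layer.

```java
import java.util.Random;

// Illustrative sketch (not Encog code) of the gradient-descent weight
// update that backpropagation-style trainers apply. A single sigmoid
// unit is trained on the AND truth table.
public class BackpropSketch {

    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Trains weights w1, w2 and bias b on AND; returns final sum-squared error.
    static double train(int epochs, double learningRate) {
        double[][] input = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        double[] ideal = {0, 0, 0, 1};
        Random rnd = new Random(1); // fixed seed for reproducibility
        double w1 = rnd.nextDouble() - 0.5;
        double w2 = rnd.nextDouble() - 0.5;
        double b = 0;
        double sse = 0;
        for (int epoch = 0; epoch < epochs; epoch++) {
            sse = 0;
            for (int s = 0; s < 4; s++) {
                double out = sigmoid(w1 * input[s][0] + w2 * input[s][1] + b);
                double e = ideal[s] - out;
                sse += e * e;
                // delta = error * sigmoid derivative. This is the partial
                // derivative whose magnitude Manhattan and RPROP replace
                // with fixed or adaptive step sizes, keeping only its sign.
                double delta = e * out * (1 - out);
                w1 += learningRate * delta * input[s][0];
                w2 += learningRate * delta * input[s][1];
                b  += learningRate * delta;
            }
        }
        return sse;
    }

    public static void main(String[] args) {
        System.out.println("final SSE = " + train(20000, 0.5));
    }
}
```

Because AND is linearly separable, plain gradient descent converges here; the Manhattan and RPROP variants listed above exist precisely because the raw magnitude of `delta` can be a poor step size on harder problems.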