Package mklab.JGNN.nn
Interface Optimizer
- All Known Implementing Classes:
Adam, BatchOptimizer, GradientDescent, Regularization
public interface Optimizer
Provides an interface for training tensors. Has a reset() method that starts potential training memory from scratch, and an update(Tensor, Tensor) method that, given a current tensor and its gradient, operates on the former and adjusts its value in place.
- Author:
- Emmanouil Krasanakis
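For illustration, a minimal sketch of a custom implementation is shown below. It performs plain fixed-rate gradient descent (the library's own GradientDescent class is the canonical version of this behavior), and the Tensor operations multiply(double) and selfAdd(Tensor) are assumed to exist on mklab.JGNN.core.Tensor; consult that class for the exact method names.

    import mklab.JGNN.core.Tensor;
    import mklab.JGNN.nn.Optimizer;

    // Illustrative sketch only: a stateless fixed-rate gradient descent optimizer.
    public class SimpleGradientDescent implements Optimizer {
        private final double learningRate;

        public SimpleGradientDescent(double learningRate) {
            this.learningRate = learningRate;
        }

        @Override
        public void update(Tensor value, Tensor gradient) {
            // In-place update: value -= learningRate * gradient.
            // multiply(double) and selfAdd(Tensor) are assumed Tensor operations.
            value.selfAdd(gradient.multiply(-learningRate));
        }

        // reset() keeps its default no-op behavior, since this optimizer
        // holds no per-tensor training memory.
    }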
Method Summary
void update(Tensor value, Tensor gradient)
default void reset()
Method Details
update
void update(Tensor value, Tensor gradient)
In-place updates the value of a tensor given its gradient. Some optimizers (e.g. Adam) require that the exact same tensor instance be provided across calls so as to keep track of its optimization progress. The library takes care to satisfy this constraint.
- Parameters:
- value - The tensor to update.
- gradient - The tensor's gradient.
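A usage sketch follows. It assumes that concrete optimizers live in mklab.JGNN.nn.optimizers, that GradientDescent takes a learning rate in its constructor, and that DenseTensor and the setToOnes() setter exist as shown; these are assumptions for illustration, not guaranteed signatures.

    import mklab.JGNN.core.Tensor;
    import mklab.JGNN.core.tensor.DenseTensor;
    import mklab.JGNN.nn.Optimizer;
    import mklab.JGNN.nn.optimizers.GradientDescent;

    public class UpdateExample {
        public static void main(String[] args) {
            Optimizer optimizer = new GradientDescent(0.1); // assumed learning-rate constructor
            Tensor value = new DenseTensor(3);              // the parameter tensor to train
            Tensor gradient = new DenseTensor(3);
            gradient.setToOnes();                           // placeholder gradient (assumed setter)
            // Always pass the same `value` instance so that stateful
            // optimizers (e.g. Adam) can track its optimization progress.
            optimizer.update(value, gradient);
            System.out.println(value);
        }
    }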
reset
default void reset()
Resets (and lets the garbage collector free) optimizer memory. Should be called at the beginning of training (not after each epoch).
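A sketch of the intended call order is given below: reset() runs once before the training loop, and is deliberately not called between epochs. Adam's default constructor, the DenseTensor class, and the per-epoch placeholder gradient are assumptions for illustration.

    import mklab.JGNN.core.Tensor;
    import mklab.JGNN.core.tensor.DenseTensor;
    import mklab.JGNN.nn.Optimizer;
    import mklab.JGNN.nn.optimizers.Adam;

    public class ResetExample {
        public static void main(String[] args) {
            Optimizer optimizer = new Adam();        // assumed default constructor
            Tensor parameter = new DenseTensor(4);
            optimizer.reset();                       // start training memory from scratch, once
            for (int epoch = 0; epoch < 10; epoch++) {
                Tensor gradient = new DenseTensor(4); // placeholder; real gradients come from a loss
                gradient.setToOnes();
                optimizer.update(parameter, gradient);
                // No reset() here: Adam must retain its per-tensor state
                // across epochs for the same parameter instance.
            }
        }
    }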