AIToolbox
A library that offers tools for AI problem solving.
AIToolbox::Adam Class Reference

This class implements the ADAM gradient descent algorithm. More...

#include <AIToolbox/Utils/Adam.hpp>

Public Member Functions

 Adam (AIToolbox::Vector *point, const AIToolbox::Vector &gradient, double alpha=0.001, double beta1=0.9, double beta2=0.999, double epsilon=1e-8)
 Basic constructor. More...
 
void step ()
 This function updates the point using the currently set gradient. More...
 
void reset ()
 This function resets the gradient descent process. More...
 
void reset (AIToolbox::Vector *point, const AIToolbox::Vector &gradient)
 This function resets the gradient descent process. More...
 
void setAlpha (double alpha)
 This function sets the current learning rate. More...
 
void setBeta1 (double beta1)
 This function sets the current exponential decay rate for first moment estimates. More...
 
void setBeta2 (double beta2)
 This function sets the current exponential decay rate for second moment estimates. More...
 
void setEpsilon (double epsilon)
 This function sets the current additive division parameter. More...
 
double getAlpha () const
 This function returns the current learning rate. More...
 
double getBeta1 () const
 This function returns the current exponential decay rate for first moment estimates. More...
 
double getBeta2 () const
 This function returns the current exponential decay rate for second moment estimates. More...
 
double getEpsilon () const
 This function returns the current additive division parameter. More...
 

Detailed Description

This class implements the ADAM gradient descent algorithm.

This class keeps things simple and fast. It takes a pointer to the vector tracking the currently examined point, and a reference to an equally-sized vector that provides Adam with the gradient.

This class expects you to compute the gradient of the currently examined point. At each step() call, the point vector is updated following the gradient using the Adam algorithm.

We take a pointer rather than a reference for the point so that it can be updated as needed while the class instance is kept around. This only works if the new vectors have the same size as before, but it avoids reallocating the internal helper vectors.
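To make the mechanics concrete, the update that step() applies is the standard Adam rule (bias-corrected first and second moment estimates). The sketch below is illustrative only, not the library's actual source: it mimics the same pointer-based interface using plain std::vector<double> in place of AIToolbox::Vector, with the documented default hyperparameters.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative mini-version of the Adam class described above.
// The real class uses AIToolbox::Vector; names here are stand-ins.
struct MiniAdam {
    std::vector<double>* point;          // written on every step()
    const std::vector<double>* gradient; // read on every step()
    double alpha, beta1, beta2, epsilon;
    std::vector<double> m, v;            // first/second moment estimates
    unsigned t = 0;                      // timestep, used for bias correction

    MiniAdam(std::vector<double>* p, const std::vector<double>& g,
             double a = 0.001, double b1 = 0.9, double b2 = 0.999,
             double eps = 1e-8)
        : point(p), gradient(&g), alpha(a), beta1(b1), beta2(b2),
          epsilon(eps), m(p->size(), 0.0), v(p->size(), 0.0) {}

    void step() {
        ++t;
        for (std::size_t i = 0; i < point->size(); ++i) {
            const double g = (*gradient)[i];
            // Exponentially decayed moment estimates.
            m[i] = beta1 * m[i] + (1.0 - beta1) * g;
            v[i] = beta2 * v[i] + (1.0 - beta2) * g * g;
            // Bias-corrected estimates.
            const double mHat = m[i] / (1.0 - std::pow(beta1, t));
            const double vHat = v[i] / (1.0 - std::pow(beta2, t));
            // Update the point in place; epsilon avoids division by zero.
            (*point)[i] -= alpha * mHat / (std::sqrt(vHat) + epsilon);
        }
    }
};
```

Note how the optimizer never computes the gradient itself: the caller refreshes the shared gradient buffer between calls, which is exactly the contract the real class documents.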

Constructor & Destructor Documentation

◆ Adam()

AIToolbox::Adam::Adam ( AIToolbox::Vector *  point,
const AIToolbox::Vector &  gradient,
double  alpha = 0.001,
double  beta1 = 0.9,
double  beta2 = 0.999,
double  epsilon = 1e-8 
)

Basic constructor.

We expect the pointers to not be null, and the vectors to be preallocated.

The point vector should contain the point where to start the gradient descent process. The gradient vector should contain the gradient at that point.

Parameters
point  A pointer to preallocated space where to write the point.
gradient  A reference to preallocated space containing the current gradient.
alpha  Adam's step size/learning rate.
beta1  Adam's exponential decay rate for first moment estimates.
beta2  Adam's exponential decay rate for second moment estimates.
epsilon  Additive parameter to prevent division by zero.

Member Function Documentation

◆ getAlpha()

double AIToolbox::Adam::getAlpha ( ) const

This function returns the current learning rate.

◆ getBeta1()

double AIToolbox::Adam::getBeta1 ( ) const

This function returns the current exponential decay rate for first moment estimates.

◆ getBeta2()

double AIToolbox::Adam::getBeta2 ( ) const

This function returns the current exponential decay rate for second moment estimates.

◆ getEpsilon()

double AIToolbox::Adam::getEpsilon ( ) const

This function returns the current additive parameter (epsilon) used to prevent division by zero.

◆ reset() [1/2]

void AIToolbox::Adam::reset ( )

This function resets the gradient descent process.

This function clears all internal values so that the gradient descent process can be restarted from scratch.

The point vector is not modified.

◆ reset() [2/2]

void AIToolbox::Adam::reset ( AIToolbox::Vector *  point,
const AIToolbox::Vector &  gradient 
)

This function resets the gradient descent process.

This function clears all internal values so that the gradient descent process can be restarted from scratch.

The point pointer and gradient reference are updated with the new inputs.

◆ setAlpha()

void AIToolbox::Adam::setAlpha ( double  alpha)

This function sets the current learning rate.

◆ setBeta1()

void AIToolbox::Adam::setBeta1 ( double  beta1)

This function sets the current exponential decay rate for first moment estimates.

◆ setBeta2()

void AIToolbox::Adam::setBeta2 ( double  beta2)

This function sets the current exponential decay rate for second moment estimates.

◆ setEpsilon()

void AIToolbox::Adam::setEpsilon ( double  epsilon)

This function sets the current additive parameter (epsilon) used to prevent division by zero.

◆ step()

void AIToolbox::Adam::step ( )

This function updates the point using the currently set gradient.

This function overwrites the vector pointed to by the point pointer, by following the currently set gradient.

It is expected that the gradient is correct and has been updated by the user before calling this function.
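The caller-side loop therefore alternates between refreshing the gradient and calling step(). A hedged, self-contained sketch of that pattern, with a hand-rolled scalar update standing in for the library (the names adamStep, AdamState, and minimizeExample are illustrative, not part of the API):

```cpp
#include <cmath>

// State equivalent to Adam's internal moment estimates, one dimension.
struct AdamState { double m = 0.0, v = 0.0; unsigned t = 0; };

// Stand-in for Adam::step() on a single coordinate.
void adamStep(double& x, double g, AdamState& s,
              double alpha = 0.001, double b1 = 0.9,
              double b2 = 0.999, double eps = 1e-8) {
    ++s.t;
    s.m = b1 * s.m + (1.0 - b1) * g;
    s.v = b2 * s.v + (1.0 - b2) * g * g;
    const double mHat = s.m / (1.0 - std::pow(b1, s.t));
    const double vHat = s.v / (1.0 - std::pow(b2, s.t));
    x -= alpha * mHat / (std::sqrt(vHat) + eps);
}

// Minimize f(x) = (x - 3)^2: the gradient is recomputed from the
// current point before every step, exactly as step() expects.
double minimizeExample() {
    double x = 0.0;
    AdamState s;
    for (int i = 0; i < 2000; ++i) {
        const double grad = 2.0 * (x - 3.0); // user-computed gradient
        adamStep(x, grad, s, 0.01);
    }
    return x; // converges toward 3
}
```

Calling step() with a stale gradient does not fail loudly; it simply descends in the wrong direction, which is why the refresh-then-step ordering matters.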


The documentation for this class was generated from the following file:

AIToolbox/Utils/Adam.hpp