Gradient-Based Single-Objective Optimizers

Gradient-based optimizers use the gradient of an objective function to find a local minimum in the search space. If the function is not convex, this local optimum might not be the global one.

Classes

class  shark::AbstractLineSearchOptimizer< SearchPointType >
 Base class for line search methods. More...
 
class  shark::Adam< SearchPointType >
 Adaptive Moment Estimation algorithm (Adam). More...
 
class  shark::BFGS< SearchPointType >
 Broyden, Fletcher, Goldfarb, Shanno (BFGS) algorithm for unconstrained optimization. More...
 
class  shark::CG< SearchPointType >
 Conjugate-gradient method for unconstrained optimization. More...
 
class  shark::LBFGS< SearchPointType >
 Limited-Memory Broyden, Fletcher, Goldfarb, Shanno (L-BFGS) algorithm. More...
 
class  shark::LineSearch< SearchPointType >
 Wrapper for the line search class of functions in the linear algebra library. More...
 
class  shark::Rprop< SearchPointType >
 This class implements the resilient backpropagation (Rprop) algorithm, with or without weight-backtracking. More...
 
class  shark::SteepestDescent< SearchPointType >
 Standard steepest descent. More...
 
class  shark::TrustRegionNewton
 Simple Trust-Region method based on the full Hessian matrix. More...
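 
The optimizers listed above follow Shark's single-objective optimizer interface: init() chooses a starting point for a given objective function, step() performs one iteration, and solution() returns the best point and value found so far. The following is a minimal sketch of that loop, assuming the Rosenbrock benchmark shipped with Shark; its namespace (shark::benchmarks in recent releases) and the explicit RealVector search-point type are assumptions based on typical usage, not taken from this page.

    #include <shark/Algorithms/GradientDescent/BFGS.h>
    #include <shark/ObjectiveFunctions/Benchmarks/Rosenbrock.h>
    #include <iostream>

    int main() {
        // Two-dimensional Rosenbrock benchmark: non-convex, but with a
        // single global minimum, so a gradient method works well here.
        shark::benchmarks::Rosenbrock rosenbrock(2);  // namespace is an assumption

        shark::BFGS<shark::RealVector> bfgs;
        rosenbrock.init();      // prepare the function (e.g. its starting-point proposal)
        bfgs.init(rosenbrock);  // pick a starting point and reset the optimizer state

        // Fixed iteration budget; a real application would instead monitor
        // the gradient norm or the change in objective value.
        for (std::size_t i = 0; i != 100; ++i)
            bfgs.step(rosenbrock);

        std::cout << "best value found: " << bfgs.solution().value << std::endl;
    }

Any of the other optimizers on this page (Rprop, LBFGS, CG, SteepestDescent, ...) should be usable in the same loop, since they share the init/step/solution interface.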