API Reference
Main optimization loops
Gradient algorithms
First-order gradient descent algorithms (a usage sketch follows the list):

- descent.algorithms.sgd: alias of GradientOptimizer
- descent.algorithms.nag: alias of GradientOptimizer
- descent.algorithms.rmsprop: alias of GradientOptimizer
- descent.algorithms.sag: alias of GradientOptimizer
- descent.algorithms.smorms: alias of GradientOptimizer
- descent.algorithms.adam: alias of GradientOptimizer
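Each name above constructs the same GradientOptimizer with a different update rule. This page does not document the constructor, so what follows is a minimal sketch under stated assumptions: that an alias is called with the f_df-style objective/gradient function used elsewhere in this reference, an initial parameter array, and algorithm hyperparameters, and that the resulting optimizer exposes a run method. None of these signatures are confirmed here.

```python
import numpy as np
from descent.algorithms import sgd

def f_df(x):
    # f_df convention (see check_grad below): return (objective, gradient).
    return 0.5 * np.sum(x ** 2), x

x0 = np.random.randn(10)

# Hypothetical calls: the constructor arguments and run() are assumptions,
# since this page only records that sgd aliases GradientOptimizer.
opt = sgd(f_df, x0, learning_rate=1e-2)
opt.run(maxiter=500)
```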
Proximal operators
Proximal operators / mappings

Objectives
Example objectives

Utilities
- descent.utils.check_grad(f_df, xref, stepsize=1e-6, tol=1e-6, width=15, style='round', out=sys.stdout)

  Compares the numerical gradient to the analytic gradient.

  Parameters:
  - f_df (function): The analytic objective and gradient function to check
  - xref (array_like): Parameter values at which to check the gradient
  - stepsize (float, optional): Step size for the numerical gradient. Too large and the gradient is estimated poorly; too small and you run into floating-point precision issues (default: 1e-6)
  - tol (float, optional): Tolerance used when coloring gradients as correct or incorrect (default: 1e-6)
  - width (int, optional): Width of the table columns (default: 15)
  - style (string, optional): Style of the printed table; see tableprint for a list of styles (default: 'round')
  - out (file, optional): Stream the table is written to (default: sys.stdout)
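A usage sketch grounded in the signature above; the quadratic objective here is illustrative, not part of the library:

```python
import numpy as np
from descent.utils import check_grad

def f_df(x):
    # Objective f(x) = 0.5 * ||x||^2 and its analytic gradient, which is x.
    return 0.5 * np.sum(x ** 2), x

# Prints a table comparing the numerical and analytic gradients at a
# random point; rows within tol are colored as correct.
check_grad(f_df, np.random.randn(5))
```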
- descent.utils.destruct(*args, **kwargs)

  Deconstructs the input into a 1-D numpy array.

- descent.utils.restruct(*args, **kwargs)

  Reshapes the input into the type of the second argument.
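A round-trip sketch for the pair above. That destruct accepts a dict of arrays is an assumption; the docstrings only state that the input is flattened to 1-D and reshaped back into the type of the second argument:

```python
import numpy as np
from descent.utils import destruct, restruct

# Hypothetical parameter container; dict-of-arrays support is assumed here,
# not stated by the docstrings above.
params = {'W': np.ones((2, 3)), 'b': np.zeros(3)}

flat = destruct(params)           # 1-D numpy array holding every value
assert flat.ndim == 1

rebuilt = restruct(flat, params)  # same structure as the second argument
```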