Module rustml::opt

Module for optimization with gradient descent.

Example: Gradient descent

The following example minimizes the function f(x) = (x-2)² with gradient descent.

use rustml::opt::*;
use num::pow;

let opts = empty_opts()
    .iter(10)     // set the number of iterations to 10
    .alpha(0.1)   // set the learning rate
    .eps(0.001);  // stopping criterion

let r = opt(
    &|p| pow(p[0] - 2.0, 2),       // objective to be minimized: (x-2)^2
    &|p| vec![2.0 * (p[0] - 2.0)], // derivative
    &[4.0],                        // initial parameters
    opts                           // optimization options
);

for (iter, i) in r.fvals.iter().enumerate() {
    println!("error after iteration {} was {}", iter + 1, i.1);
}
println!("solution: {:?}", r.params);
assert!((r.params[0] - 2.0).abs() <= 0.3);

See here for another example.
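In each iteration, gradient descent updates the parameters with p ← p - α·∇f(p) until the maximum number of iterations is reached or the stopping criterion is met. The following sketch only illustrates this update loop; it is not rustml's implementation of opt, and the function name gradient_descent as well as the gradient-norm stopping test are assumptions made for this illustration.

// Conceptual sketch of gradient descent; not rustml's implementation of opt.
// df computes the gradient of the objective, alpha is the learning rate,
// eps is the stopping criterion and iter the maximum number of iterations.
fn gradient_descent<F>(df: F, mut p: Vec<f64>, alpha: f64, eps: f64, iter: usize) -> Vec<f64>
    where F: Fn(&[f64]) -> Vec<f64> {

    for _ in 0..iter {
        let g = df(&p);
        // update each parameter: p_j <- p_j - alpha * df/dp_j
        for (pj, gj) in p.iter_mut().zip(g.iter()) {
            *pj -= alpha * gj;
        }
        // stop early once the gradient is small enough (assumed criterion)
        if g.iter().map(|x| x * x).sum::<f64>().sqrt() < eps {
            break;
        }
    }
    p
}

// minimizes (x-2)^2 starting at x = 4, as in the example above
let p = gradient_descent(|p| vec![2.0 * (p[0] - 2.0)], vec![4.0], 0.1, 0.001, 10);
println!("{:?}", p); // roughly [2.21] after 10 iterations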

Structs

OptParams

Container that holds the parameters for an optimization algorithm.

OptResult

The result of an optimization.
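
Judging from the example above, an OptResult exposes the final parameters via params and the function values recorded during the iterations via fvals. The following rough sketch of such a container is only an assumption for illustration; the field types may differ from the actual definition in rustml.

// Rough sketch of a result container; field names follow the example above
// (r.params, r.fvals), the element types are assumptions.
pub struct OptResult {
    /// parameters found by the optimizer
    pub params: Vec<f64>,
    /// one entry per iteration: (parameters, value of the objective)
    pub fvals: Vec<(Vec<f64>, f64)>,
}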

Functions

empty_opts

Returns an empty set of options for optimization algorithms.

opt

Minimizes an objective using gradient descent.

opt_hypothesis

Optimizes the parameters of a hypothesis using gradient descent.

plot_learning_curve

Plots the learning curve from an optimization result.