Training loop

The training loop, of course, is the core component in FluxTraining.jl.

Once you have a Learner, the simplest way to start training is to run fit!(learner, n). This will train for n epochs, each consisting of one training phase and one validation phase.

Like the callback system, the training loop has an extensible interface based on multiple dispatch.

The key abstraction is the Phase. In fact, fit!(learner, n) above is just shorthand for running fit!(learner, [TrainingPhase(), ValidationPhase()]) n times.
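
For example, the following two snippets behave the same, assuming learner is a Learner that has already been constructed (the epoch count 2 is arbitrary):

```julia
using FluxTraining

# shorthand: 2 epochs, each running a training and a validation phase
fit!(learner, 2)

# equivalent explicit form: run both phases once per epoch
for epoch in 1:2
    fit!(learner, [TrainingPhase(), ValidationPhase()])
end
```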

Each phase implements its own training logic, which is run when calling fit!(learner, phase).
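
The actual implementation lives inside FluxTraining.jl; the sketch below is only a conceptual approximation in plain Flux, where the fire argument stands in for the event mechanism and the event names are illustrative rather than the library's real ones:

```julia
using Flux

# Conceptual sketch of a training phase's per-epoch logic -- not
# FluxTraining's actual code. `fire` stands in for the event dispatch
# that callbacks hook into; event names here are illustrative only.
function trainepoch!(model, dataiter, optimizer, lossfn, fire)
    fire(:EpochBegin)
    ps = Flux.params(model)
    for (xs, ys) in dataiter
        fire(:BatchBegin)
        grads = gradient(ps) do
            lossfn(model(xs), ys)      # forward pass and loss
        end
        fire(:BackwardEnd)
        Flux.Optimise.update!(optimizer, ps, grads)   # gradient step
        fire(:BatchEnd)
    end
    fire(:EpochEnd)
end
```

A validation phase runs the same loop without the gradient computation and parameter update.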

If this looks like a regular old training loop, you’re right! What makes the training loop customizable is the callback system. During training, Events are thrown that callbacks can hook into.
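
As a taste of the callback side, here is a minimal sketch. The type and method names it relies on (Callback, Events.EpochEnd, on, and the argument order event, phase, callback, learner) are assumptions about FluxTraining's callback interface; check the callback documentation for the exact signature before using it.

```julia
using FluxTraining

# Hypothetical callback that reacts whenever a training epoch ends.
# The supertype and the `on` method signature below are assumptions;
# consult the callback docs for the current interface.
struct EpochPrinter <: FluxTraining.Callback end

function FluxTraining.on(::FluxTraining.Events.EpochEnd, ::TrainingPhase,
                         ::EpochPrinter, learner)
    println("Finished a training epoch.")
end
```

A callback like this would typically be handed to the Learner so that it is notified whenever the corresponding event fires.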

For a rundown of events, see Events.

Extending

Training and validation are currently the only two Phases implemented, but the interface can be extended by implementing fitepochphase! and fitbatchphase!. An example use case is implementing GAN training.
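
As a rough sketch only: the supertype and the method signatures of fitepochphase! and fitbatchphase! below are assumptions made for illustration, not the documented interface, so check the library source before implementing against them. A custom GAN phase might start out like this:

```julia
using FluxTraining

# Hypothetical custom phase; the supertype and method signatures are
# assumed for illustration and may not match the actual interface.
struct GANTrainingPhase <: FluxTraining.AbstractTrainingPhase end

function FluxTraining.fitepochphase!(learner, phase::GANTrainingPhase)
    # iterate over the training data, firing epoch-level events and
    # calling fitbatchphase! for every batch
end

function FluxTraining.fitbatchphase!(learner, phase::GANTrainingPhase, batch)
    # alternate discriminator and generator updates on `batch` here,
    # firing batch-level events around each step
end
```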