# FluxTraining.jl


A powerful, extensible neural net training library.

FluxTraining.jl gives you an endlessly extensible training loop for deep learning. It is inspired by fastai.

It exposes a small set of extensible interfaces and uses them to implement:

• hyperparameter scheduling
• metrics
• logging
• training history
• model checkpointing

Install from the Julia REPL with `]add FluxTraining`, or with `using Pkg; Pkg.add("FluxTraining")`.

Read getting started first, then the user guide if you want to know more. See the reference for detailed function documentation.
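To give a flavor of the training loop, here is a minimal sketch built around FluxTraining's `Learner` and `fit!`. The toy model and data are placeholders, and the exact keyword arguments (`optimizer`, `callbacks`) and the `fit!` signature are assumptions that may differ between FluxTraining versions; consult the getting started guide for the authoritative API.

```julia
using Flux, FluxTraining

# A toy classifier and synthetic data (placeholders; substitute your own).
model = Chain(Dense(4 => 8, relu), Dense(8 => 2))
data = [(rand(Float32, 4, 16), Flux.onehotbatch(rand(1:2, 16), 1:2)) for _ in 1:10]

# A Learner bundles the model, loss function, optimizer, and callbacks
# (metrics, logging, checkpointing, scheduling all plug in as callbacks).
learner = Learner(
    model,
    Flux.logitcrossentropy;
    optimizer = Flux.Adam(),
    callbacks = [Metrics(accuracy)],
)

# Train for 5 epochs on (training, validation) data iterators.
fit!(learner, 5, (data, data))
```

Because every feature in the list above is implemented as a callback against the same small interfaces, adding your own behavior means writing a callback rather than modifying the loop itself.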

FluxTraining.jl is part of an ongoing effort to improve Julia's deep learning infrastructure and will be the training library for the work-in-progress FastAI.jl. Drop by the Julia Zulip and say hello in the #ml-ecosystem-coordination stream.