For scripts using fast.ai, we provide a callback that automatically logs model topology, losses, metrics, weights, gradients, sample predictions, and the best trained model.
```python
import wandb
from wandb.fastai import WandbCallback

wandb.init()

learn = cnn_learner(data, model, callback_fns=WandbCallback)
learn.fit(epochs)
```
The data to log is configurable through the callback constructor.
```python
from functools import partial

learn = cnn_learner(data, model, callback_fns=partial(WandbCallback, input_type='images'))
It is also possible to use WandbCallback only when starting training. In this case it must be instantiated, and custom parameters can be passed at that stage.
```python
learn.fit(epochs, callbacks=WandbCallback(learn, input_type='images'))
```
We've created a few examples for you to see how the integration works:
- Semantic Segmentation with Fastai: optimize neural networks on self-driving cars
- Track and compare Fastai model performance, and visualize results in a live dashboard
- Run in colab: a simple notebook example to get you started
The `WandbCallback()` class supports a number of options:
- `learn`: the fast.ai learner to hook.
- `save_model`: save the model if it improves at each step. The best model is also loaded at the end of training.
- `mode`: `'min'`, `'max'`, or `'auto'`: how to compare the training metric specified in `monitor`.
- `monitor`: training metric used to measure performance for saving the best model. `None` defaults to the validation loss.
- `log`: `"gradients"`, `"parameters"`, `"all"`, or `None`. Losses and metrics are always logged.
- `input_type`: `"images"` or `None`. Used to display sample predictions.
- `validation_data`: data used for sample predictions if `input_type` is set.
- `predictions`: number of predictions to make if `input_type` is set and `validation_data` is `None`.
- `seed`: initializes the random generator for sample predictions if `input_type` is set and `validation_data` is `None`.
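To see how several of these options fit together, the sketch below binds them with `functools.partial`, the same pattern used with `callback_fns` above. It substitutes a hypothetical `show_config` function for `WandbCallback` so it runs without fast.ai, a dataset, or a W&B account; the option names themselves are the real constructor parameters listed above.

```python
from functools import partial

# Hypothetical stand-in for wandb.fastai.WandbCallback: it simply returns the
# keyword options it was configured with, so the binding pattern can be shown
# without training a real model.
def show_config(learn=None, **options):
    return options

# Bind the callback options up front, exactly as you would with the real
# WandbCallback before passing it to cnn_learner via callback_fns.
wandb_cb = partial(
    show_config,
    log="parameters",     # also log parameter histograms
    save_model=True,      # checkpoint and restore the best model
    monitor="accuracy",   # metric used to select the best model
    mode="max",           # higher accuracy is better
    input_type="images",  # enable sample image predictions
    predictions=16,       # number of sample predictions to log
)

print(wandb_cb())  # the options the learner's callback would receive
```

With the real callback, replace `show_config` with `WandbCallback` and pass the resulting partial to `cnn_learner(data, model, callback_fns=...)`.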