Hugging Face

Hugging Face Transformers provides general-purpose architectures for Natural Language Understanding (NLU) and Natural Language Generation (NLG) with pretrained models in 100+ languages and deep interoperability between TensorFlow 2.0 and PyTorch.

To get training logged automatically, just install the library and log in:

pip install wandb
wandb login

The Trainer or TFTrainer will automatically log losses, evaluation metrics, model topology and gradients.
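Below is a minimal sketch of a Trainer run that W&B picks up once you are logged in. The model, dataset, output path, and hyperparameters are illustrative, and it assumes the datasets library is installed; on recent transformers versions you can also pass report_to="wandb" in TrainingArguments to make the logging explicit.

from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Illustrative model and dataset; swap in your own.
model_name = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

dataset = load_dataset("imdb")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="./results",          # illustrative output path
    num_train_epochs=1,
    per_device_train_batch_size=16,
    logging_steps=50,                # how often training loss is reported (and streamed to W&B)
    run_name="distilbert-imdb",      # becomes the run name in the W&B dashboard
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"].shuffle(seed=42).select(range(2000)),  # small slice to keep the sketch fast
    eval_dataset=dataset["test"].shuffle(seed=42).select(range(500)),
)

trainer.train()     # losses stream to W&B as training runs
trainer.evaluate()  # evaluation metrics are logged to the same run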

Advanced configuration is possible through wandb environment variables.

Additional variables are available with transformers (a usage example follows the list):

Environment Variables
WANDB_WATCH

  • gradients (default): Log histograms of the gradients

  • all: Log histograms of gradients and parameters

  • false: No gradient or parameter logging

WANDB_DISABLED

  • boolean: Set to true to disable logging entirely
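
For example, you might set these variables in the shell before launching a run; the script name below is just a placeholder for your usual transformers training script.

# Log both gradient and parameter histograms for this run
export WANDB_WATCH=all
python train.py

# Or turn off W&B logging entirely
export WANDB_DISABLED=true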

Examples

We've created a few examples to show how the integration works in practice.

Feedback

We'd love to hear feedback and we're excited to improve this integration. Contact us with any questions or suggestions.

Visualize Results

Explore your results dynamically in the W&B Dashboard. It's easy to look across dozens of experiments, zoom in on interesting findings, and visualize high-dimensional data.

Here's an example comparing BERT vs. DistilBERT — with automatic line plot visualizations, it's easy to see how different architectures affect the evaluation accuracy throughout training.