Hugging Face Transformers provides general-purpose architectures for Natural Language Understanding (NLU) and Natural Language Generation (NLG) with pretrained models in 100+ languages and deep interoperability between TensorFlow 2.0 and PyTorch.
To get training logged automatically, just install the library and log in:
pip install wandb
wandb login
The Trainer (or TFTrainer for TensorFlow) will automatically log losses, evaluation metrics, model topology, and gradients.
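For instance, here is a minimal sketch of a Trainer run that reports to W&B. The model, dataset, and hyperparameters (distilbert-base-uncased, a 1% slice of IMDB, a single epoch) are illustrative assumptions, not prescriptions from this guide:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")

# Small slice of a public dataset to keep the demo run short
train_data = load_dataset("imdb", split="train[:1%]")
train_data = train_data.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length"),
    batched=True,
)

args = TrainingArguments(
    output_dir="out",
    report_to="wandb",               # route Trainer logs to Weights & Biases
    run_name="distilbert-imdb-demo", # illustrative run name shown in the dashboard
    logging_steps=10,
    num_train_epochs=1,
)

trainer = Trainer(model=model, args=args, train_dataset=train_data)
trainer.train()  # losses stream to your W&B project as training runs
```

With wandb installed and logged in, no further code is needed: the Trainer's built-in W&B callback picks up the run automatically.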
Advanced configuration is possible through wandb environment variables.
Additional variables are available with transformers:
WANDB_DISABLED (boolean): Set to true to disable logging entirely
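As a short sketch, these variables can be set in Python before training starts (the WANDB_PROJECT value here is an illustrative name, not one from this guide):

```python
import os

# Set environment variables before the Trainer creates its W&B callback
os.environ["WANDB_PROJECT"] = "huggingface-demo"  # illustrative project name
# os.environ["WANDB_DISABLED"] = "true"           # uncomment to disable logging entirely
```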
We've created a few examples for you to see how the integration works:
Run in Colab: A simple notebook example to get you started
A step-by-step guide: track your Hugging Face model performance
Does model size matter? A comparison of BERT and DistilBERT
We'd love to hear feedback and we're excited to improve this integration. Contact us with any questions or suggestions.
Explore your results dynamically in the W&B Dashboard. It's easy to look across dozens of experiments, zoom in on interesting findings, and visualize high-dimensional data.
Here's an example comparing BERT vs. DistilBERT: with automatic line plot visualizations, it's easy to see how different architectures affect evaluation accuracy throughout training.