Use W&B to manage hyperparameter sweeps, which efficiently search for the best version of your model. This feature is not currently supported on Windows.
In your project repo, initialize your project from the command line:

wandb init
The sweep configuration file specifies your training script, parameter ranges, search strategy and stopping criteria.
Here's an example config file:
program: train.py
method: bayes
metric:
  name: val_loss
  goal: minimize
parameters:
  learning-rate:
    min: 0.001
    max: 0.1
  optimizer:
    values: ["adam", "sgd"]
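To make the search space concrete, here is a rough sketch in plain Python (no W&B calls; the `SEARCH_SPACE` dict and `sample_params` helper are made up for illustration) of how an agent could draw one candidate hyperparameter set from the config above. The `bayes` method actually models past results to pick promising candidates; this sketch uses simple random sampling just to show the shape of the space.

```python
import random

# Inline copy of the parameter section of the sweep config above (hypothetical).
SEARCH_SPACE = {
    "learning-rate": {"min": 0.001, "max": 0.1},
    "optimizer": {"values": ["adam", "sgd"]},
}

def sample_params(space):
    """Draw one candidate: continuous ranges uniformly, discrete lists by choice."""
    params = {}
    for name, spec in space.items():
        if "values" in spec:
            # Discrete parameter: pick one of the listed values.
            params[name] = random.choice(spec["values"])
        else:
            # Continuous parameter: sample uniformly between min and max.
            params[name] = random.uniform(spec["min"], spec["max"])
    return params

candidate = sample_params(SEARCH_SPACE)
```

Each training run then receives one such candidate; with `method: bayes` and `goal: minimize`, runs with lower `val_loss` steer which candidates are tried next.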
Run this from the command line to get a SWEEP_ID and a URL to track all your runs.
wandb sweep sweep.yaml # prints out SWEEP_ID.
Run one or more wandb agents with the SWEEP_ID. Agents will request parameters from the parameter server and launch your training script.
wandb agent SWEEP_ID