Ray Tune Sweeps

✨BETA✨ support for the Ray Tune search and scheduler APIs in W&B Sweeps

Ray Tune is a scalable hyperparameter tuning library. We're adding support for Tune to W&B Sweeps, which makes it easy to launch runs on many machines and visualize results in a central place.

This feature is in beta! We love feedback, and we really appreciate hearing from folks who are experimenting with our Sweeps product.

Here's a quick example:

import wandb
from wandb.sweeps.config import tune
from wandb.sweeps.config.tune.suggest.hyperopt import HyperOptSearch
from wandb.sweeps.config.hyperopt import hp

# Define the sweep: run train.py with a HyperOpt search over three parameters
tune_config = tune.run(
    "train.py",
    search_alg=HyperOptSearch(
        dict(
            width=hp.uniform("width", 0, 20),
            height=hp.uniform("height", -100, 100),
            activation=hp.choice("activation", ["relu", "tanh"])),
        metric="mean_loss",
        mode="min"),
    num_samples=10)

# Save sweep as yaml config file
tune_config.save("sweep-hyperopt.yaml")

# Create the sweep
wandb.sweep(tune_config)

See full example on GitHub →
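
Each trial launched by this sweep runs train.py, which reads its hyperparameters from wandb.config and logs the metric named in the sweep config (mean_loss). The real script is in the GitHub example above; the sketch below is a minimal, hypothetical stand-in with a made-up objective, just to show the shape of the script:

import wandb

# Minimal, hypothetical train.py: read sweep parameters and log mean_loss
def train():
    wandb.init()
    config = wandb.config

    # Made-up objective so the sweep has something to minimize
    penalty = 0.0 if config.activation == "relu" else 1.0
    mean_loss = (config.height - 14) ** 2 - abs(config.width) + penalty

    wandb.log({"mean_loss": mean_loss})

if __name__ == "__main__":
    train()

Once the sweep is created, start one or more agents with `wandb agent <SWEEP_ID>` to begin running trials.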

Feature Compatibility

Search Algorithms

Ray/Tune Search Algorithms

| Search Algorithm | Support |
| --- | --- |
| HyperOpt | Supported |
| Grid Search and Random Search | Partial |
| BayesOpt | Planned |
| Nevergrad | Planned |
| Scikit-Optimize | Planned |
| Ax | Planned |
| BOHB | Planned |

HyperOpt

| HyperOpt Feature | Support |
| --- | --- |
| hp.choice | Supported |
| hp.randint | Planned |
| hp.pchoice | Planned |
| hp.uniform | Supported |
| hp.uniformint | Planned |
| hp.quniform | Planned |
| hp.loguniform | Supported |
| hp.qloguniform | Planned |
| hp.normal | Planned |
| hp.qnormal | Planned |
| hp.lognormal | Planned |
| hp.qlognormal | Planned |
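
For example, a search space that sticks to the currently supported expressions (including hp.loguniform, which the example above doesn't use) might look like the sketch below; the parameter names and bounds are made up:

from wandb.sweeps.config.hyperopt import hp

# Hypothetical search space using only the supported expressions
space = dict(
    learning_rate=hp.loguniform("learning_rate", -9.2, 0.0),  # exp(-9.2) ≈ 1e-4 up to exp(0) = 1
    dropout=hp.uniform("dropout", 0.0, 0.5),
    optimizer=hp.choice("optimizer", ["adam", "sgd"]))

Such a dict would be passed to HyperOptSearch in place of the one in the example above.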

Tune Schedulers

By default, Tune schedules runs in serial order. You can also specify a custom scheduling algorithm that can stop runs early or perturb parameters. Read more in the Tune docs →

| Scheduler | Support |
| --- | --- |
| Population Based Training (PBT) | Investigating |
| Asynchronous HyperBand | Planned |
| HyperBand | Investigating |
| HyperBand (BOHB) | Investigating |
| Median Stopping Rule | Investigating |
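
None of these schedulers can be used through a W&B sweep config yet, since support is planned or under investigation. For reference, this is roughly how a scheduler is attached in Ray Tune itself; the trainable, search space, and scheduler settings below are illustrative only:

from ray import tune
from ray.tune.schedulers import AsyncHyperBandScheduler

# Hypothetical trainable that reports the metric the scheduler watches
def trainable(config):
    for step in range(100):
        tune.report(mean_loss=(config["width"] - step) ** 2)

# Stop underperforming trials early instead of running each one to completion
scheduler = AsyncHyperBandScheduler(metric="mean_loss", mode="min", max_t=100)

tune.run(
    trainable,
    config={"width": tune.uniform(0, 20)},
    num_samples=10,
    scheduler=scheduler)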