Exploratory action

HYPE

HYPErparameter-Free Optimization Algorithms by Online Self-Tuning

In machine learning, practitioners struggle to tune the hyperparameters of the optimization algorithms used to train models, even though model performance depends heavily on this tuning. To improve model training, this project aims to design the first optimization algorithms that dynamically self-tune all of their hyperparameters during execution.
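As a rough illustration of what online self-tuning can look like (this is not the project's algorithm, and all names below are hypothetical), the sketch below uses hypergradient descent: the learning rate of gradient descent is itself updated at every step, using the inner product of consecutive gradients as an estimate of the loss's sensitivity to the learning rate.

```python
import numpy as np

def hypergradient_descent(grad, x0, lr0=0.01, beta=1e-4, steps=200):
    """Minimize a function from its gradient `grad`, while adapting
    the learning rate `lr` online via its hypergradient.

    Illustrative sketch only: one simple instance of online
    self-tuning, not the method developed in the HYPE project.
    """
    x = np.asarray(x0, dtype=float)
    lr = lr0
    g_prev = np.zeros_like(x)
    for _ in range(steps):
        g = grad(x)
        # The derivative of the loss w.r.t. the learning rate is
        # -<g_t, g_{t-1}>, so gradient descent on lr adds
        # beta * <g_t, g_{t-1}>.
        lr += beta * float(g @ g_prev)
        x -= lr * g          # usual gradient step with the adapted lr
        g_prev = g
    return x, lr

# Usage: minimize f(x) = ||x||^2 / 2, whose gradient is x itself.
x_star, lr_final = hypergradient_descent(lambda x: x, x0=[3.0, -2.0])
```

The only remaining constants (`lr0`, `beta`) are far less sensitive than a fixed learning rate, which is the spirit of hyperparameter-free optimization: residual constants whose exact values barely matter.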

Inria teams involved

MALT

Contacts

Paul Viallard

Scientific leader