Nevergrad
A Python toolbox for performing gradient-free optimization
It targets hyperparameter search, architecture search, control problems, and experimental tuning: domains in which gradient-based methods may fail or be inapplicable. The library provides a simple interface for defining an optimization problem (parameter space, loss function, budget) and experimenting with multiple strategies, including evolutionary algorithms, Bayesian optimization, bandit methods, and genetic algorithms. Nevergrad supports parallel evaluation, budget management, and cost/resource constraints, allowing it to scale to nontrivial optimization problems. It also includes visualization and benchmarking tools to compare strategy performance, track parameter evolution, and detect stagnation.
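A minimal sketch of that workflow on a recent Nevergrad release; the two-dimensional loss function and the bound choices are illustrative, not part of the library:

```python
import nevergrad as ng

def loss(x: float, y: float) -> float:
    # Illustrative objective with its minimum at x=1, y=-0.5.
    return (x - 1.0) ** 2 + (y + 0.5) ** 2

# Parameter space: two bounded scalars, passed to the loss as keyword arguments.
parametrization = ng.p.Instrumentation(
    x=ng.p.Scalar(lower=-5.0, upper=5.0),
    y=ng.p.Scalar(lower=-5.0, upper=5.0),
)

# NGOpt is Nevergrad's meta-optimizer; other registered strategies
# (e.g. CMA, TwoPointsDE) can be looked up by name in ng.optimizers.registry.
optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=200)
recommendation = optimizer.minimize(loss)
print(recommendation.kwargs)  # best parameters found within the budget
```

For parallel or custom scheduling, the optimizer also exposes an ask/tell interface, and `minimize` accepts a `concurrent.futures` executor together with a `num_workers` setting on the optimizer.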