All Types of Hyperparameter Tuning Every Data Scientist and Aspirant Needs to Know

Subhash Achutha
2 min read · Dec 31, 2021


Reference collection: https://github.com/balavenkatesh3322/hyperparameter_tuning

Manual search (tuning parameters by hand, guided by intuition and experiment)

a. GridSearchCV (tries every combination in the given parameter grid, so it can take a long time)
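As a minimal sketch of grid search with scikit-learn (using SVC on the iris data as a stand-in model, with a made-up two-parameter grid):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# 3 values of C x 2 values of gamma = 6 candidates, each fit cv times
param_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01]}

search = GridSearchCV(SVC(), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

Because every combination is evaluated, cost grows multiplicatively with each parameter you add to the grid.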

HalvingGridSearchCV (grid search with successive halving, often much faster) https://towardsdatascience.com/11-times-faster-hyperparameter-tuning-with-halvinggridsearch-232ed0160155 https://towardsdatascience.com/faster-hyperparameter-tuning-with-scikit-learn-71aa76d06f12
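A sketch of the halving variant on the same kind of toy setup (note the experimental import, which scikit-learn still requires for this estimator; the grid values here are illustrative):

```python
from sklearn.experimental import enable_halving_search_cv  # noqa: F401
from sklearn.model_selection import HalvingGridSearchCV
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
param_grid = {"max_depth": [2, 4, 8], "n_estimators": [20, 50]}

# Successive halving: all candidates start on a small sample budget,
# and only the top 1/factor survive each round with more resources.
search = HalvingGridSearchCV(
    RandomForestClassifier(random_state=0), param_grid,
    factor=2, cv=3, random_state=0)
search.fit(X, y)
print(search.best_params_)
```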

b. RandomizedSearchCV (samples parameter settings at random, which narrows down search time) with Scikit-learn, Scikit-Optimize, and Hyperopt
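A minimal scikit-learn sketch, sampling 15 random points from log-uniform distributions (15 is an arbitrary budget for illustration; in practice set n_iter to your time budget):

```python
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Distributions instead of fixed grids: each trial draws a random point.
param_distributions = {"C": loguniform(1e-2, 1e2),
                       "gamma": loguniform(1e-4, 1e0)}

search = RandomizedSearchCV(SVC(), param_distributions,
                            n_iter=15, cv=3, random_state=0)
search.fit(X, y)
print(search.best_params_)
```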

HalvingRandomSearchCV (randomized search with successive halving)

c. Bayesian Optimization: Bayes search, Hyperband, and BOHB

Variants: Bayesian search with Gaussian processes, Bayesian search with Random Forests, and Bayesian search with GBMs

Bayesian Optimization Using BoTorch https://analyticsindiamag.com/guide-to-bayesian-optimization-using-botorch/

hyperparameter optimization https://github.com/LiYangHart/Hyperparameter-Optimization-of-Machine-Learning-Algorithms

Hyperopt and Hyperas https://www.kdnuggets.com/2018/12/keras-hyperparameter-tuning-google-colab-hyperas.html

Hyperparameter tuning with scikit-optimize's BayesSearchCV

HpBandSter https://github.com/automl/HpBandSter hpsklearn https://medium.com/mlearning-ai/automatic-hyperparameter-optimization-6a1692c2ebee

hypopt https://github.com/cgnorthcutt/hypopt https://medium.com/mlearning-ai/automatic-hyperparameter-optimization-6a1692c2ebee

HiPlot https://analyticsindiamag.com/this-new-tool-helps-developers-in-effective-hyperparameter-tuning/

OCTIS https://github.com/mind-lab/octis

hyperband https://neptune.ai/blog/hyperband-and-bohb-understanding-state-of-the-art-hyperparameter-optimization-algorithms

Spearmint https://github.com/JasperSnoek/spearmint/

NeuPy http://neupy.com/2016/12/17/hyperparameter_optimization_for_neural_networks.html#id24

Vizier (Google's black-box optimization service)

ConfigSpace https://automl.github.io/ConfigSpace/master/ https://towardsdatascience.com/tuning-xgboost-with-xgboost-writing-your-own-hyper-parameters-optimization-engine-a593498b5fba

NatureInspiredSearchCV https://github.com/timzatko/Sklearn-Nature-Inspired-Algorithms

d. Sequential Model-Based Optimization (tuning a scikit-learn estimator with skopt)

e. Optuna https://analyticsindiamag.com/hands-on-python-guide-to-optuna-a-new-hyperparameter-optimization-tool/

f. Genetic Algorithms

darwin-mendel Genetic Algorithm for Hyper-Parameter Tuning https://manishagrawal-datascience.medium.com/genetic-algorithm-for-hyper-parameter-tuning-1ca29b201c08
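To show the idea without depending on any particular library (this is a toy sketch in plain Python, not the darwin-mendel API; the fitness function is a made-up stand-in for a cross-validated score):

```python
import random

random.seed(0)

def fitness(params):
    # Stand-in for a CV score; best at lr=0.1, depth=6 by construction.
    lr, depth = params
    return -((lr - 0.1) ** 2) - 0.01 * (depth - 6) ** 2

def mutate(params):
    # Small random perturbation of each "gene", clamped to valid ranges.
    lr, depth = params
    return (min(1.0, max(1e-4, lr + random.gauss(0, 0.05))),
            min(12, max(1, depth + random.choice([-1, 0, 1]))))

def crossover(a, b):
    return (a[0], b[1])  # child takes lr from one parent, depth from the other

population = [(random.uniform(1e-4, 1.0), random.randint(1, 12))
              for _ in range(20)]

for generation in range(30):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]  # selection: keep the fitter half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(10)]
    population = parents + children

best = max(population, key=fitness)
print(best)
```

Library implementations add tournament selection, multi-point crossover, and parallel fitness evaluation, but the loop above is the core of the method.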

g. Keras Tuner (Random Search, Hyperband, Bayesian Optimization, and Hyperas) https://sukanyabag.medium.com/automated-hyperparameter-tuning-with-keras-tuner-and-tensorflow-2-0-31ec83f08a62

Keras Hyperparameter Tuning with aisaratuners Library https://aisaradeepwadi.medium.com/advance-keras-hyperparameter-tuning-with-aisaratuners-library-78c488ab4d6a

storm-tuner https://github.com/ben-arnao/StoRM https://medium.com/geekculture/finding-best-hyper-parameters-for-deep-learning-model-4df7a17546c2

Hyperas https://towardsdatascience.com/automating-hyperparameter-tuning-of-keras-model-4fe69b8dedee

Deep AutoViML https://github.com/AutoViML/deep_autoviml

h. Scikit-Optimize, Optuna, Hyperopt, and multi-fidelity optimization

i. ray[tune] and aisaratuners https://towardsdatascience.com/choosing-a-hyperparameter-tuning-library-ray-tune-or-aisaratuners-b707b175c1d7

Ray Tune docs https://docs.ray.io/en/master/tune/index.html

k. model_search https://github.com/google/model_search https://analyticsindiamag.com/hands-on-guide-to-model-search-a-tensorflow-based-framework-for-automl/

Optimize machine learning models with the TensorFlow Model Optimization Toolkit https://www.tensorflow.org/model_optimization

Milano https://github.com/NVIDIA/Milano

Tree-structured Parzen Estimators (TPE)

Hyperparameter Tuning with the HParams Dashboard

baytune https://www.kdnuggets.com/2021/03/automating-machine-learning-model-optimization.html

Dragonfly https://analyticsindiamag.com/guide-to-scalable-and-robust-bayesian-optimization-with-dragonfly/

Pywedge https://www.analyticsvidhya.com/blog/2021/02/interactive-widget-based-hyperparameter-tuning-and-tracking-in-pywedge/

CapsNet Hyperparameter Tuning with Keras https://towardsdatascience.com/scikeras-tutorial-a-multi-input-multi-output-wrapper-for-capsnet-hyperparameter-tuning-with-keras-3127690f7f28

GPyTorch: A Python Library For Gaussian Process Models https://analyticsindiamag.com/guide-to-gpytorch-a-python-library-for-gaussian-process-models/

Auto-PyTorch https://github.com/automl/Auto-PyTorch

l. SMAC https://www.automl.org/automated-algorithm-design/algorithm-configuration/smac/ https://towardsdatascience.com/automl-for-fast-hyperparameters-tuning-with-smac-4d70b1399ce6

m. Faster hyperparameter tuning with sklearn-nature-inspired-algorithms https://pypi.org/project/sklearn-nature-inspired-algorithms/

n. Talos: neural network and hyperparameter optimization https://www.analyticsvidhya.com/blog/2021/05/neural-network-and-hyperparameter-optimization-using-talos/

https://towardsdatascience.com/10-hyperparameter-optimization-frameworks-8bc87bc8b7e3

https://mlwhiz.com/blog/2020/02/22/hyperspark/?utm_campaign=100x-faster-hyperparameter-search-framework-with-pyspark&utm_medium=social_link&utm_source=missinglettr

DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective https://github.com/microsoft/DeepSpeed

o. shap-hypetune https://github.com/cerlymarco/shap-hypetune https://towardsdatascience.com/shap-for-feature-selection-and-hyperparameter-tuning-a330ec0ea104

mlmachine, Polyaxon, BayesianOptimization, Talos, SHERPA, Scikit-Optimize, GPyOpt

p. Hyperactive https://github.com/SimonBlanke/Hyperactive

Hyperopt, Optuna, Ray Tune, Scikit-Optimize, SMAC

HyperOpt http://hyperopt.github.io/hyperopt/ Optuna https://optuna.org/ Scikit-optimize https://scikit-optimize.github.io/stable/ SigOpt https://sigopt.com/

Check my repository for more details.
