Oct 15, 2016

XGBoost bayesian hyperparameter tuning with bayes_opt in Python

Hey guys,

I just wanted to quickly share how I've been optimizing hyperparameters in XGBoost using bayes_opt.


It runs k-fold cross-validation for each candidate parameter set, so the optimization favors parameters that score well consistently across the folds.
Keep in mind that bayes_opt maximizes the objective function, so if your evaluation metric is a loss you'll need to flip its sign, and adjust the other hardcoded values to fit your problem. It's pretty compact, so I thought I'd just leave it here for your convenience as a gist.
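
Here's a minimal sketch of the idea, assuming a binary classification task scored by AUC and the bayes_opt API of the time (later versions moved the acq, kappa, and xi keywords out of maximize); the toy dataset, parameter bounds, and the xgb_cv helper are illustrative placeholders rather than the gist's exact settings:

    import xgboost as xgb
    from bayes_opt import BayesianOptimization
    from sklearn.datasets import make_classification

    # Toy data as a stand-in for your own training set.
    X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
    dtrain = xgb.DMatrix(X, label=y)

    def xgb_cv(max_depth, min_child_weight, subsample, colsample_bytree, gamma):
        # One k-fold cross-validation run per candidate parameter set;
        # returns the score that bayes_opt will maximize.
        params = {
            'objective': 'binary:logistic',
            'eval_metric': 'auc',
            'eta': 0.1,
            'max_depth': int(max_depth),  # bayes_opt proposes floats
            'min_child_weight': min_child_weight,
            'subsample': subsample,
            'colsample_bytree': colsample_bytree,
            'gamma': gamma,
        }
        cv_result = xgb.cv(params, dtrain, num_boost_round=200, nfold=5,
                           seed=42, early_stopping_rounds=10)
        # AUC is higher-is-better, so return it as-is; for a loss such as
        # RMSE you would return its negative instead.
        return cv_result['test-auc-mean'].iloc[-1]

    bo = BayesianOptimization(xgb_cv, {
        'max_depth': (3, 10),
        'min_child_weight': (1, 20),
        'subsample': (0.5, 1.0),
        'colsample_bytree': (0.3, 1.0),
        'gamma': (0.0, 1.0),
    })

    # Many random probes first, then a few model-guided steps using
    # expected improvement ('ei') as the acquisition function.
    # kappa only affects the 'ucb' acquisition; with 'ei' it's xi that applies.
    bo.maximize(init_points=50, n_iter=5, acq='ei', kappa=2, xi=0.0)

Roughly: the init_points random evaluations seed the Gaussian-process surrogate, then n_iter acquisition-guided evaluations refine it; acq picks the acquisition function, and xi (for 'ei') and kappa (for 'ucb') tune how strongly it favors exploring uncertain regions over exploiting known good ones.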

Cheers,
Thomas

4 comments:

  1. Thank you so much, Thomas. Your code makes Bayesian optimization clear.

  2. Thanks a lot! It is a very nice example.

  3. Hello Thomas. Could you explain if and how this approach is better than the 'hyperopt' approach?

  4. Why did you choose
    init_points = 50, n_iter = 5?

    What do
    kappa = 2,
    acq = "ei", and
    xi = 0.0 mean?

    Thanks!
