Hey guys,

I just wanted to quickly share how I've been optimizing XGBoost hyperparameters using bayes_opt.

It runs k-fold cross-validation inside the objective function, which helps the optimizer settle on stable parameters rather than ones that only look good on a single split.

Keep in mind that bayes_opt maximizes the objective function, so adjust the hardcoded values accordingly to fit your problem (e.g. return the negative of any loss metric). It's pretty compact, so I thought I'd just leave it here as a gist for your convenience.

Cheers,

Thomas

Thank you so much, Thomas. Your code makes Bayesian optimization clear.

Thanks a lot! It is a very nice example.

Hello Thomas. Could you explain if and how this approach is better than the 'hyperopt' approach?

Why did you choose init_points = 50 and n_iter = 5?

And what do kappa = 2, acq = "ei", and xi = 0.0 mean?

Thanks!