Comments on Thomas Jungblut's Blog: "XGBoost bayesian hyperparameter tuning with bayes_opt in Python"

Anonymous (2017-11-12):
Why did you choose init_points = 50, n_iter = 5? And what do kappa = 2, acq = "ei", and xi = 0.0 mean? Thanks!

Anonymous (2017-10-31):
Hello Thomas. Could you explain if and how this approach is better than the 'hyperopt' approach?

Archnus (2017-10-17):
Thanks a lot! It is a very nice example.

byrony (2016-10-23):
Thank you so much Thomas. Your code makes the Bayesian Optimization clear.
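The parameters asked about in the first comment map onto bayes_opt's `maximize(init_points=50, n_iter=5, acq="ei", kappa=2, xi=0.0)` call: 50 random probes to seed the surrogate model, then 5 steps guided by an acquisition function ("ei" = Expected Improvement). The sketch below is not the post's code; it is a minimal, self-contained illustration of what `kappa` and `xi` conventionally do in the two standard acquisition functions (UCB and EI), using only the Python standard library:

```python
import math

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    """Standard normal cumulative distribution, via erf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ucb(mu, sigma, kappa=2.0):
    # Upper Confidence Bound: kappa scales how much the surrogate's
    # uncertainty (sigma) is rewarded; larger kappa -> more exploration.
    # kappa is only used when acq="ucb".
    return mu + kappa * sigma

def expected_improvement(mu, sigma, best_so_far, xi=0.0):
    # Expected Improvement over the incumbent best value; xi > 0 demands
    # an extra margin over the best, which also pushes toward exploration.
    # xi = 0.0 (as in the post) is the plain, greedy-leaning default.
    if sigma == 0.0:
        return 0.0
    z = (mu - best_so_far - xi) / sigma
    return (mu - best_so_far - xi) * norm_cdf(z) + sigma * norm_pdf(z)
```

With this reading, `init_points = 50, n_iter = 5` is a heavy-exploration setup: most of the budget goes to random sampling, and only a few points are picked by the acquisition function. Whether that split is optimal depends on how expensive each objective evaluation is.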