Hey guys, looking to get some feedback on my approach.
I am using TPOT, an AutoML tool, to tune the hyperparameters of an XGBoost model. It evolves a population of candidate pipelines via genetic programming: each generation, pipelines reproduce or die off based on their cross-validation score, producing the next generation, and so on.
Training is done on the full training set and cross-validated using @mdo's custom TimeSeriesSplitGroups.
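For anyone not familiar with it, the idea behind a grouped time-series split is an expanding window over ordered groups (eras), so validation folds always lie strictly after their training folds. Below is a minimal sketch of that idea; it is my own illustration, not @mdo's exact implementation, and the function name is made up:

```python
import numpy as np


def time_series_split_groups(groups, n_splits=4):
    """Expanding-window CV over ordered groups (e.g. Numerai eras).

    Each fold trains on the first k blocks of groups and validates on
    the next block, so no validation era ever precedes a training era.
    """
    unique = np.unique(groups)  # np.unique returns the groups sorted
    blocks = np.array_split(unique, n_splits + 1)
    for i in range(n_splits):
        train_groups = np.concatenate(blocks[: i + 1])
        test_groups = blocks[i + 1]
        train_idx = np.flatnonzero(np.isin(groups, train_groups))
        test_idx = np.flatnonzero(np.isin(groups, test_groups))
        yield train_idx, test_idx


# Toy example: 5 eras with 2 rows each
groups = np.repeat(np.arange(5), 2)
splits = list(time_series_split_groups(groups, n_splits=4))
```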
A quick example notebook can be found here: https://www.kaggle.com/jorijnsmit/xgboost-parameter-tuning-using-genetic-programming.
My biggest question is about the parameter ranges I feed TPOT: am I going too broad here, or should I push them even further?
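For concreteness, TPOT accepts a custom `config_dict` mapping estimator paths to candidate parameter values, which is how the search space gets restricted to XGBoost. The ranges below are illustrative assumptions, not the values from my notebook:

```python
# Illustrative TPOT search space restricted to a single XGBoost regressor.
# All ranges here are example values, not recommendations.
tpot_config = {
    "xgboost.XGBRegressor": {
        "n_estimators": [100, 200, 500],
        "max_depth": range(3, 8),
        "learning_rate": [1e-3, 1e-2, 1e-1],
        "subsample": [0.5, 0.75, 1.0],
        "colsample_bytree": [0.5, 0.75, 1.0],
        "min_child_weight": range(1, 11),
    },
}

# Sketch of wiring this into TPOT (requires tpot to be installed;
# cv_splits would be the grouped time-series folds):
# from tpot import TPOTRegressor
# tpot = TPOTRegressor(generations=10, population_size=50,
#                      config_dict=tpot_config, cv=cv_splits, n_jobs=-1)
# tpot.fit(X_train, y_train)
```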