Spotfire Ideas Portal

tree model tuning

For tree-based models (RF and GBM), it would be advantageous to be able to tune the hyperparameters. Currently the optimal number of trees is determined using stopping parameters. I am specifically referring to the ability to tune the learning rate, holdout percentage, maximum number of trees, etc.
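The current behavior described above, picking the tree count from stopping parameters, can be illustrated with a minimal sketch. Spotfire's internal stopping rule is not public, so the function, the `patience`/`tolerance` parameters, and the error curve below are all hypothetical stand-ins:

```python
# Hypothetical sketch of a stopping rule: training halts once the
# validation error has not improved by `tolerance` for `patience`
# consecutive boosting rounds. The error curve is simulated, not
# real model output.

def optimal_tree_count(val_errors, patience=3, tolerance=1e-4):
    """Return the 1-based round with the best validation error,
    scanning until `patience` rounds pass without improvement."""
    best_err, best_round, stall = float("inf"), 0, 0
    for i, err in enumerate(val_errors, start=1):
        if err < best_err - tolerance:
            best_err, best_round, stall = err, i, 0
        else:
            stall += 1
            if stall >= patience:
                break
    return best_round

# Simulated validation-error curve: improves, then plateaus.
errors = [0.40, 0.31, 0.25, 0.22, 0.21, 0.209, 0.2095, 0.2093, 0.2094]
print(optimal_tree_count(errors))  # -> 6
```

Note that this only selects the number of trees; the request above is about searching over the other hyperparameters as well.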

    • Guest
      May 2, 2019

I just attended a talk where, based on the early results of multiple experiments, poor performers are dropped and only a few candidates are trained to completion. Maybe that idea could be implemented here as well: the grid search could narrow down which combination of tuning parameters is optimal, and then that model could be run to completion to find the optimal number of trees.
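The scheme this comment describes is essentially successive halving: score all candidates on a small budget, keep the best half, double the budget, and repeat. A stdlib-only sketch follows; `early_score` is a hypothetical stand-in for "train briefly and measure validation accuracy", not a Spotfire API:

```python
# Minimal successive-halving sketch. `early_score` is a stub that
# pretends learning rates near 0.1 perform best; a real implementation
# would fit the actual model on the given budget.

def early_score(params, budget):
    # Stub: higher budget helps, and accuracy peaks near lr = 0.1.
    lr = params["learning_rate"]
    return budget * (1.0 - abs(lr - 0.1))

def successive_halving(candidates, rounds=3, budget=1):
    survivors = list(candidates)
    for _ in range(rounds):
        if len(survivors) == 1:
            break
        ranked = sorted(survivors,
                        key=lambda p: early_score(p, budget),
                        reverse=True)
        survivors = ranked[: max(1, len(ranked) // 2)]  # drop the worst half
        budget *= 2  # give survivors more training budget
    return survivors[0]

grid = [{"learning_rate": lr} for lr in (0.01, 0.05, 0.1, 0.3)]
print(successive_halving(grid))  # -> {'learning_rate': 0.1}
```

Only the surviving configuration is then trained to completion, which is what keeps the search cheap compared to running every combination fully.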

    • Guest
      May 2, 2019

For clarification: obviously we can manually tune the parameters by changing one at a time. This request is about the ability to automatically search across some defined space of parameters, show the results of the search, and output the best model. The ability to loop the analysis over the defined hyperparameter space is the missing piece right now.
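The missing loop the comment describes is a plain grid search over the defined space. A stdlib-only sketch, where `evaluate` is a hypothetical placeholder for "run the analysis once with these settings and return a validation score" (the parameter names are taken from the original request, not from Spotfire's configuration):

```python
# Grid-search loop sketch: enumerate every combination in the defined
# hyperparameter space, score each, and keep the best. `evaluate` is a
# stub, not a real training run.
import itertools

space = {
    "learning_rate": [0.01, 0.1],
    "holdout_fraction": [0.2, 0.3],
    "max_trees": [100, 500],
}

def evaluate(params):
    # Stub scoring that favors lr=0.1, holdout=0.2, max_trees=500.
    return (params["learning_rate"] * 10
            - params["holdout_fraction"]
            + params["max_trees"] / 1000)

keys = list(space)
results = []
for combo in itertools.product(*(space[k] for k in keys)):
    params = dict(zip(keys, combo))
    results.append((evaluate(params), params))

best_score, best_params = max(results, key=lambda r: r[0])
print(best_params)
# -> {'learning_rate': 0.1, 'holdout_fraction': 0.2, 'max_trees': 500}
```

Keeping all of `results` (not just the winner) would also cover the request to show the results of the search, not only the final model.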