This document (including, without limitation, any product roadmap or statement of direction data) illustrates the planned testing, release, and availability dates for Spotfire products and services. It is for informational purposes only, and its contents are subject to change without notice. "Planning to Implement" generally means 6-12 months out; "Likely to Implement" generally means 12-18 months out.
Copyright © 2014-2023 Cloud Software Group, Inc. All Rights Reserved.
Cloud Software Group, Inc. ("Company") follows the EU Standard Contractual Clauses as per the Company's Data Processing Agreement.
I just attended a talk describing an approach where, based on the early results of multiple experiments, you drop the poor performers and train only a few to completion. Maybe this idea could be implemented here as well: the grid search could narrow down which combination of tuning parameters is optimal, and then you could run that model to completion to find the optimal number of trees.
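The early-stopping idea above (often called successive halving) could be sketched roughly like this in Python. This is a minimal illustration, not a Spotfire feature: the `evaluate` callback, the `learning_rate` parameter, and the budget doubling are all hypothetical assumptions.

```python
def successive_halving(configs, evaluate, budget=10, keep_frac=0.5):
    """Repeatedly evaluate all surviving configs, dropping the worst.

    `evaluate(config, budget)` is a hypothetical callback returning a
    validation score for a model trained with `config` for `budget`
    units (e.g. number of trees). Only survivors get a larger budget.
    """
    survivors = list(configs)
    while len(survivors) > 1:
        # Score every surviving config at the current budget, best first.
        scored = sorted(survivors, key=lambda c: evaluate(c, budget), reverse=True)
        # Keep the top fraction (at least one) and double the budget.
        survivors = scored[: max(1, int(len(scored) * keep_frac))]
        budget *= 2
    return survivors[0]

def toy_evaluate(config, budget):
    # Toy score that peaks at learning_rate == 0.1 (purely illustrative).
    return -(config["learning_rate"] - 0.1) ** 2

configs = [{"learning_rate": lr} for lr in (0.001, 0.01, 0.1, 0.5)]
best = successive_halving(configs, toy_evaluate, budget=10)
# best -> {"learning_rate": 0.1}
```

The point is that only the surviving combinations ever receive the larger training budgets, so most of the compute goes to the promising candidates.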
For clarification: obviously we can tune the parameters manually by changing one at a time. This request is about the ability to automatically search across some defined space of parameters, show the results of the search, and output the resulting model. The ability to loop the analysis over the defined hyperparameter space is the missing piece right now.
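The requested loop over a defined hyperparameter space could be sketched like this. Again a hypothetical illustration, not an existing Spotfire API: the `fit_and_score` callback and the parameter names are assumptions.

```python
import itertools

def grid_search(param_grid, fit_and_score):
    """Exhaustively loop a model-fitting step over a hyperparameter grid.

    `param_grid` maps parameter names to lists of candidate values;
    `fit_and_score` is a hypothetical callback that trains a model with
    one combination and returns a score. Returns every (params, score)
    pair plus the best combination found.
    """
    names = sorted(param_grid)
    results = []
    # Cartesian product of all candidate values = the defined search space.
    for values in itertools.product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        results.append((params, fit_and_score(params)))
    best_params, _ = max(results, key=lambda r: r[1])
    return results, best_params

grid = {"max_depth": [2, 4], "n_trees": [50, 100]}
# Toy scorer: pretends deeper trees and more of them score higher.
results, best = grid_search(grid, lambda p: p["max_depth"] * p["n_trees"])
# best -> {"max_depth": 4, "n_trees": 100}; results holds all 4 trials
```

Surfacing the full `results` list (not just the winner) is what would let the search output be shown and compared, as the request describes.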