Available in: GBM, DRF, Deep Learning, GLM, GAM, AutoML, XGBoost, Isolation Forest
Use this option to stop model training when the option selected for stopping_metric doesn't improve for the specified number of training rounds, based on a simple moving average. For example, given the following options:

- stopping_rounds=3
- stopping_metric="misclassification"
- stopping_tolerance=1e-3

the moving average for the last 4 stopping rounds is calculated (the first moving average is the reference value that the other 3 moving averages are compared against).
The model will stop if the ratio between the best moving average and the reference moving average is greater than or equal to 1-1e-3 (misclassification is a lower-is-better metric; for higher-is-better metrics, the ratio has to be less than or equal to 1+1e-3 to stop).
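The moving-average comparison described above can be sketched in plain Python. This is a conceptual illustration of the rule, not H2O's actual implementation; the function names (`moving_averages`, `should_stop`) and the exact windowing are illustrative assumptions:

```python
def moving_averages(scores, window):
    # Simple moving averages over consecutive windows of the score history.
    return [sum(scores[i:i + window]) / window
            for i in range(len(scores) - window + 1)]

def should_stop(scores, stopping_rounds=3, tolerance=1e-3, lower_is_better=True):
    """Decide whether training should stop (conceptual sketch).

    The last stopping_rounds + 1 moving averages are compared: the earliest
    is the reference, and the best of the remaining stopping_rounds must
    improve on it by more than `tolerance`, otherwise training stops.
    """
    if stopping_rounds == 0:              # 0 disables early stopping
        return False
    avgs = moving_averages(scores, stopping_rounds)
    if len(avgs) < stopping_rounds + 1:
        return False                      # not enough scoring events yet
    window = avgs[-(stopping_rounds + 1):]
    reference, rest = window[0], window[1:]
    best = min(rest) if lower_is_better else max(rest)
    ratio = best / reference
    if lower_is_better:                   # e.g. misclassification
        return ratio >= 1 - tolerance
    return ratio <= 1 + tolerance         # e.g. AUC
```

For example, a misclassification history that has plateaued, such as `[0.30, 0.21, 0.21, 0.21, 0.21, 0.21, 0.21]`, triggers a stop with `stopping_rounds=3`, while a history that is still improving does not.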
These stopping options are used to increase performance by restricting the number of models that get built.
The default value for this option varies depending on the algorithm:

- stopping_rounds defaults to 0 (disabled)
- stopping_rounds defaults to 5
To disable this feature, specify 0. When enabled, the metric is computed on the validation data (if provided); otherwise, the training data is used.
When used with Deep Learning, you can also specify the overwrite_with_best_model option. When enabled, the final model is the best model generated during training. Keep in mind that stopping_rounds does not refer to epochs; it refers to the number of scoring events (which can only happen after each iteration).
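The effect of overwrite_with_best_model can be illustrated with a toy training loop. This is only a conceptual sketch of the semantics (return the best-scoring snapshot rather than the last one), not H2O's implementation; `train` and `score_fn` are illustrative names:

```python
def train(score_fn, n_iterations, overwrite_with_best_model=True):
    """Toy training loop: each iteration produces a 'model' (here just its
    iteration index) and a validation score from score_fn (lower is better).
    With overwrite_with_best_model, the best-scoring snapshot is returned
    instead of the last one."""
    best_model, best_score = None, float("inf")
    last_model = None
    for it in range(n_iterations):
        last_model = it                  # stand-in for the fitted model state
        score = score_fn(it)             # scoring event after this iteration
        if score < best_score:
            best_model, best_score = last_model, score
    return best_model if overwrite_with_best_model else last_model
```

If the validation score bottoms out partway through training and then degrades, the flag determines whether you get the model from the best scoring event or from the final iteration.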
Notes: If cross-validation is enabled:
All cross-validation models stop training when the validation metric doesn’t improve.
The main model runs for the mean number of epochs.
N+1 models do not use overwrite_with_best_model, which is an available option in Deep Learning.
N+1 models may be off by the number specified for stopping_rounds from the best model, but the cross-validation metric estimates the performance of the main model for the resulting number of epochs (which may be fewer than the specified number of scoring events).
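The cross-validation behavior can be sketched numerically. This simplified illustration uses a plain "no improvement for k scoring events" rule per fold (rather than the moving-average rule) and assumes one scoring event per epoch; it is not H2O's internal code:

```python
def stop_epoch(scores, stopping_rounds):
    """Epoch at which one CV fold stops: the first scoring event after which
    the metric (lower is better) has not improved for `stopping_rounds`
    consecutive events. Falls back to the full history if it never stalls."""
    best, since_best = float("inf"), 0
    for epoch, s in enumerate(scores, start=1):
        if s < best:
            best, since_best = s, 0
        else:
            since_best += 1
            if since_best >= stopping_rounds:
                return epoch
    return len(scores)

def main_model_epochs(fold_histories, stopping_rounds):
    # Each CV model stops independently when its validation metric stalls;
    # the main model then runs for the mean number of epochs across folds.
    stops = [stop_epoch(h, stopping_rounds) for h in fold_histories]
    return sum(stops) / len(stops)
```

For three folds that stop at epochs 4, 4, and 5, the main model would run for the mean, roughly 4.33 epochs.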
Note that stopping_rounds must be enabled for stopping_metric and stopping_tolerance to take effect.