Compare validation tests
Overview
H2O Model Validation enables you to compare validation tests of the same type to discover insights across test runs.
Instructions
To compare validation tests, follow these instructions:
- In the H2O Model Validation navigation menu, click Tests.
- Click the Select tests toggle.
- In the tests table, select at least two validation tests of the same type.
  Note: You can select multiple validation tests of different types, but you must select at least two of the same type. H2O Model Validation only compares validation tests of the same type (for example, backtesting); validation tests of different types are not compared against each other. H2O Model Validation organizes comparisons into tabs, each containing the validation tests of one type.
- Click Compare.
  Note: When comparing validation tests, H2O Model Validation displays specific metrics based on the type of the compared validation tests. To learn more, see Comparison metrics: Validation tests.
Comparison metrics: Validation tests
H2O Model Validation offers certain comparison metrics based on the compared validation tests.
Adversarial similarity
H2O Model Validation offers the following metrics to understand the adversarial similarity tests you compare:
Graph: AUC score
The AUC score graph displays the area under the receiver operating characteristic curve (AUC) score of each compared adversarial similarity test.
- X-axis: Test name of each adversarial similarity test
- Y-axis: AUC scores (given to each test)
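The sketch below illustrates, outside of H2O Model Validation, how such an AUC score is typically produced: an adversarial classifier is trained to distinguish the rows of two datasets, and its ROC AUC indicates how separable, and therefore how dissimilar, the datasets are. The function name and choice of classifier are illustrative assumptions, not the product's implementation.

```python
# Minimal sketch (not the H2O Model Validation implementation): label rows by
# dataset of origin, train a classifier to tell them apart, and report its ROC
# AUC. An AUC near 0.5 suggests the datasets are hard to distinguish (similar);
# an AUC near 1.0 suggests they differ noticeably.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def adversarial_auc(df_a: pd.DataFrame, df_b: pd.DataFrame) -> float:
    # Label rows 0/1 by which (numeric) dataset they came from.
    X = pd.concat([df_a, df_b], ignore_index=True)
    y = np.r_[np.zeros(len(df_a)), np.ones(len(df_b))]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0
    )
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    return roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
```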
Bar graph: Feature importance
The Feature importance bar graph displays the gain of the features in the adversarial similarity tests. Gain refers to the relative contribution of a feature to the model's predictions; a feature with a high gain value has a greater impact on how predictions are generated.
- X-axis: Feature name
- Y-axis: Gain
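As a hedged illustration of what gain measures, the sketch below fits a gradient-boosted tree classifier (XGBoost, chosen only for illustration) and reads its per-feature gain, the kind of values plotted on the Y-axis of this bar graph.

```python
# Minimal sketch, assuming a gradient-boosted tree model: "gain" is the average
# improvement in the training objective contributed by splits on each feature,
# so high-gain features drive the predictions most.
import xgboost as xgb

def gain_importance(X_train, y_train) -> dict:
    clf = xgb.XGBClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    # Map of feature name -> gain, suitable for plotting as a bar graph.
    # (Feature names come from X_train's columns when it is a DataFrame.)
    return clf.get_booster().get_score(importance_type="gain")
```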
Backtesting
H2O Model Validation offers the following metrics to understand the backtesting validation tests you compare:
Graph: Test results
The Test results graph displays the Back-test values for each split date of the backtesting tests, where Back-test refers to the target distribution values of the backtesting test dataset.
- X-axis: Split dates
- Y-axis: Back-test scores
Graph: Validation results
The Validation results graph displays the Cross-validation values for each split date of the backtesting models. This graph can help you estimate how well a model fits data that was not used to train it.
- X-axis: Split dates
- Y-axis: Cross-validation scores
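The sketch below shows the general rolling-origin backtesting idea these two graphs summarize. It is an assumption about the technique in general, not H2O Model Validation's exact procedure; the model class, metric, and helper name are illustrative.

```python
# Minimal sketch of rolling-origin backtesting: for each split, fit on the data
# up to the split date and score on the held-out window that follows, and also
# record a cross-validation score computed on the training window only.
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

def backtest_scores(X, y, n_splits=5):
    # X, y are assumed to be time-ordered NumPy arrays.
    results = []
    for train_idx, test_idx in TimeSeriesSplit(n_splits=n_splits).split(X):
        model = Ridge().fit(X[train_idx], y[train_idx])
        back_test = mean_squared_error(y[test_idx], model.predict(X[test_idx])) ** 0.5
        cv = -cross_val_score(
            Ridge(), X[train_idx], y[train_idx],
            scoring="neg_root_mean_squared_error", cv=3
        ).mean()
        results.append({"split": len(results), "back_test_rmse": back_test, "cv_rmse": cv})
    return results
```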
Drift detection
H2O Model Validation offers the following metric to understand the drift detection tests you compare:
Bar graph: PSI scores
The PSI scores bar graph displays the population stability index (PSI) for the features in the drift detection tests.
- X-axis: Features
- Y-axis: PSI scores
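A minimal sketch of how PSI is commonly computed between a reference feature distribution and a newer one follows; the thresholds mentioned in the comments are conventional rules of thumb, not values defined by H2O Model Validation.

```python
# Minimal sketch of the population stability index (PSI) for one feature.
# A common rule of thumb reads PSI < 0.1 as stable and PSI > 0.25 as
# significant drift, though teams choose their own thresholds.
import numpy as np

def psi(reference: np.ndarray, current: np.ndarray, bins: int = 10) -> float:
    # Bin edges come from the reference data; current values outside that
    # range are ignored in this simplified sketch.
    edges = np.histogram_bin_edges(reference, bins=bins)
    ref_pct = np.histogram(reference, bins=edges)[0] / len(reference)
    cur_pct = np.histogram(current, bins=edges)[0] / len(current)
    # Clip to avoid division by zero / log(0) for empty bins.
    ref_pct = np.clip(ref_pct, 1e-6, None)
    cur_pct = np.clip(cur_pct, 1e-6, None)
    return float(np.sum((cur_pct - ref_pct) * np.log(cur_pct / ref_pct)))
```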
Size dependency
H2O Model Validation offers the following metrics to understand the size dependency tests you compare:
Graph: Test results
The Test results graph displays the test {metric} values for the size dependency tests obtained with different test dataset sizes. {Metric}, in this case, refers to the scorer of the model of a validation test (for example, root mean square error (RMSE)).
- X-axis: Train data sizes
- Y-axis: Test scores
Graph: Validation results
The Validation results graph displays the validation {metric} values for the size dependency tests obtained with different validation dataset sizes. {Metric}, in this case, refers to the scorer of the model of a validation test (for example, root mean square error (RMSE)).
- X-axis: Train data sizes
- Y-axis: Validation scores
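As a rough analogue of what these two graphs describe, the sketch below uses scikit-learn's learning_curve to refit a model on increasing training-set sizes and record the resulting scores on the training and validation folds. It illustrates the idea only and is not H2O Model Validation's implementation; the model class and metric are assumptions.

```python
# Minimal sketch of a size-dependency check: refit the same model on growing
# fractions of the data and track RMSE per size, ready to plot against size.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import learning_curve

def size_dependency(X, y):
    sizes, train_scores, valid_scores = learning_curve(
        RandomForestRegressor(random_state=0), X, y,
        train_sizes=np.linspace(0.1, 1.0, 5),
        scoring="neg_root_mean_squared_error", cv=5,
    )
    # Negate the scores to turn them back into RMSE values.
    return sizes, -train_scores.mean(axis=1), -valid_scores.mean(axis=1)
```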
Calibration score
H2O Model Validation offers the following metric to understand the calibration score tests you compare:
Chart: Calibration scores
The calibration scores chart displays each target class's calibration score (Brier score) in the compared tests.
- X-axis: Target classes
- Y-axis: Calibration scores
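The sketch below shows how per-class Brier scores can be computed in a one-vs-rest fashion. It illustrates the metric itself, not H2O Model Validation's exact calculation, and the assumption that the probability matrix columns follow the given class order is noted in the code.

```python
# Minimal sketch of per-class calibration scores using the Brier score
# (lower values indicate better-calibrated probabilities).
import numpy as np
from sklearn.metrics import brier_score_loss

def per_class_brier(y_true: np.ndarray, proba: np.ndarray, classes: list) -> dict:
    # Assumes proba columns are ordered to match `classes`.
    scores = {}
    for i, cls in enumerate(classes):
        # Treat each target class as a binary problem: class vs. rest.
        scores[cls] = brier_score_loss((y_true == cls).astype(int), proba[:, i])
    return scores
```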
Segment performance
H2O Model Validation offers the following metric to understand the segment performance tests you compare:
Table: Segment performances
The table displays the following information for each compared segment performance test:

| Name | Description |
|---|---|
| Segment performance name | Name of the segment performance test. |
| Drop columns | Columns H2O Model Validation dropped during the segment performance test. |
| Model | The model H2O Model Validation used to run the segment performance test. |
| Number of bins | The number of bins H2O Model Validation used to split the primary dataset into segments (by bins of values of every variable and every pair of variables) to assess how accurately the model predicts across different data segments. |
| Primary dataset | Name of the dataset H2O Model Validation used during the segment performance test. |
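As a simplified picture of what a segment performance test measures, the sketch below bins one variable of the primary dataset into a fixed number of bins and scores the model inside each bin. The helper name, the use of accuracy, and the single-variable binning are illustrative assumptions; the actual test also considers pairs of variables.

```python
# Minimal sketch: split a dataset into segments by binning one feature and
# compute the model's accuracy within each segment.
import pandas as pd
from sklearn.metrics import accuracy_score

def segment_performance(model, df: pd.DataFrame, target: str,
                        feature: str, n_bins: int = 10) -> pd.Series:
    # Quantile-based bins of the chosen feature define the segments.
    segments = pd.qcut(df[feature], q=n_bins, duplicates="drop")
    preds = pd.Series(model.predict(df.drop(columns=[target])), index=df.index)
    # Accuracy of the model within each segment.
    return df.groupby(segments).apply(
        lambda seg: accuracy_score(seg[target], preds.loc[seg.index])
    )
```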