
Compare model (experiment) summaries

You can compare model (experiment) summaries to understand the similarities and differences between models.

Instructions

To compare model summaries, follow these instructions:

  1. In the H2O Model Validation navigation menu, click Experiments.
  2. In the experiments table, select at least two model summaries to compare.
    note

    You cannot compare a model (experiment) summary if its state is not done or if you have not generated a dataset summary for its dataset. To learn how to create a model (experiment) summary, see Create a model (experiment) summary. (A sketch of this eligibility rule in code follows these steps.)

  3. Click Compare.
    note

    A comparison table and a feature importance chart appear when comparing the selected model summaries. To learn more, see Comparison table and Chart: Feature importance.

Comparison table

  • Test Name: The name of the experiment.
  • Scorer: The scorer used in the experiment.
  • Validation Score: The experiment's validation score.
  • Test Score: The experiment's test score.
  • Accuracy: The experiment's accuracy value.
  • Time: The experiment's time value.
  • Interpretability: The experiment's interpretability value.
  • Task: The experiment's problem type (for example, regression).
  • Target: The experiment's target column (target feature).
  • Dropped Columns: Columns that Driverless AI dropped during the experiment so that they are not used as predictors.
  • Train Data Name: The name of the experiment's train dataset.
  • Train Data Shape: The number of rows and columns in the experiment's train dataset, as (rows, columns).
  • Test Data Name: The name of the experiment's test dataset.
  • Test Data Shape: The number of rows and columns in the experiment's test dataset, as (rows, columns).

Chart: Feature importance

The feature importance chart displays all the features of the compared model (experiment) summaries.

  • X-axis: Feature name
  • Y-axis: Gain value (the importance of the feature in the model)

[Image: Feature importance chart of the compared model summaries]

