Prediction interval support
To enable prediction intervals in H2O MLOps, set the `requestPredictionIntervals` parameter to `true`. If the model does not support prediction intervals, or none are returned for any other reason, H2O MLOps either leaves the field empty or returns an error response, so you always know when prediction intervals are unavailable.
Once enabled, prediction intervals are returned as an array containing a lower and an upper bound for each prediction. Prediction interval support is currently available only for regression models, in all MOJO runtimes and in the Driverless AI scoring pipeline runtime.
Step 1: Check whether the deployment supports `requestPredictionIntervals` with a curl request
Use the `/capabilities` endpoint to confirm that the model supports prediction intervals.
```shell
curl -X GET https://<DEPLOYMENT_URL>/model/capabilities
```

Sample response:

```json
["SCORE_PREDICTION_INTERVAL","SCORE","CONTRIBUTION_ORIGINAL","CONTRIBUTION_TRANSFORMED"]
```
The `SCORE_PREDICTION_INTERVAL` capability indicates that prediction intervals are supported.
To try this using the H2O MLOps Python client, see View scorer capabilities.
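The same check can also be sketched in plain Python. This is an illustrative sketch, not part of the MLOps client: it assumes the `/model/capabilities` endpoint shown above, and the `get_capabilities` helper and deployment URL handling are hypothetical.

```python
import json
from urllib.request import urlopen

def get_capabilities(deployment_url: str) -> list:
    """Fetch the scorer's capability list from the /model/capabilities endpoint.

    Illustrative helper: replace deployment_url with your actual deployment URL.
    """
    with urlopen(f"{deployment_url}/model/capabilities", timeout=10) as resp:
        return json.load(resp)

def supports_prediction_intervals(capabilities: list) -> bool:
    """True if the scorer advertises the SCORE_PREDICTION_INTERVAL capability."""
    return "SCORE_PREDICTION_INTERVAL" in capabilities

# With the sample capability list shown above:
sample = ["SCORE_PREDICTION_INTERVAL", "SCORE",
          "CONTRIBUTION_ORIGINAL", "CONTRIBUTION_TRANSFORMED"]
print(supports_prediction_intervals(sample))  # True
```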
Step 2: Make a prediction with `requestPredictionIntervals` enabled
To request prediction intervals using the H2O MLOps Python client, see Prediction intervals.
Sample output:

```python
{'fields': ['score'],
 'id': '1b0488b6-ee91-11ed-a05d-4ab989c17db4',
 'predictionIntervals': {'fields': ['score.lower', 'score.upper'],
                         'rows': [['55524.044704861124', '371673.57907986105']]},
 'score': [['200005.90798611112']]}
```
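A response of this shape can be post-processed to pair each score with its interval bounds. The `extract_intervals` helper below is an illustrative sketch, not part of the MLOps client; it also handles the case where the deployment leaves `predictionIntervals` empty, as described above.

```python
def extract_intervals(response: dict):
    """Pair each prediction with its (lower, upper) interval bounds.

    Returns a list of (score, lower, upper) tuples, or None if the
    deployment did not return prediction intervals.
    """
    intervals = response.get("predictionIntervals")
    if not intervals or not intervals.get("rows"):
        return None
    scores = [float(row[0]) for row in response["score"]]
    bounds = [(float(lo), float(hi)) for lo, hi in intervals["rows"]]
    return [(s, lo, hi) for s, (lo, hi) in zip(scores, bounds)]

# The sample output shown above:
response = {
    "fields": ["score"],
    "id": "1b0488b6-ee91-11ed-a05d-4ab989c17db4",
    "predictionIntervals": {
        "fields": ["score.lower", "score.upper"],
        "rows": [["55524.044704861124", "371673.57907986105"]],
    },
    "score": [["200005.90798611112"]],
}

for score, lower, upper in extract_intervals(response):
    print(f"prediction={score:.2f}, interval=[{lower:.2f}, {upper:.2f}]")
```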