Get drift metrics
This example demonstrates how you can get the drift metrics for a deployed model by using the model monitoring service of the MLOps API. The service returns metrics such as the drift and importance of each feature of the model and the drift threshold, and categorizes each feature's drift as low, medium, or high. You can then use these metrics to evaluate model drift. Currently, only single model deployments are supported by the model monitoring service.
You need the values of the following constants to carry out this task. Contact your administrator to obtain the deployment-specific values.
Constant | Value | Description |
---|---|---|
MLOPS_API_URL | Usually: https://api.mlops.my.domain | Defines the URL for the MLOps Gateway component. |
TOKEN_ENDPOINT_URL | https://mlops.keycloak.domain/auth/realms/[fill-in-realm-name]/protocol/openid-connect/token | Defines the token endpoint URL of the Identity Provider. This example uses Keycloak as the Identity Provider, so the Keycloak realm must be provided in the URL. |
REFRESH_TOKEN | <your-refresh-token> | Defines the user's refresh token. |
CLIENT_ID | <your-client-id> | Sets the client ID for authentication. This is the client you will be using to connect to MLOps. |
CLIENT_SECRET | <your-client-secret> | Sets the client secret. |
DEPLOYMENT_ID | <your-deployment-id> | Defines the deployment ID that the script uses. |
The following steps demonstrate how you can use the MLOps Python client to get the drift metrics for a deployed model.
Change the values of the following constants in your GetDriftMetrics.py file as given in the preceding data table.

GetDriftMetrics.py
### Constants
MLOPS_API_URL = <MLOPS_API_URL>
TOKEN_ENDPOINT_URL = <TOKEN_ENDPOINT_URL>
REFRESH_TOKEN = <REFRESH_TOKEN>
CLIENT_ID = <CLIENT_ID>
CLIENT_SECRET = <CLIENT_SECRET>
DEPLOYMENT_ID = <DEPLOYMENT_ID>

For example:

GetDriftMetrics.py
### Constants
MLOPS_API_URL = "https://api.mlops.my.domain"
TOKEN_ENDPOINT_URL="https://mlops.keycloak.domain/auth/realms/[fill-in-realm-name]/protocol/openid-connect/token"
REFRESH_TOKEN="<your-refresh-token>"
CLIENT_ID="<your-mlops-client>"
CLIENT_SECRET = "<your-client-secret>"
DEPLOYMENT_ID = "f9fa4db1-2f30-4b10-ace2-f383a9f74880"Run the
GetDriftMetrics.py
file.python3 GetDriftMetrics.py
This provides the drift metrics for the model of the specified deployment.
Deployment drift metrics: {'count_feature_drift': 13,
'feature_frequency': [{'categorical': {'description': '',
'name': '',
'point': [],
'x_label': '',
'y_label': ''},
'continuous': {'bin': [{'x_max': '13.0',
'x_min': '0.0',
'y': '0.1282399547255235'},
....
'feature_importance': {'drift_threshold': 0.2,
'feature_drift': [{'drift': [{'value': '',
'x_axis': '2022-08-17T08:49:53Z',
'y_axis': '0.03827751196172248'},
....
'feature_summary': {'feature': [{'data_type': 'double',
'drift': 0.32509654856645576,
'impact': 0.0,
'importance': 0.0,
'missing_values': 0,
'name': 'exang'},
....
'high_feature_drift': 0,
'low_feature_drift': 0,
'medium_feature_drift': 0}
Example walkthrough
This section provides a walkthrough of the GetDriftMetrics.py file.
Set up the token provider using an existing refresh token and client secret.
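The client and token provider setup is not shown in the snippets above. The following is a minimal sketch of how it might look, using the constants defined earlier. The _token_provider helper is a hypothetical function that performs a standard OIDC refresh-token grant against the Keycloak token endpoint, and the mlops.Client(gateway_url=..., token_provider=...) call is an assumption based on other MLOps Python client examples; adapt both to the setup used in your environment.

import datetime

import pytz
import requests
import h2o_mlops_client as mlops


def _token_provider() -> str:
    # Hypothetical helper: exchange the refresh token for an access token using
    # the standard OIDC refresh_token grant against the Keycloak token endpoint.
    response = requests.post(
        TOKEN_ENDPOINT_URL,
        data={
            "grant_type": "refresh_token",
            "refresh_token": REFRESH_TOKEN,
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        },
    )
    response.raise_for_status()
    return response.json()["access_token"]


# Assumed constructor signature: a gateway URL plus a callable that returns a
# valid access token.
mlops_client = mlops.Client(
    gateway_url=MLOPS_API_URL,
    token_provider=_token_provider,
)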
List all the monitored deployments of the user by calling the list_monitored_deployments endpoint of the model monitoring service.

GetDriftMetrics.py
deployments: mlops.ApiListMonitoredDeploymentsResponse = (
    mlops_client.model_monitoring.monitoring_service.list_monitored_deployments()
)

Select the specified deployment from the list of all monitored deployments by using the defined DEPLOYMENT_ID.

GetDriftMetrics.py
for deployment in deployments.deployment:
    if deployment.id == DEPLOYMENT_ID:
        selected_deployment = deployment
        break
else:
    raise LookupError("Requested deployment not found")

Define the start and end datetime for which to retrieve the drift metrics. In this example, the range covers the last 30 days.
GetDriftMetrics.py
end_time = datetime.datetime.now(pytz.utc)
start_time = end_time - datetime.timedelta(days=30)

Finally, call the get_model_drift_metrics endpoint of the model monitoring service to get the drift metrics of the model.

Note: A deployment can have multiple models with the A/B Test and Champion/Challenger deployment types. However, only single model deployments are currently supported by the model monitoring service. Therefore, the model_id parameter is ignored for now, and the API returns the drift metrics for the first available model of the deployment.

GetDriftMetrics.py
drift: mlops.ApiGetModelDriftMetricsResponse = (
    mlops_client.model_monitoring.monitoring_service.get_model_drift_metrics(
        deployment_id=selected_deployment.id,
        model_id='xxxx',
        start_date_time=start_time,
        end_date_time=end_time,
    )
)
print(f"Deployment drift metrics: {drift}")