Get drift metrics

This example demonstrates how to get the drift metrics for a deployed model by using the model monitoring service of the MLOps API. The service provides metrics such as the drift, drift threshold, and importance of each feature of the model, and categorizes each feature drift as low, medium, or high. These metrics can then be used to evaluate the overall model drift. Currently, the model monitoring service supports only single model deployments.
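The low/medium/high categorization can be sketched with a small helper. Note that the band boundaries below (1x and 2x the reported drift threshold) are illustrative assumptions for this sketch; the service reports only the drift_threshold it used, not its exact cut-offs.

```python
def categorize_feature_drift(drift: float, drift_threshold: float = 0.2) -> str:
    """Bucket a feature's drift score relative to the reported threshold.

    The band boundaries (1x and 2x the threshold) are illustrative
    assumptions, not the service's documented cut-offs.
    """
    if drift < drift_threshold:
        return "low"
    if drift < 2 * drift_threshold:
        return "medium"
    return "high"

# The sample output later in this example reports drift 0.325 for the
# feature 'exang' against a drift_threshold of 0.2:
print(categorize_feature_drift(0.325))  # medium
```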

Before you begin

You will need the values of the following constants to carry out this task. Contact your administrator to obtain deployment-specific values.

Value and description:

  * MLOPS_API_URL: Defines the URL for the MLOps Gateway component.
  * TOKEN_ENDPOINT_URL (for example, https://mlops.keycloak.domain/auth/realms/[fill-in-realm-name]/protocol/openid-connect/token): Defines the token endpoint URL of the Identity Provider. This example uses Keycloak as the Identity Provider, so the Keycloak realm must be provided.
  * REFRESH_TOKEN (<your-refresh-token>): Defines the user's refresh token.
  * CLIENT_ID (<your-client-id>): Sets the client ID for authentication. This is the client you use to connect to MLOps.
  * CLIENT_SECRET (<your-client-secret>): Sets the client secret.
  * DEPLOYMENT_ID (<your-deployment-id>): Defines the ID of the deployment that the script uses.

The following steps demonstrate how you can use the MLOps Python client to get the drift metrics for a deployed model.

  1. Download the file.

  2. Change the values of the following constants in your file as given in the preceding table.
    ### Constants
    MLOPS_API_URL = "<your-mlops-api-url>"
    TOKEN_ENDPOINT_URL = "<your-token-endpoint-url>"
    REFRESH_TOKEN = "<your-refresh-token>"
    CLIENT_ID = "<your-client-id>"
    CLIENT_SECRET = "<your-client-secret>"
    DEPLOYMENT_ID = "f9fa4db1-2f30-4b10-ace2-f383a9f74880"
  3. Run the file.

  4. The script prints the drift metrics for the model of the specified deployment.

    Deployment drift metrics: {'count_feature_drift': 13,
    'feature_frequency': [{'categorical': {'description': '',
                                           'name': '',
                                           'point': [],
                                           'x_label': '',
                                           'y_label': ''},
                           'continuous': {'bin': [{'x_max': '13.0',
                                                   'x_min': '0.0',
                                                   'y': '0.1282399547255235'},
                                                  ...]}},
                          ...],
    'feature_importance': {'drift_threshold': 0.2,
                           'feature_drift': [{'drift': [{'value': '',
                                                         'x_axis': '2022-08-17T08:49:53Z',
                                                         'y_axis': '0.03827751196172248'},
                                                        ...]},
                                             ...]},
    'feature_summary': {'feature': [{'data_type': 'double',
                                     'drift': 0.32509654856645576,
                                     'impact': 0.0,
                                     'importance': 0.0,
                                     'missing_values': 0,
                                     'name': 'exang'},
                                    ...]},
    'high_feature_drift': 0,
    'low_feature_drift': 0,
    'medium_feature_drift': 0}
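As a quick sanity check on the response shape, a small helper can flag the features whose drift exceeds the reported drift_threshold. The dictionary below mirrors only the relevant slice of the sample output above; the second feature ('age') is a hypothetical addition for contrast.

```python
# Minimal slice of the sample response shown above.
metrics = {
    "feature_importance": {"drift_threshold": 0.2},
    "feature_summary": {
        "feature": [
            {"name": "exang", "drift": 0.32509654856645576},
            {"name": "age", "drift": 0.05},  # hypothetical second feature
        ]
    },
}

def drifting_features(metrics: dict) -> list:
    """Return names of features whose drift exceeds the drift threshold."""
    threshold = metrics["feature_importance"]["drift_threshold"]
    return [
        f["name"]
        for f in metrics["feature_summary"]["feature"]
        if f["drift"] > threshold
    ]

print(drifting_features(metrics))  # ['exang']
```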

Example walkthrough

This section provides a walkthrough of the file.

  1. Set up the token provider using an existing refresh token and client secret.

  2. Set up the MLOps client.

  3. List all the monitored deployments of the user by calling the list_monitored_deployments endpoint of the model monitoring service.
    deployments: mlops.ApiListMonitoredDeploymentsResponse = (
        # The service path and request type follow the generated client's
        # Api* naming and may differ slightly between client versions.
        mlops_client.model_monitoring.service.list_monitored_deployments(
            mlops.ApiListMonitoredDeploymentsRequest()
        )
    )
  4. Select the specified deployment from the list of all monitored deployments by using the defined DEPLOYMENT_ID.
    selected_deployment = None
    for deployment in deployments.deployment:
        if deployment.id == DEPLOYMENT_ID:
            selected_deployment = deployment
            break
    if selected_deployment is None:
        raise LookupError("Requested deployment not found")
  5. Define the start datetime and end datetime for which to get the drift metrics. In this example, the datetime range covers the last 30 days.
    end_time = datetime.datetime.now()
    start_time = end_time - datetime.timedelta(days=30)
  6. Finally, call the get_model_drift_metrics endpoint of the model monitoring service to get the drift metrics of the model.

    A deployment can have multiple models with the A/B Test and Champion/Challenger deployment types. However, only single model deployments are currently supported by the model monitoring service. Therefore, the model_id parameter is ignored for now, and the API returns the drift metrics for the first available model of the deployment.
    drift: mlops.ApiGetModelDriftMetricsResponse = (
        mlops_client.model_monitoring.service.get_model_drift_metrics(
            # Request fields follow the generated client's Api* naming;
            # the parameter names here are indicative.
            mlops.ApiGetModelDriftMetricsRequest(
                deployment_id=selected_deployment.id,
                start_time=start_time,
                end_time=end_time,
            )
        )
    )
    print(f"Deployment drift metrics: {drift}")
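Behind the token provider in step 1 is a standard OpenID Connect refresh-token grant against TOKEN_ENDPOINT_URL. The MLOps client handles this exchange for you; the sketch below only illustrates, with the standard library, the form body that the token endpoint expects. The realm name in the URL is a placeholder.

```python
import urllib.parse
import urllib.request

# Placeholder endpoint; substitute your Keycloak realm.
TOKEN_ENDPOINT_URL = (
    "https://mlops.keycloak.domain/auth/realms/example-realm"
    "/protocol/openid-connect/token"
)

def build_refresh_request(refresh_token: str, client_id: str,
                          client_secret: str) -> urllib.request.Request:
    """Build the form-encoded OIDC refresh-token grant request."""
    body = urllib.parse.urlencode({
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    return urllib.request.Request(
        TOKEN_ENDPOINT_URL,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

req = build_refresh_request("<your-refresh-token>", "<your-client-id>",
                            "<your-client-secret>")
print(req.get_method())  # POST (urllib infers POST when data is set)
```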