Python client usage overview
Import the legacy client
The following example demonstrates how to import the legacy client.
import h2o_mlops # Allows h2o_mlops_client to be imported
import h2o_mlops_client as mlops
Alternative method
Alternatively, you can continue to access the legacy client's features through h2o-mlops as needed. For more information, see Python client tutorials - Backend.
Connecting to MLOps with the Python client
You will need the values for the following constants in order to connect to MLOps successfully. Contact your administrator to obtain deployment-specific values.
Constant | Value | Description |
---|---|---|
H2O_MLOPS_GATEWAY | Usually: https://api.mlops.my.domain | Defines the URL for the MLOps Gateway component. You can verify the correct URL by navigating to the API URL in your browser. It should provide a page with a list of available routes. |
TOKEN_ENDPOINT_URL | https://mlops.keycloak.domain/auth/realms/[fill-in-realm-name]/protocol/openid-connect/token | Defines the token endpoint URL of the Identity Provider. This example uses Keycloak as the Identity Provider, so the Keycloak realm must be included in the URL. |
REFRESH_TOKEN | <your-refresh-token> | Defines the user's refresh token |
CLIENT_ID | <your-client-id> | Sets the client ID for authentication. This is the client you will use to connect to MLOps. |
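As a quick sanity check before writing any client code, you can also confirm the gateway URL from Python. The following is a minimal sketch that assumes the route listing mentioned above is returned for a plain GET request; it uses the requests library, which is not part of the MLOps client.

import requests

H2O_MLOPS_GATEWAY = "https://api.mlops.my.domain"  # replace with your gateway URL

# A correct gateway URL should respond with a page listing the available routes.
response = requests.get(H2O_MLOPS_GATEWAY)
print(response.status_code)
print(response.text[:500])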
The following steps demonstrate how you can connect to MLOps using the MLOps Python client.
Create a connection.py file and add the content below. Make sure to replace the placeholder values with the constants described above.
- Format

connection.py

import h2o_mlops_client as mlops
from pprint import pprint

### Constants
H2O_MLOPS_GATEWAY = <H2O_MLOPS_GATEWAY>
TOKEN_ENDPOINT_URL = <TOKEN_ENDPOINT_URL>
REFRESH_TOKEN = <REFRESH_TOKEN>
CLIENT_ID = <CLIENT_ID>

- Sample

connection.py

import h2o_mlops_client as mlops
from pprint import pprint

H2O_MLOPS_GATEWAY = "https://api.mlops.my.domain"
TOKEN_ENDPOINT_URL = "https://mlops.keycloak.domain/auth/realms/[fill-in-realm-name]/protocol/openid-connect/token"
REFRESH_TOKEN = "<your-refresh-token>"
CLIENT_ID = "<your-mlops-client>"

Add the following content to the same connection.py file to set up the token provider using the refresh token, set up the MLOps client, and send a WhoAmI request to MLOps, which displays information about the current user.
connection.py

def main():
    # Setting up the token provider using an existing refresh token.
    mlops_token_provider = mlops.TokenProvider(
        refresh_token=REFRESH_TOKEN,
        client_id=CLIENT_ID,
        token_endpoint_url=TOKEN_ENDPOINT_URL
    )

    # Setting up the MLOps client.
    mlops_client = mlops.Client(
        gateway_url=H2O_MLOPS_GATEWAY,
        token_provider=mlops_token_provider
    )

    # Sending a WhoAmI request to MLOps.
    r: mlops.StorageWhoAmIResponse = mlops_client.storage.user.who_am_i(body={})
    pprint(r.user)


if __name__ == "__main__":
    main()

Run the connection.py file.
- Execution
- Sample Response
python3 connection.py
{'created_time': datetime.datetime(2022, 7, 12, 16, 41, 13, 127190, tzinfo=tzutc()),
 'id': '3b1b1294-0172-4229-a6a8-47ee9117fc7f',
 'primary_email': '',
 'username': '<your_username>'}
Making API calls
Each API call has a corresponding function in the client, which can be found using the following pattern:
mlops_client.grpc_server_name.entity_type.method
The following is an example call that lists projects for the current user:
mlops_client.storage.project.list_projects()
The preceding example consists of the following:
- mlops_client: A client object
- storage: The name of the platform gRPC server being contacted
- project: The name of the entity type
- list_projects: The name of the API method that is called
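The WhoAmI request from the connection example follows the same pattern: storage is the gRPC server, user is the entity type, and who_am_i is the method. For example, reusing the client from the connection example:

# Same pattern as above: server (storage) -> entity type (user) -> method (who_am_i).
r = mlops_client.storage.user.who_am_i(body={})
print(r.user.username)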
Available gRPC servers
storage
- The core service responsible for storing the platform's datadeployer
- The service used by the client for getting deployments' statusesingest
- The service responsible for ingesting modelsmodel-monitoring
- The service used by the client to get monitoring details of deployments
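These are the gRPC server names; on the client object they appear as attributes following the pattern above. The sketch below shows the assumed attribute access, including the assumption that the hyphenated model-monitoring server maps to an underscored Python attribute.

# Assumed attribute names on the client object for each gRPC server.
mlops_client.storage           # core storage service
mlops_client.deployer          # deployment status service
mlops_client.ingest            # model ingestion service
mlops_client.model_monitoring  # model monitoring service (assumed attribute name)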
Client API calls and Request/Response objects
Due to the nature of the gRPC framework, gRPC calls expect their corresponding gRPC messages as input.
Because of this, client calls for the MLOps Python client consist of a specific API call and that call's request.
In the example below, Storage's gRPC ListProjects call is made through the mlops_client.storage.project.list_projects() client call, which is passed a mlops.StorageListProjectsRequest object and returns a mlops.StorageListProjectsResponse response.
projects: mlops.StorageListProjectsResponse = (
mlops_client.storage.project.list_projects(mlops.StorageListProjectsRequest())
)
All supported requests and responses are accessible directly through the MLOps client module.
Accessing a requested entity
To get a specific entity requested in an API call, access it through a member with the same name as the entity. For API calls that return a collection of entities, the same name is used to access the collection. Below are several examples of such calls and how to access their corresponding entities:
mlops_client.storage.experiment.get_experiment(
mlops.StorageGetExperimentRequest(...)
).experiment # accessing an experiment entity
mlops_client.storage.project.list_projects(
mlops.StorageListProjectsRequest()
).project # accessing a list of projects
mlops_client.storage.deployment.list_deployments(
mlops.StorageListDeploymentsRequest(...)
).deployment # accessing a list of deployments
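For instance, here is a minimal sketch that combines the call and access patterns above to print the projects visible to the current user. The id and display_name fields are assumptions about the project entity; adjust them to the fields available in your deployment.

# List projects, then read the entities from the member named after the entity type.
projects_response: mlops.StorageListProjectsResponse = (
    mlops_client.storage.project.list_projects(mlops.StorageListProjectsRequest())
)
for project in projects_response.project:
    # id and display_name are assumed field names on the project entity.
    print(project.id, project.display_name)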
Empty body calls
Some API calls don't have their own gRPC messages but instead expect an empty gRPC message. It is important to note that an empty gRPC message is not the same as a None request. A good example is the WhoAmI call shown earlier: the who_am_i() API call doesn't have a corresponding request class and instead uses an empty JSON object as the request body. The empty JSON object is required for the JSON -> gRPC translation process, which would otherwise fail.
r: mlops.StorageWhoAmIResponse = mlops_client.storage.user.who_am_i(body={})
Authentication
MLOps uses the OpenID Connect protocol for authentication. The client includes a helper class which helps with authentication and ensures that an active token is always available to the client.
mlops_token_provider = mlops.TokenProvider(...)
mlops_client = mlops.Client(
...,
token_provider=mlops_token_provider
)
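Filled in with the constants from the connection example above, the helper and the client are set up as follows:

# Token provider built from the refresh token; it keeps an active token available to the client.
mlops_token_provider = mlops.TokenProvider(
    refresh_token=REFRESH_TOKEN,
    client_id=CLIENT_ID,
    token_endpoint_url=TOKEN_ENDPOINT_URL
)

mlops_client = mlops.Client(
    gateway_url=H2O_MLOPS_GATEWAY,
    token_provider=mlops_token_provider
)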
Artifact names mapping
The following table describes the mapping of artifact names.
Storage artifact name | deployable_artifact_type_name | Artifact processor name |
---|---|---|
dai/mojo_pipeline | dai_mojo_pipeline | dai_mojo_pipeline_extractor |
dai/scoring_pipeline | dai_python_scoring_pipeline | artifact-processor_dai_pipelines_193 |
h2o3/mojo | h2o3_mojo | h2o3_mojo_extractor |
python/mlflow | python/mlflow.zip | unzip_processor |
mlflow/mojo_pipeline | mlflow_mojo_pipeline | mlflow_mojo_pipeline_extractor |
mlflow/scoring_pipeline | mlflow_scoring_pipeline | mlflow_scoring_pipeline_extractor |
python/pickle | python/pickle | noop_processor |
Runtime names mapping
The following table describes the mapping of runtime names.
Model type | Model description | Human-readable runtime name | Runtime name |
---|---|---|---|
mlflow | MLflow non-H2O.ai models created with Python 3.6 | MLflow Model Scorer [Python 3.6] | python-scorer_mlflow_36 |
mlflow | MLflow non-H2O.ai models created with Python 3.7 | MLflow Model Scorer [Python 3.7] | python-scorer_mlflow_37 |
mlflow | MLflow non-H2O.ai models created with Python 3.8 | MLflow Model Scorer [Python 3.8] | python-scorer_mlflow_38 |
mlflow | MLflow non-H2O.ai models created with Python 3.9 | MLflow Model Scorer [Python 3.9] | python-scorer_mlflow_39 |
h2o3_mojo | H2O-3 MOJO models | H2O.ai MOJO scorer | h2o3_mojo_runtime |
dai_mojo | DAI MOJO models (Java runtime) | H2O.ai MOJO scorer | dai_mojo_runtime |
dai_mojo | DAI MOJO models (Java runtime) - with Shapley contributions for original features | DAI MOJO Scorer (Shapley original only) | mojo_runtime_shapley_original |
dai_mojo | DAI MOJO models (Java runtime) - with Shapley contributions for transformed features | DAI MOJO Scorer (Shapley transformed only) | mojo_runtime_shapley_transformed |
dai_mojo | DAI MOJO models (Java runtime) - with Shapley contributions for both original and transformed features | DAI MOJO Scorer (Shapley all) | mojo_runtime_shapley_all |
dai_mojo | DAI MOJO models (C++ runtime) - supports all Shapley contribution types and is expected to have significantly lower memory usage | DAI MOJO Scorer (C++ Runtime) | dai-mojo-cpp_experimental |
dai_python_scoring_pipeline | DAI Python Scoring Pipeline models created by DAI 1.9.3 | Python Pipeline Scorer [DAI 1.9.3] | python-scorer_dai_pipelines_193 |
dai_python_scoring_pipeline | DAI Python Scoring Pipeline models created by DAI 1.10.0 | Python Pipeline Scorer [DAI 1.10.0] | python-scorer_dai_pipelines_110 |
dai_python_scoring_pipeline | DAI Python Scoring Pipeline models created by DAI 1.10.1 | Python Pipeline Scorer [DAI 1.10.1] | python-scorer_dai_pipelines_1101 |
dai_python_scoring_pipeline | DAI Python Scoring Pipeline models created by DAI 1.10.2 | Python Pipeline Scorer [DAI 1.10.2] | python-scorer_dai_pipelines_1102 |
dai_python_scoring_pipeline | DAI Python Scoring Pipeline models created by DAI 1.10.3 | Python Pipeline Scorer [DAI 1.10.3] | python-scorer_dai_pipelines_1103 |
dai_python_scoring_pipeline | DAI Python Scoring Pipeline models created by DAI 1.10.3.1 | Python Pipeline Scorer [DAI 1.10.3.1] | python-scorer_dai_pipelines_11031 |
dai_python_scoring_pipeline | DAI Python Scoring Pipeline models created by DAI 1.10.3.2 | Python Pipeline Scorer [DAI 1.10.3.2] | python-scorer_dai_pipelines_11032 |
dai_python_scoring_pipeline | DAI Python Scoring Pipeline models created by DAI 1.10.4 | Python Pipeline Scorer [DAI 1.10.4] | python-scorer_dai_pipelines_1104 |
dai_python_scoring_pipeline | DAI Python Scoring Pipeline models created by DAI 1.10.4.1 | Python Pipeline Scorer [DAI 1.10.4.1] | python-scorer_dai_pipelines_11041 |
dai_python_scoring_pipeline | DAI Python Scoring Pipeline models created by DAI 1.10.4.2 | Python Pipeline Scorer [DAI 1.10.4.2] | python-scorer_dai_pipelines_11042 |
dai_python_scoring_pipeline | DAI Python Scoring Pipeline models created by DAI 1.10.4.3 | Python Pipeline Scorer [DAI 1.10.4.3] | python-scorer_dai_pipelines_11043 |