Deployments

Interact with deployments in the Driverless AI server.

deploy_to_triton_in_local

deploy_to_triton_in_local(
    experiment: Experiment,
    deploy_predictions: bool = True,
    deploy_shapley: bool = False,
    deploy_original_shapley: bool = False,
    enable_high_concurrency: bool = False,
) -> TritonDeployment

Deploys the model created from an experiment to the local Triton server in the Driverless AI server.

Parameters:

  • experiment (Experiment) –

    The experiment whose model will be deployed.

  • deploy_predictions (bool, default: True ) –

    Whether to deploy model predictions.

  • deploy_shapley (bool, default: False ) –

    Whether to deploy Shapley values for transformed features.

  • deploy_original_shapley (bool, default: False ) –

    Whether to deploy Shapley values for original features.

  • enable_high_concurrency (bool, default: False ) –

    Whether to enable handling several requests at once.

Returns:

  • TritonDeployment

    The created Triton deployment.

Beta API

A beta API that is subject to future changes.
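
As an illustration, a deployment call might look like the sketch below. The `client.experiments.get` lookup and the `client.deployments` accessor are assumptions about how a connected `driverlessai.Client` exposes this API; only `deploy_to_triton_in_local` and its parameters come from the reference above.

```python
def deploy_locally(client, experiment_key):
    """Deploy a finished experiment's model to the built-in Triton server.

    Assumes `client` is a connected driverlessai.Client and
    `experiment_key` identifies a completed experiment.
    """
    experiment = client.experiments.get(experiment_key)  # assumed lookup helper
    deployment = client.deployments.deploy_to_triton_in_local(
        experiment,
        deploy_predictions=True,   # the defaults shown in the signature above
        deploy_shapley=False,
        deploy_original_shapley=False,
        enable_high_concurrency=False,
    )
    return deployment
```

The returned TritonDeployment carries identifying fields such as key, name, and state.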

deploy_to_triton_in_remote

deploy_to_triton_in_remote(
    experiment: Experiment,
    deploy_predictions: bool = True,
    deploy_shapley: bool = False,
    deploy_original_shapley: bool = False,
    enable_high_concurrency: bool = False,
) -> TritonDeployment

Deploys the model created from an experiment to a remote Triton server configured in the Driverless AI server.

Parameters:

  • experiment (Experiment) –

    The experiment whose model will be deployed.

  • deploy_predictions (bool, default: True ) –

    Whether to deploy model predictions.

  • deploy_shapley (bool, default: False ) –

    Whether to deploy Shapley values for transformed features.

  • deploy_original_shapley (bool, default: False ) –

    Whether to deploy Shapley values for original features.

  • enable_high_concurrency (bool, default: False ) –

    Whether to enable handling several requests at once.

Returns:

  • TritonDeployment

    The created Triton deployment.

Beta API

A beta API that is subject to future changes.

get_from_triton_in_local

get_from_triton_in_local(key: str) -> TritonDeployment

Retrieves a Triton deployment from the local Triton server in the Driverless AI server.

Parameters:

  • key (str) –

    The unique ID of the Triton deployment.

Returns:

  • TritonDeployment

    The retrieved Triton deployment.
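
A minimal sketch of looking a deployment back up by key. The `client.deployments` accessor is an assumption about how a connected `driverlessai.Client` exposes this API; `get_from_triton_in_local` and its `key` parameter come from the reference above.

```python
def find_local_deployment(client, deployment_key):
    """Fetch a TritonDeployment from the built-in Triton server by its key.

    Assumes `client` is a connected driverlessai.Client.
    """
    deployment = client.deployments.get_from_triton_in_local(deployment_key)
    # The TritonDeployment exposes identifying fields such as name and state.
    return deployment
```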

get_from_triton_in_remote

get_from_triton_in_remote(key: str) -> TritonDeployment

Retrieves a Triton deployment from a remote Triton server configured in the Driverless AI server.

Parameters:

  • key (str) –

    The unique ID of the Triton deployment.

Returns:

  • TritonDeployment

    The retrieved Triton deployment.

gui

gui() -> Hyperlink

Returns the full URL to the Deployments page in the Driverless AI server.

Returns:

  • Hyperlink

    The full URL to the Deployments page.

list_triton_deployments

list_triton_deployments(
    start_index: int = 0, count: int = None
) -> Sequence[TritonDeployment]

Retrieves Triton deployments in the Driverless AI server.

Parameters:

  • start_index (int, default: 0 ) –

    The index of the first Triton deployment to retrieve.

  • count (int, default: None ) –

    The maximum number of Triton deployments to retrieve. If None, retrieves all available Triton deployments.

Returns:

  • Sequence[TritonDeployment]

    The retrieved Triton deployments.

Beta API

A beta API that is subject to future changes.
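
Since `start_index` and `count` support paging, deployments can be fetched in batches rather than all at once. A sketch built on those two parameters; the `client.deployments` accessor and the paging loop itself are illustrative assumptions.

```python
def list_all_deployments(client, page_size=20):
    """Page through Triton deployments in batches of `page_size`.

    Assumes `client` is a connected driverlessai.Client.
    """
    deployments = []
    start = 0
    while True:
        page = client.deployments.list_triton_deployments(
            start_index=start, count=page_size
        )
        deployments.extend(page)
        if len(page) < page_size:
            break  # a short page means there is nothing left to fetch
        start += page_size
    return deployments
```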

TritonDeployment

A deployment in an NVIDIA Triton inference server in the Driverless AI server.

is_local_deployment property

is_local_deployment: bool

Whether the Triton deployment is in the built-in (local) Triton server in the Driverless AI server or in a remote Triton server.

Returns:

  • bool

    True if the deployment is in the local Triton server, False otherwise.

key property

key: str

Universally unique key of the entity.

Returns:

  • str

    The unique key.

name property

name: str

Name of the entity.

Returns:

  • str

    The name.

state property

state: str

Current state of the Triton deployment.

Returns:

  • str

    The current state.

triton_model property

triton_model: TritonModel

Triton model created by the Triton deployment.

Beta API

A beta API that is subject to future changes.

Returns:

  • TritonModel

    The Triton model.

triton_server_hostname property

triton_server_hostname: str

Hostname of the Triton server that hosts the Triton deployment.

Returns:

  • str

    The hostname.

delete

delete() -> None

Permanently deletes the Triton deployment from the Driverless AI server.

Beta API

A beta API that is subject to future changes.

load

load() -> None

Loads the Triton deployment into the Triton server.

Beta API

A beta API that is subject to future changes.

unload

unload() -> None

Unloads the Triton deployment from the Triton server.

Beta API

A beta API that is subject to future changes.
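
Taken together, unload, load, and delete suggest a simple lifecycle. A sketch, assuming `deployment` is a TritonDeployment obtained from one of the deploy or get calls above:

```python
def recycle_deployment(deployment):
    """Cycle a TritonDeployment out of and back into the server, then remove it."""
    deployment.unload()  # take the model out of the Triton server
    deployment.load()    # load it back in
    deployment.delete()  # permanently remove the deployment (irreversible)
```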

TritonModel dataclass

A Triton model created by a Triton deployment.

inputs instance-attribute

inputs: list[str]

Inputs of the Triton model.

Returns:

  • list[str]

    The inputs.

name instance-attribute

name: str

Name of the Triton model.

Returns:

  • str

    The name.

outputs instance-attribute

outputs: list[str]

Outputs of the Triton model.

Returns:

  • list[str]

    The outputs.

platform instance-attribute

platform: str

Supported platform of the Triton model.

Returns:

  • str

    The platform.

versions instance-attribute

versions: list[str]

Versions of the Triton model.

Returns:

  • list[str]

    The versions.
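
The TritonModel fields above can be gathered into a plain summary for logging or comparison. A sketch, assuming `deployment` is a TritonDeployment; the field names (name, platform, inputs, outputs, versions) come from this reference.

```python
def describe_model(deployment):
    """Summarize the TritonModel behind a deployment as a plain dict."""
    model = deployment.triton_model
    return {
        "name": model.name,
        "platform": model.platform,
        "inputs": list(model.inputs),
        "outputs": list(model.outputs),
        "versions": list(model.versions),
    }
```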