Release notes
Version 0.65.1 (May 25, 2024)
This release is a minor release on top of v0.65.0, with the storage and telemetry features rebuilt against the latest zlib.
Version 0.65.0 (May 08, 2024)
This release includes new features, improvements, bug fixes, and security improvements.
New features
- Added the capability to disable model monitoring features when deploying H2O MLOps. Set the monitoring_enabled installation parameter to false to disable the following:
  - monitoring service
  - monitoring task
  - drift worker
  - drift trigger
  - InfluxDB
  - RabbitMQ
  Note that setting the monitoring_enabled parameter to false also disables health checks for the monitoring backend.
- Support for FedRAMP compliance.
- Added an option to restrict model imports to specific types.
- Added support for vLLM config model types.
- Added a multi-issuer token security option for deployment creation.
- The ListProjects API now returns all projects for admin users.
- You can now set a dedicated timeout for the external registry import API.
- You can now upload LLM experiments using the MLOps Wave app.
- Added support for the Authz user format.
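The monitoring_enabled parameter mentioned above is set at installation time. A minimal sketch of how it might appear in an installation values file (the key name comes from this release note; the surrounding file structure is an assumption and may differ between chart versions):

```yaml
# Hypothetical values-file fragment: monitoring_enabled is the documented
# parameter; where it sits in the file is illustrative only.
monitoring_enabled: false
```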
Improvements
- By default, the model monitoring page is now sorted by deployment name.
- You can now configure the supported experiment types for the upload experiment flow.
Bug fixes
- Fixed incorrect sorting in the listMonitoredDeployments API response.
Version 0.64.0 (April 08, 2024)
New features
- MLflow Dynamic Runtime: Added support for Python 3.10.
- Added support for DAI 1.10.7 and 1.10.6.2 runtimes.
- Upgraded the REST scorer to Spring Boot 3 (1.2.0).
- Added vLLM runtime support.
- When creating a new deployment, you can now disable monitoring for the deployment. For more information, see Endpoint security.
- Added validation for experiment file uploads.
- Extended the scoring API with a new endpoint, /model/media-score, that supports uploading multiple media files.
- The H2O Hydrogen Torch runtime is now supported, with the ability to score image and audio files against the new /model/media-score endpoint.
- The project page now includes an Events tab with pagination, search, and sorting. For more information, see Project page tabs.
- You can now delete experiments.
- Added pagination, search, sorting, and filtering by tag on the Experiments page.
- The Create Deployment workflow now automatically populates Kubernetes limits and requests with suggested default settings.
- The deployment state is now updated dynamically on the Deployments page.
- Additional details about errored deployment states are now displayed in the MLOps UI.
- You can now update and delete tags. Note that tags can only be deleted if they are not associated with any entity.
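The new /model/media-score endpoint accepts multiple media files in a single request. A rough sketch of assembling such a multipart upload in Python (the endpoint path is from the notes above; the form field name "files" and the exact payload the server expects are assumptions):

```python
import io
import uuid

def build_media_score_body(files):
    """Build a multipart/form-data body from a mapping of filename -> bytes.

    Hypothetical helper for posting several media files to a deployment's
    /model/media-score endpoint; field naming is illustrative only.
    """
    boundary = uuid.uuid4().hex
    buf = io.BytesIO()
    for filename, data in files.items():
        buf.write(f"--{boundary}\r\n".encode())
        buf.write(
            f'Content-Disposition: form-data; name="files"; '
            f'filename="{filename}"\r\n\r\n'.encode()
        )
        buf.write(data)
        buf.write(b"\r\n")
    buf.write(f"--{boundary}--\r\n".encode())
    return boundary, buf.getvalue()

boundary, body = build_media_score_body(
    {"cat.png": b"<png bytes>", "bark.wav": b"<wav bytes>"}
)
# The request would then be sent to the deployment's /model/media-score URL
# with header: Content-Type: multipart/form-data; boundary=<boundary>
```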
Improvements
- You can now edit the GPU request/limit fields.
- When creating a deployment, improved automatic population of the Kubernetes resource requests and limits fields in the UI based on the selected runtime and artifact type.
- H2O Driverless AI versions are now automatically identified when DAI models are uploaded through the Wave app or Python client.
- The deployment overview now displays additional details about errored deployment states.
Version 0.62.5
In addition to the changes included in the 0.64.0 release, this release includes the following changes:
Improvements
- The Deployer API now lets you create and update deployment settings related to what monitoring data you want to save. For example:

      deployment.monitor_disable = True
      deployment.store_scoring_transaction_disable = True
      deployment = mlops_client.deployer.deployment.update_model_deployment(
          mlops.DeployUpdateModelDeploymentRequest(deployment=deployment)
      )
Changes
- Model Monitoring is now disabled by default for new deployments.
Known issues
- Monitoring settings can only be modified using the Python client, regardless of whether they were initially set via the UI or the Python client.
- H2O MLOps version 0.62.5 cannot be upgraded to version 0.64.0. Upgrades from this version can only be made to version 0.65.0 and later.
Version 0.62.4
Improvements
- Various security improvements to address XSS security issues.
Version 0.62.1
New features
- You can now use the ListExperiments API to filter experiments by status (ACTIVE, DELETED). By default, the API returns ACTIVE experiments.
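The status filter semantics can be illustrated with a small local stand-in (the real call goes through the MLOps Python client against a live server; the function and field names here are illustrative, but the default-to-ACTIVE behavior mirrors the release note):

```python
# Local sketch of the ListExperiments status filter described above.
def list_experiments(experiments, status="ACTIVE"):
    # When no status is given, only ACTIVE experiments are returned,
    # matching the documented default.
    return [e for e in experiments if e["status"] == status]

experiments = [
    {"name": "churn-model", "status": "ACTIVE"},
    {"name": "old-baseline", "status": "DELETED"},
]
print([e["name"] for e in list_experiments(experiments)])             # ['churn-model']
print([e["name"] for e in list_experiments(experiments, "DELETED")])  # ['old-baseline']
```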
Improvements
- Added support for the DAI 1.10.6.1 runtime.
- Added pagination support on the Experiments page.
Bug fixes
- Fixed an issue where uploading large artifacts (above 40 GB) resulted in an error.
- Fixed an issue where a registered model with the same name as a deleted model could not be created.
Announcements
- The URL link to the legacy H2O MLOps app has been removed.
- The legacy H2O MLOps app is no longer installed by default.
Version 0.62.0 (September 10, 2023)
New features
- For GPU-enabled model deployments, you can now set the appropriate Kubernetes (K8s) requests and limits by clicking the GPU Deployment toggle when creating a deployment. For more information, see Deploy a model and Kubernetes options.
- You can now create and assign experiment tags within a project. For more information, see Project page tabs and Add experiments.
- You can now edit the names and tags of experiments. For more information, see Project page tabs.
Improvements
- The default view when viewing projects has been changed from the grid view to the list view.
- The Project ID of each project is now displayed in the list view.
- The list view now features pagination, sorting, and search capabilities.
- You can now search for a project by project name.
- You can now sort the list of projects by time of creation and last modified time.
- Project list view actions: You can now view, share, and delete projects from the project list view. For more information, see List view actions.
- Improved UI for project sharing.
- Enhanced the Deployment Overview window to include Kubernetes settings and deployed model details across all deployment modes. For more information, see Understand the Deployment Overview window.
- Python client:
  - You can now enable or disable model monitoring for a deployment.
  - You can now update the deployment security option or password.
  - You can now delete experiments.
  - You can now delete registered models and model versions.
- Scoring:
  - Prediction intervals are now supported for MOJOs and Driverless AI Python scoring pipelines. Prediction intervals provide a range within which the true value is expected to fall with a certain level of confidence. You can check whether prediction intervals are supported by using the https://model.{domain}/{deployment}/capabilities endpoint.
  - Added a new MLflow Dynamic Runtime to dynamically resolve the various model dependencies in your MLflow model. For more information, see MLflow Dynamic Runtime.
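A hypothetical check against the capabilities endpoint mentioned above. The URL shape https://model.{domain}/{deployment}/capabilities comes from the release notes; the response format and the capability name used here are assumptions for illustration:

```python
import json

# Stand-in for the JSON body a capabilities endpoint might return; both the
# list format and the "SCORE_PREDICTION_INTERVAL" name are assumptions.
raw_response = '["SCORE", "SCORE_PREDICTION_INTERVAL"]'

capabilities = json.loads(raw_response)
supports_intervals = "SCORE_PREDICTION_INTERVAL" in capabilities
print(supports_intervals)  # prints True for this sample response
```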
Bug fixes
- Fixed an issue where the passphrase field could not be edited when creating a secured deployment.
- Fixed an issue that affected accurate sorting when using the sort by date functionality.
Version 0.61.1 (June 25, 2023)
Improvements
- Added support for Kubernetes 1.25.
- Added support for H2O Driverless AI version 1.10.5.
Bug fixes
- Various bug fixes to the deployment pipeline, monitoring, and drift detection.
Version 0.61.0 (May 24, 2023)
New features
- You can now create A/B Test and Champion/Challenger deployments through the UI. For more information, see Deploy a model.
- You can now create and view configurable scoring endpoints through the UI. For more information, see Configure scoring endpoint.
- Concurrent scoring requests are now supported for Python-based scorers. Scoring for C++ MOJO, Scoring Pipeline, and MLflow types now supports parallelization, with the default degree of parallelization set to 2. This can be changed with the H2O_SCORER_WORKERS environment variable. For more details, contact your H2O representative.
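A sketch of how a scorer could derive its degree of parallelism from the H2O_SCORER_WORKERS environment variable (the variable name and the default of 2 come from the note above; the lookup logic itself is illustrative):

```python
import os

def scorer_workers(env=None):
    # Fall back to the documented default of 2 when the variable is unset.
    if env is None:
        env = os.environ
    return int(env.get("H2O_SCORER_WORKERS", "2"))

print(scorer_workers({}))                           # default: 2
print(scorer_workers({"H2O_SCORER_WORKERS": "8"}))  # overridden: 8
```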
Improvements
- Added support for H2O-3 MLflow Flavors and importing of MLflow-wrapped H2O-3 models.
Version 0.60.1 (April 02, 2023)
New features
- Introduced a feature flag to enable the import third-party experiments (pickled experiments) flow with Conda. If you require Conda or third-party pickle import, this flag needs to be set at the time H2O MLOps is installed to continue using pickled experiments. For more information about enabling this feature flag when installing or upgrading H2O MLOps, contact support@h2o.ai.
Improvements
- You can now search for users by username when sharing a project with another user. You can also now sort the user list in alphabetical order.
- In the model monitoring feature summary table, figures are now displayed up to three decimal places.
- When no deployment name is present for a deployment, the deployment ID is now displayed as the name.
- A blocking error page is now shown to the user if Keycloak is unavailable.
- Date and time are now both displayed in the model monitoring predictions over time plot.
- Storage Telemetry now includes two additional fields: deployment name and model version number.
Bug fixes
- Fixed a bug that caused experiments to fail during upload or ingestion.
- All dialogs in the UI can now be closed with the Escape key.
- Fixed a bug where drift was not calculated when a feature was determined to be a datetime type and the datetime format was missing.
Version 0.59.1
Improvements
- Added support for the DAI 1.10.4.3 runtime.
Version 0.59.0 (February 12, 2023)
New features
- Storage telemetry: MLOps can now send analytical data related to storage operations to the telemetry server.
- Scoring telemetry: MLOps Scoring now sends scoring-related data to the telemetry server.
- Static scoring endpoints: You are now able to define and update a persistent URL that points to a particular MLOps deployment.
- Deployment:
- Deployed scoring applications now set additional Kubernetes annotations.
- Deployment APIs now return more accurate and useful gRPC status codes and error messages.
- You can now download Kubernetes logs from deployments in the MLOps Wave App and MLOps API.
Improvements
- Upgraded the h2o-wave version to 0.24.1.
- Added support for the DAI 1.10.4.1 and DAI 1.10.4.2 runtimes.
- Updated the Python client.
- Added a cleanup task for files uploaded to the Wave server.
- Updated the eScorer URL of the Wave app deployment pipeline.
- Added a new Kubernetes limit for the Hydrogen Torch runtime in the deployment creation flow.
Bug fixes
- Removed the custom implementation for the token provider.
- Removed the artifact-id from the DeployDeploymentComposition endpoint.
- Updated the packages in the base Docker image.
- Fixed an issue related to displaying the session timeout page for deployment overview, view monitoring, and monitoring homepage.
- Fixed an issue where the drift detection trigger blocked the other calculations by adding timeout support to the InfluxDB client in trigger and worker.
Version 0.58.0 (December 15, 2022)
Improvements
- Added support for Kubernetes 1.23.
- Added support for H2O-3 MOJOs up to version 3.38.0.3.
- Added support for linking and deploying H2O Driverless AI unsupervised models.
- Added support for scoring H2O Driverless AI MOJOs with the C++ MOJO runtime.
- Added support for Test Time Augmentation (TTA) for H2O Driverless AI Python pipelines.
- Shapley values can now be calculated for H2O Driverless AI Python pipelines and MOJOs.
- Datetime columns for H2O Driverless AI models are now automatically detected.
- Fixed an issue where the Driverless AI Python Pipeline scorer occasionally restarted randomly.
- Updated ML Python packages in the standard Python scorer to support a wider range of custom user models.
- BYOM scoring:
- Extended the Python scoring library to conform to v1.2.0 of the Scoring API.
- Unexpected input fields are now ignored when performing scoring.
- Introduced a feature that lets scorers override sample requests.
- Implemented an experimental API for image and file scoring.
- Replaced time-based handling of signals coming from Driverless AI scoring processes with static handling.
- Added a Driverless AI MOJO Pipeline artifact processor image.
- Added an H2O-3 artifact processor image.
- Updated the DAI pipeline processor dependencies to address security vulnerabilities.
Documentation
- Added a page that describes support for Test Time Augmentation (TTA) in H2O MLOps.
- Added several new Python client examples.
- Updated the page on Deploying a model.
Version 0.57.3 (November 16, 2022)
New features
- You can now view monitoring dashboards for deployments directly through H2O MLOps. For more information, see Model monitoring.
Version 0.57.2 (August 01, 2022)
New features
- When browsing the MLflow directory, you can now search for specific MLflow models by name. Note that this search is case sensitive, and that the model name can contain only letters, numbers, spaces, hyphens, and underscores, up to 100 characters.
- When browsing the MLflow directory, the list of MLflow models is now organized into pages. You can specify the number of models listed on each page.
Bug fixes
- Fixed an issue where MLflow models could not be reimported.
Version 0.56.1 (May 16, 2022)
New features
- Azure access tokens can now be retrieved through H2O MLOps.
Improvements
- When creating a deployment, only deployable artifacts are now shown.
- Added Driverless AI (DAI) 1.10.2 and 1.10.3 as recognized versions of DAI for matching with DAI runtimes.
- H2O MLOps now displays either a success or an error message when attempting to create a deployment.
- The process of linking models to an experiment is now simpler.
- H2O MLOps can now handle large text fields.
- Updated the H2O MLOps logo.
- Removed scroll bars in overview UI pages.
Bug fixes
- Fixed an issue that caused misalignment between project cards.
- Underscores can now be used at the beginning of project names.
- Fixed an issue that caused H2O MLOps to crash when the deployer was restarted.
- Fixed an issue related to adding new comments to an experiment.
Version 0.56.0 (April 18, 2022)
New features
- Added support for batch scoring. For more information, see Deploying a model.
- Added support for H2O-3 MOJOs up to version 3.32.0.2.
Version 0.55.0 (March 31, 2022)
New features
- Added support for integration with MLflow Model Registry.
- Admin users can now monitor H2O MLOps usage within their organization with Admin Analytics.