Snowflake support

This page provides information on the Snowflake integration and on Snowpark Container Services (SPCS) support.

Overview

Snowflake is a cloud-based data warehousing platform that utilizes a unique multi-cluster, shared data architecture, separating storage and compute resources. You can scale resources on-demand, and the platform supports SQL queries for data analysis. Snowflake prioritizes security with features like encryption and access controls. Its cloud-native design allows for efficient storage and analysis of large volumes of structured and semi-structured data.

Connect to the Snowflake Connector

Follow the steps below to connect and configure the Snowflake connector:

  1. To begin, navigate to the SNOWFLAKE option on the tab above and choose With Username and Password as the Authentication Method.
  2. Enter your Snowflake credentials, and click Connect.
  3. Choose the Role, Warehouse, Database, and Schema from the drop-down lists.

You have successfully connected to the Snowflake connector.
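
For reference, selecting a role, warehouse, database, and schema in step 3 is equivalent to setting the session context with Snowflake SQL. The following is a minimal sketch using placeholder names; substitute the objects from your own account:

    USE ROLE ANALYST_ROLE;      -- placeholder role
    USE WAREHOUSE SCORING_WH;   -- placeholder warehouse
    USE DATABASE MODELS_DB;     -- placeholder database
    USE SCHEMA PUBLIC;          -- placeholder schema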

Deploy a model

To deploy a model, click the Deployments option on the tab. You can deploy a model in one of two ways.

You can upload a pre-existing model for your deployment by choosing it from the drop-down list. Click Create to proceed.
    Uploading a pre-existing model
info

Functions can be overloaded, meaning a single function name can have multiple signatures. As a result, a separate model is not required for each individual function.

You have successfully deployed a model.
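
For background, deploying a model as a Snowflake Java UDF generally corresponds to a CREATE FUNCTION statement along the following lines. This is a minimal sketch with a placeholder function name, stage path, and handler class; the exact statement eScorer generates may differ:

    CREATE OR REPLACE FUNCTION mojo_score(features VARCHAR)
      RETURNS VARCHAR
      LANGUAGE JAVA
      IMPORTS = ('@model_stage/pipeline.mojo', '@model_stage/scorer.jar')  -- placeholder stage files
      HANDLER = 'Scorer.score';                                            -- placeholder handler class and method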

Score a model

  1. To score your model, navigate to Score on the tab above and select the appropriate option for Deployment Type.

    info

    You can use Snowflake Java UDF for in-database execution, Snowflake Container Function for running code alongside SQL queries, or Snowflake External Functions for integration with external systems.

  2. Select the Model you wish to score and choose the relevant Table from the drop-down list.

  3. Next, click Generate Query.

  4. Once the query has been generated, you can edit it before clicking on Run Query.

You have successfully scored a model.
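
A generated scoring query typically invokes the deployed function over the rows of the selected table. The following is a minimal sketch, assuming a hypothetical UDF named mojo_score and placeholder column and table names:

    SELECT id,
           mojo_score(col1, col2, col3) AS prediction  -- hypothetical UDF and input columns
    FROM my_database.my_schema.scoring_table           -- placeholder table
    LIMIT 100;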

Worksheet

  1. You can run Snowflake SQL queries in eScorer by navigating to Worksheet on the tab and entering your query in the given space.

  2. Click Run Query.
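
Any valid Snowflake SQL statement can be entered in the worksheet, for example:

    -- list the tables visible in the current schema
    SHOW TABLES;

    -- preview rows from a placeholder table
    SELECT * FROM scoring_table LIMIT 10;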

Operation monitoring

You can review how your model has been deployed by navigating to the Operation Monitoring option on the tab. This gives you additional information on how your model is performing in terms of cost and latency inside the Snowflake environment. Within this dashboard, you can choose between Java UDFs, Container Services, or External Functions.

Snowflake key functionalities

H2O eScorer supports a range of functionalities to facilitate model deployment and scoring. The following is an overview of the key functionalities available:

  • Snowflake login and context switching UI: Facilitates Snowflake login and allows easy switching between roles, warehouses, databases, and schemas.

  • Snowflake container service: Enables creation, deletion, and log viewing of Snowflake Container Services with accessible logs through public endpoints.

  • Snowflake compute pool creation: Lets users create Snowflake compute pools for efficient resource allocation and workload management (see the sketch after this list).

  • Deploy models: Provides the ability to deploy machine learning models as User-Defined Functions (UDF), Container Functions, or External Functions.

  • Deletion of deployed functions: Lets users remove previously deployed functions or models.

  • Scoring with Monaco (VS Code) editor window: Facilitates modification and execution of scoring SQL queries using a Monaco (VS Code) editor.

  • Monitoring charts: Includes visualizations for system load average, memory usage, active models, and the number of calls to models for effective system monitoring.
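
As context for the compute pool functionality above, Snowflake compute pools are created with a CREATE COMPUTE POOL statement. The following is a minimal sketch with a placeholder pool name and sizing; the values eScorer uses may differ:

    CREATE COMPUTE POOL scoring_pool   -- placeholder pool name
      MIN_NODES = 1
      MAX_NODES = 2
      INSTANCE_FAMILY = CPU_X64_XS;    -- smallest CPU instance family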

Snowpark Container Services support

Snowpark is a feature in Snowflake that allows developers to securely process non-SQL code (Python, Java, Scala) within Snowflake's elastic processing engine. By running services directly in Snowflake, it ensures quick and secure access to stored data, preventing data from leaving the platform. Snowpark simplifies management tasks through automated SQL queries, facilitated by the eScorer interface.

SPCS enables the execution of containerized applications, utilizing key components such as image repositories, compute pools, services, and jobs. Services, similar to web services, run continuously and restart if a container exits, while jobs have a finite lifespan. These components operate within a compute pool, a collection of virtual machine nodes.

  1. Click Services to view any running services. You can also create a service or a pool directly through the UI with the Create Service and Create Pool buttons.

    You can create a new service by navigating to the Create Service button and entering your credentials. Click Create to proceed.
      Creating a new service
  2. You can access a Snowpark Container Service by clicking on a corresponding endpoint in the services page.

    note

    Clicking on the vertical ellipsis under Action gives you multiple options. You can View, Delete, Suspend, or Resume a service for the pool that it runs on.

You have successfully created a new service or pool.
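
For reference, creating a Snowpark Container Service through the UI corresponds to a CREATE SERVICE statement that references a compute pool and a container specification. The following is a minimal sketch with placeholder names and a hypothetical image path; the specification eScorer submits may differ:

    CREATE SERVICE scorer_service        -- placeholder service name
      IN COMPUTE POOL scoring_pool       -- placeholder compute pool
      FROM SPECIFICATION $$
        spec:
          containers:
          - name: scorer
            image: /models_db/public/image_repo/scorer:latest  # hypothetical image path
          endpoints:
          - name: api
            port: 8080
            public: true                 # exposes a public endpoint
      $$;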

