BlueData DataTap Setup¶
This section provides instructions for configuring Driverless AI to work with BlueData DataTap.
Description of Configuration Attributes¶
dtap_auth_type
: Selects the DTAP authentication type. Available values are:

  - noauth: No authentication needed
  - principal: Authenticate with DataTap with a principal user
  - keytab: Authenticate with a keytab (recommended). If running Driverless AI as a service, then the Kerberos keytab needs to be owned by the Driverless AI user.
  - keytabimpersonation: Log in with impersonation using a keytab

dtap_config_path
: The location of the DTap (HDFS) config folder path. This folder can contain multiple config files. Note: The DTap config file core-site.xml needs to contain the DTap FS configuration, for example:

  ```xml
  <configuration>
    <property>
      <name>fs.dtap.impl</name>
      <value>com.bluedata.hadoop.bdfs.Bdfs</value>
      <description>The FileSystem for BlueData dtap: URIs.</description>
    </property>
  </configuration>
  ```

dtap_key_tab_path
: The path of the principal keytab file. For use when dtap_auth_type=principal.

dtap_app_principal_user
: The Kerberos app principal user (recommended).

dtap_app_login_user
: The user ID of the current user (for example, user@realm).

dtap_app_jvm_args
: JVM args for DTap distributions. Separate each argument with spaces.

dtap_app_classpath
: The DTap classpath.
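The core-site.xml snippet shown under dtap_config_path can be sanity-checked programmatically. Below is a minimal sketch (not part of Driverless AI; the helper name is our own) that uses Python's standard xml.etree to confirm the fs.dtap.impl property is present:

```python
import xml.etree.ElementTree as ET

def get_hadoop_property(xml_text, name):
    """Return the <value> of the named Hadoop property, or None if absent."""
    root = ET.fromstring(xml_text)
    for prop in root.findall("property"):
        if prop.findtext("name") == name:
            return prop.findtext("value")
    return None

core_site = """<configuration>
  <property>
    <name>fs.dtap.impl</name>
    <value>com.bluedata.hadoop.bdfs.Bdfs</value>
    <description>The FileSystem for BlueData dtap: URIs.</description>
  </property>
</configuration>"""

print(get_hadoop_property(core_site, "fs.dtap.impl"))
# -> com.bluedata.hadoop.bdfs.Bdfs
```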
DataTap with No Authentication¶
This example enables the DataTap data connector and disables authentication in the config.toml file. This allows users to reference data stored in DataTap directly using the name node address, for example: dtap://name.node/datasets/iris.csv or dtap://name.node/datasets/. (Note: The trailing slash is currently required for directories.)
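A dtap:// reference follows standard URI structure (scheme, name node host, path), so its parts can be illustrated with Python's urllib.parse. This is illustration only; Driverless AI itself resolves these URIs through the connector:

```python
from urllib.parse import urlparse

uri = urlparse("dtap://name.node/datasets/iris.csv")
print(uri.scheme)   # dtap
print(uri.netloc)   # name.node
print(uri.path)     # /datasets/iris.csv

# A directory reference keeps the (currently required) trailing slash:
dir_uri = urlparse("dtap://name.node/datasets/")
print(dir_uri.path)  # /datasets/
```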
- Export the Driverless AI config.toml file or add it to ~/.bashrc. For example:
```bash
# DEB and RPM
export DRIVERLESS_AI_CONFIG_FILE="/etc/dai/config.toml"

# TAR SH
export DRIVERLESS_AI_CONFIG_FILE="/path/to/your/unpacked/dai/directory/config.toml"
```
- Edit the following environment variables in the config.toml file.
```toml
# File System Support
# upload : standard upload feature
# file : local file system/server file system
# hdfs : Hadoop file system, remember to configure the HDFS config folder path and keytab below
# dtap : Blue Data Tap file system, remember to configure the DTap section below
# s3 : Amazon S3, optionally configure secret and access key below
# gcs : Google Cloud Storage, remember to configure gcs_path_to_service_account_json below
# gbq : Google Big Query, remember to configure gcs_path_to_service_account_json below
# minio : Minio Cloud Storage, remember to configure secret and access key below
# snow : Snowflake Data Warehouse, remember to configure Snowflake credentials below (account name, username, password)
# kdb : KDB+ Time Series Database, remember to configure KDB credentials below (hostname and port, optionally: username, password, classpath, and jvm_args)
# azrbs : Azure Blob Storage, remember to configure Azure credentials below (account name, account key)
enabled_file_systems = "file, dtap"
```
- Save the changes when you are done, then stop/restart Driverless AI.
DataTap with Keytab-Based Authentication¶
This example:
- Places keytabs in the /tmp/dtmp folder on your machine and provides the file path as described below.
- Configures the environment variable dtap_app_principal_user to reference a user for whom the keytab was created (usually in the form of user@realm).
- Export the Driverless AI config.toml file or add it to ~/.bashrc. For example:
```bash
# DEB and RPM
export DRIVERLESS_AI_CONFIG_FILE="/etc/dai/config.toml"

# TAR SH
export DRIVERLESS_AI_CONFIG_FILE="/path/to/your/unpacked/dai/directory/config.toml"
```
- Edit the following environment variables in the config.toml file.
```toml
# File System Support
# upload : standard upload feature
# file : local file system/server file system
# hdfs : Hadoop file system, remember to configure the HDFS config folder path and keytab below
# dtap : Blue Data Tap file system, remember to configure the DTap section below
# s3 : Amazon S3, optionally configure secret and access key below
# gcs : Google Cloud Storage, remember to configure gcs_path_to_service_account_json below
# gbq : Google Big Query, remember to configure gcs_path_to_service_account_json below
# minio : Minio Cloud Storage, remember to configure secret and access key below
# snow : Snowflake Data Warehouse, remember to configure Snowflake credentials below (account name, username, password)
# kdb : KDB+ Time Series Database, remember to configure KDB credentials below (hostname and port, optionally: username, password, classpath, and jvm_args)
# azrbs : Azure Blob Storage, remember to configure Azure credentials below (account name, account key)
enabled_file_systems = "file, dtap"

# Blue Data DTap connector settings are similar to HDFS connector settings.
#
# Specify DTap Auth Type, allowed options are:
#   noauth : No authentication needed
#   principal : Authenticate with DTap with a principal user
#   keytab : Authenticate with a keytab (recommended). If running
#            DAI as a service, then the Kerberos keytab needs to
#            be owned by the DAI user.
#   keytabimpersonation : Login with impersonation using a keytab
dtap_auth_type = "keytab"

# Path of the principal keytab file
dtap_key_tab_path = "/tmp/<keytabname>"

# Kerberos app principal user (recommended)
dtap_app_principal_user = "<user@kerberosrealm>"
```
- Save the changes when you are done, then stop/restart Driverless AI.
DataTap with Keytab-Based Impersonation¶
This example:
- Places keytabs in the /tmp/dtmp folder on your machine and provides the file path as described below.
- Configures the dtap_app_principal_user variable, which references a user for whom the keytab was created (usually in the form of user@realm).
- Configures the dtap_app_login_user variable, which references a user who is being impersonated (usually in the form of user@realm).
- Export the Driverless AI config.toml file or add it to ~/.bashrc. For example:
```bash
# DEB and RPM
export DRIVERLESS_AI_CONFIG_FILE="/etc/dai/config.toml"

# TAR SH
export DRIVERLESS_AI_CONFIG_FILE="/path/to/your/unpacked/dai/directory/config.toml"
```
- Edit the following environment variables in the config.toml file.
```toml
# File System Support
# upload : standard upload feature
# file : local file system/server file system
# hdfs : Hadoop file system, remember to configure the HDFS config folder path and keytab below
# dtap : Blue Data Tap file system, remember to configure the DTap section below
# s3 : Amazon S3, optionally configure secret and access key below
# gcs : Google Cloud Storage, remember to configure gcs_path_to_service_account_json below
# gbq : Google Big Query, remember to configure gcs_path_to_service_account_json below
# minio : Minio Cloud Storage, remember to configure secret and access key below
# snow : Snowflake Data Warehouse, remember to configure Snowflake credentials below (account name, username, password)
# kdb : KDB+ Time Series Database, remember to configure KDB credentials below (hostname and port, optionally: username, password, classpath, and jvm_args)
# azrbs : Azure Blob Storage, remember to configure Azure credentials below (account name, account key)
enabled_file_systems = "file, dtap"

# Blue Data DTap connector settings are similar to HDFS connector settings.
#
# Specify DTap Auth Type, allowed options are:
#   noauth : No authentication needed
#   principal : Authenticate with DTap with a principal user
#   keytab : Authenticate with a keytab (recommended). If running
#            DAI as a service, then the Kerberos keytab needs to
#            be owned by the DAI user.
#   keytabimpersonation : Login with impersonation using a keytab
dtap_auth_type = "keytabimpersonation"

# Path of the principal keytab file
dtap_key_tab_path = "/tmp/<keytabname>"

# Kerberos app principal user (recommended)
dtap_app_principal_user = "<user@kerberosrealm>"

# Specify the user id of the current user here as user@realm
dtap_app_login_user = "<user@realm>"
```
- Save the changes when you are done, then stop/restart Driverless AI.