Installing Driverless AI
For the best (and intended-as-designed) experience, install Driverless AI on modern data center hardware with NVIDIA GPUs and CUDA support. Use Pascal or Volta GPUs with as much GPU memory as possible for best results. (Note that the older K80 and M60 GPUs available in EC2 are supported and convenient, but not as fast.)
Driverless AI requires 10 GB of free disk space to run and will stop working if less than 10 GB is available. You should have ample system memory (64 GB or more) and free disk space (at least 30 GB, or 10x your dataset size, whichever is larger) available.
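Before installing, it can help to verify that the host meets these resource requirements. The following is a minimal sketch of such a pre-flight check, assuming a Linux host with GNU coreutils (`df --output`) and `/proc/meminfo`; the thresholds come from the requirements above:

```shell
# Pre-flight resource check -- thresholds from the requirements above:
# at least 10 GB free disk to run, 64 GB+ system memory recommended.
REQUIRED_DISK_GB=10
RECOMMENDED_MEM_GB=64

# Free disk space on the root filesystem, in whole GB (GNU df).
free_disk_gb=$(df -BG --output=avail / | tail -n 1 | tr -dc '0-9')
# Total system memory in GB, from /proc/meminfo (kB -> GB).
mem_gb=$(awk '/MemTotal/ {printf "%d", $2 / 1024 / 1024}' /proc/meminfo)

if [ "$free_disk_gb" -lt "$REQUIRED_DISK_GB" ]; then
    echo "ERROR: only ${free_disk_gb} GB free; at least ${REQUIRED_DISK_GB} GB is required to run"
else
    echo "Disk check passed: ${free_disk_gb} GB free"
fi
if [ "$mem_gb" -lt "$RECOMMENDED_MEM_GB" ]; then
    echo "WARNING: ${mem_gb} GB RAM detected; 64 GB or more is recommended"
fi
```

Remember that the recommended free disk space also scales with your data: plan for at least 30 GB, or 10x your dataset size.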
To simplify cloud installation, Driverless AI is provided as an AMI. To simplify local installation, Driverless AI is provided as a Docker image. For the best performance, including GPU support, use nvidia-docker. For a lower-performance experience without GPUs, use regular docker (with the same docker image).
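A launch command looks roughly like the following. This is a sketch only: the image name and tag are placeholders to replace with the values from your Driverless AI download, and 12345 is assumed here as the web UI port. The same image works with either runner:

```shell
# Sketch of launching the Driverless AI Docker image.
# IMAGE is a placeholder -- substitute the image name/tag you actually pulled.
IMAGE="opsh2oai/dai:TAG"
PORT=12345   # assumed web UI port; adjust if your deployment differs

# Prefer nvidia-docker when present (GPU support); otherwise fall back to
# regular docker for a CPU-only run of the same image.
if command -v nvidia-docker >/dev/null 2>&1; then
    RUNNER=nvidia-docker
else
    RUNNER=docker
fi

echo "Launch with: $RUNNER run --rm -i -t -p ${PORT}:12345 ${IMAGE}"
# Remove the echo (run the command directly) to actually start the container.
```

The `-p` flag publishes the container's web UI port to the host; add `-v` mounts if you want datasets on the host to be visible inside the container.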
Driverless AI supports HDFS and S3 access. Refer to the Data Connectors section for more information on how to start Driverless AI with authentication for these connectors. Alternatively, a config.toml file can be referenced when starting Driverless AI. Refer to The Config.toml File section for more information.
Driverless AI also supports basic, LDAP, and PAM authentication, which administrators can configure via a config.toml file. Refer to The Config.toml File section for the properties to set.
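As a rough illustration of what such a configuration looks like, an LDAP setup might resemble the fragment below. The key names and values here are assumptions for illustration only; consult The Config.toml File section for the authoritative property names:

```toml
# Illustrative config.toml fragment -- property names are assumptions;
# see The Config.toml File section for the actual supported keys.
authentication_method = "ldap"    # e.g. "basic", "ldap", or "pam"
ldap_server = "ldap.example.com"  # placeholder hostname
ldap_port = "389"                 # placeholder port
```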
These installation steps assume that you have a license key for Driverless AI. For information on how to obtain a license key for Driverless AI, contact email@example.com.