This Terraform module creates and manages a Databricks cluster policy, with support for setting permissions and looking up driver node types and Spark runtime versions. It integrates with Terratest to ensure the robustness and reliability of your infrastructure.
The main objective is to create a more logical data structure, achieved by combining and grouping related resources in a single complex object.
The structure of the module promotes reusability. It's intended to be a repeatable component, simplifying the process of building diverse workloads and platform accelerators consistently.
A primary goal is to use keys and values in the object that correspond to the structure of the Databricks REST API. This makes it straightforward to iterate over the object and increases its practical value over time.
A final key goal is to separate logic from configuration in the module, enhancing its scalability, ease of customization, and manageability.
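As an illustration of that REST API shaped object, here is a minimal sketch of what a policy input could look like. The key names and values below are assumptions for illustration only; the module's `variables.tf` remains the authoritative schema.

```hcl
locals {
  # Hypothetical policy object; the nested definition keys follow the
  # Databricks cluster-policies REST API format rather than inventing
  # module-specific names.
  policy = {
    name = "small-clusters-only"

    definition = {
      "spark_version" = {
        "type"  = "fixed"
        "value" = "15.4.x-scala2.12"
      }
      "autotermination_minutes" = {
        "type"         = "range"
        "maxValue"     = 120
        "defaultValue" = 60
      }
    }
  }
}
```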
These modules are not intended to be complete, ready-to-use solutions; they are designed as components for creating your own patterns.
They are not tailored for a single use case but are meant to be versatile and applicable to a range of scenarios.
Security standardization is applied at the pattern level; the modules ship with best-practice defaults but do not enforce specific security standards.
End-to-end testing is not conducted on these modules, as they are individual components and do not undergo the extensive testing reserved for complete patterns or solutions.
- provision Databricks cluster policies with ease using Terraform
- utilization of Terratest for robust validation
- set permissions on a Databricks cluster policy
- support for custom policy definitions
- support for policy families with overrides (see the sketch below)
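As a hedged sketch of a module call exercising these features (the `source` path, the exact shape of the `policy` object, and the permission attribute names are assumptions based on the inputs documented below, not the verified interface):

```hcl
module "cluster_policy" {
  # Hypothetical source path; point this at wherever the module lives.
  source = "./modules/databricks-cluster-policy"

  policy = {
    name = "data-engineering"

    # Custom policy definition expressed in the Databricks REST API format.
    # A policy-family based policy would instead reference a family id plus
    # a set of overrides, using the same API field names.
    definition = {
      "node_type_id" = {
        "type"   = "allowlist"
        "values" = ["Standard_DS3_v2", "Standard_DS4_v2"]
      }
      "autoscale.max_workers" = {
        "type"     = "range"
        "maxValue" = 10
      }
    }

    # Permissions to grant on the policy; the group name is a placeholder.
    permissions = [
      {
        group_name       = "data-engineers"
        permission_level = "CAN_USE"
      }
    ]
  }
}
```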
| Name | Version |
|------|---------|
| terraform | ~> 1.0 |
| databricks | ~> 1.51 |
| databricks.workspace | n/a |
| Name | Version |
|------|---------|
| databricks | ~> 1.51 |
| Name | Type |
|------|------|
| databricks_cluster_policy | resource |
| databricks_cluster_policy | data source |
| databricks_permissions | resource |
| Name | Description | Type | Required |
|------|-------------|------|----------|
| policy | Describes Databricks cluster policy related configuration | object | yes |
| Name | Description |
|------|-------------|
| policy | Contains the Databricks cluster policy details |
| policy_permissions | Contains the Databricks cluster policy permissions details |
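A minimal sketch of how these outputs might be consumed downstream. The `.id` attribute on the `policy` output is an assumption based on the underlying `databricks_cluster_policy` resource, and the cluster settings are placeholders.

```hcl
# Attach the created policy to a cluster by referencing the module output.
resource "databricks_cluster" "shared" {
  cluster_name            = "shared-autoscaling"
  spark_version           = "15.4.x-scala2.12"
  node_type_id            = "Standard_DS3_v2"
  autotermination_minutes = 30

  autoscale {
    min_workers = 1
    max_workers = 4
  }

  policy_id = module.cluster_policy.policy.id
}
```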
As a prerequisite, please ensure that both Go and Terraform are properly installed on your system.
The Makefile includes two distinct variations of tests. The first one is designed to deploy different usage scenarios of the module. These tests are executed by specifying the `TF_PATH` environment variable, which points to one of the usages located in the examples directory.
To execute this test, run `make test TF_PATH=default`, substituting `default` with the specific usage you wish to test.
The second variation is an extended test. It performs additional checks and can be executed without specifying any parameters, using the command `make test_extended`.
Both are designed to be executed locally and are also integrated into the GitHub workflow.
Each of these tests contributes to the robustness and resilience of the module. They ensure the module performs consistently and accurately under different scenarios and configurations.
Using a dedicated module, we've developed a naming convention for resources that's based on specific regular expressions for each type, ensuring correct abbreviations and offering flexibility with multiple prefixes and suffixes.
Full examples detailing all usages, along with integrations with dependency modules, are located in the examples directory.
This Databricks cluster policy module has only been tested on Azure; other cloud providers (AWS and GCP) have not been tested yet. Therefore, in the examples a Databricks workspace host URL is retrieved from the AzureRM provider and passed to the Databricks provider.
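For reference, the wiring used in the examples roughly follows the pattern below. The workspace and resource group names are placeholders, not the actual example code.

```hcl
provider "azurerm" {
  features {}
}

# Look up the existing Azure Databricks workspace (names are placeholders).
data "azurerm_databricks_workspace" "this" {
  name                = "my-databricks-workspace"
  resource_group_name = "my-resource-group"
}

# Point the Databricks provider at the workspace host URL taken from AzureRM.
provider "databricks" {
  host = data.azurerm_databricks_workspace.this.workspace_url
}
```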
This module is maintained by these awesome contributors.
We welcome contributions from the community! Whether it's reporting a bug, suggesting a new feature, or submitting a pull request, your input is highly valued.
For more information, please see our contribution guidelines.
MIT Licensed. See LICENSE for full details.