Databricks Terraform - Create secret scope backed connection


I'm deploying my Databricks workspace resources with Terraform. I want to create a catalog that points to SQL Server using the Lakehouse Federation feature, and I'd like to reference the user and password as pointers to a secret scope backed by Azure Key Vault. It's possible using SQL scripts, but I found no docs on how to do it with Terraform. Does anyone have an idea what the code should look like?

What I want to replicate

CREATE CONNECTION sqlserver_connection TYPE sqlserver
OPTIONS (
  host '<hostname>',
  port '<port>',
  user secret ('<secret-scope>','<secret-key-user>'),
  password secret ('<secret-scope>','<secret-key-password>')
);

in Terraform

resource "databricks_connection" "sqlserver" {
  name            = "sqlserver_connection"
  connection_type = "SQLSERVER"
  options = {
    host     = "test.database.windows.net"
    port     = "1433"
    user     = "user" // a reference to secret scope
    password = "password" // as above
  }
}

I'm aware it's still in Public Preview though.

I have tried all the docs I've been able to find, the REST API documentation, and even checked the GitHub repo for the Databricks provider. Sadly, to no avail.


There is 1 best solution below

Venkat V

I'd like to reference the user and password as pointers to a secret scope backed by Azure Key Vault. It's possible using SQL scripts.

To connect to SQL Server from Azure Databricks without hardcoding the SQL username and password, you can store them as secrets in Azure Key Vault and then use azurerm_key_vault_secret data blocks to retrieve their values in Terraform.
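The question also mentions a Key-Vault-backed secret scope. If you want that scope itself managed by Terraform, the Databricks provider exposes a databricks_secret_scope resource with a keyvault_metadata block. A minimal sketch, reusing the Key Vault data source from the configuration below (the scope name is an assumption):

    # Sketch only: Key-Vault-backed secret scope; the name "sqlserver-kv" is an assumption
    resource "databricks_secret_scope" "kv" {
      name = "sqlserver-kv"

      keyvault_metadata {
        resource_id = data.azurerm_key_vault.example.id
        dns_name    = data.azurerm_key_vault.example.vault_uri
      }
    }

Whether the connection's options can point at such a scope directly (the way the SQL secret() function does) isn't documented for the Terraform resource, which is why this answer reads the secret values from Key Vault instead.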

Here is the updated configuration to connect to an Azure SQL database from Azure Databricks.

    terraform {
      required_providers {
        azurerm = {
          source = "hashicorp/azurerm"
        }
        databricks = {
          source = "databricks/databricks"
        }
      }
    }
    
    provider "azurerm" {
      features {}
    }
    
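    # Look up the existing Key Vault and the username/password secrets stored in it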
    data "azurerm_key_vault" "example" {
      name                = "sqldbvault"
      resource_group_name = "Venkat"
    }
    
    data "azurerm_key_vault_secret" "username" {
      name         = "username"
      key_vault_id = data.azurerm_key_vault.example.id
    }
    
    data "azurerm_key_vault_secret" "password" {
      name         = "password"
      key_vault_id = data.azurerm_key_vault.example.id
    }
    
    resource "azurerm_resource_group" "rgname" {
      name     = "Venkat"
      location = "eastus"
    }
    
    resource "azurerm_databricks_workspace" "example" {
      name                = "venkat-databricks"
      resource_group_name = azurerm_resource_group.rgname.name
      location            = azurerm_resource_group.rgname.location
      sku                 = "premium"
    }
    
    provider "databricks" {
      host = azurerm_databricks_workspace.example.workspace_url
    }
    
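    # Unity Catalog prerequisites: databricks_connection is a Unity Catalog object, so the
    # workspace needs an access connector, ADLS Gen2 storage, and a metastore assignment first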
    resource "azurerm_databricks_access_connector" "unity" {
      name                = "venkatdatabricksmi"
      resource_group_name = azurerm_resource_group.rgname.name
      location            = azurerm_resource_group.rgname.location
      identity {
        type = "SystemAssigned"
      }
    }
    
    resource "azurerm_storage_account" "unity_catalog" {
      name                     = "thejadatabricksdemo"
      resource_group_name      = azurerm_resource_group.rgname.name
      location                 = azurerm_resource_group.rgname.location
      account_tier             = "Standard"
      account_replication_type = "GRS"
      is_hns_enabled           = true
    }
    
    resource "azurerm_storage_container" "unity_catalog" {
      name                  = "venkat-container"
      storage_account_name  = azurerm_storage_account.unity_catalog.name
      container_access_type = "private"
    }
    
    resource "azurerm_role_assignment" "example" {
      scope                = azurerm_storage_account.unity_catalog.id
      role_definition_name = "Storage Blob Data Contributor"
      principal_id         = azurerm_databricks_access_connector.unity.identity[0].principal_id
    }
    
    resource "databricks_metastore" "this" {
      name         = "venkat-metastore"
      storage_root = format("abfss://%s@%s.dfs.core.windows.net/",
        azurerm_storage_container.unity_catalog.name,
        azurerm_storage_account.unity_catalog.name)
      force_destroy = true
      region        = "centralus"
    }
    
    resource "databricks_metastore_assignment" "this" {
      provider             = databricks
      workspace_id         = azurerm_databricks_workspace.example.workspace_id
      metastore_id         = databricks_metastore.this.id
      default_catalog_name = "hive_metastore"
    }
    
    resource "databricks_metastore_data_access" "this" {
      metastore_id = databricks_metastore.this.id
      name         = "mi_dac"
      azure_managed_identity {
        access_connector_id = azurerm_databricks_access_connector.unity.id
      }
      is_default = true
      depends_on = [databricks_metastore_assignment.this]
    }
    
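    # Lakehouse Federation connection; the credentials are resolved from Key Vault by the
    # data sources above at plan time, so they end up in the connection and in Terraform state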
    resource "databricks_connection" "sqlserver" {
      name            = "sqlserver_connection"
      connection_type = "SQLSERVER"
      options = {
        host     = "<sql-server-name>.database.windows.net"
        port     = "1433"
        user     = data.azurerm_key_vault_secret.username.value
        password = data.azurerm_key_vault_secret.password.value
      }
    }

Terraform apply:

After running the code, the SQL connection was established successfully from the Databricks workspace.
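To finish the Lakehouse Federation setup described in the question, a foreign catalog can be layered on top of the connection. A minimal sketch, assuming the connection above and a hypothetical SQL Server database named mydb:

    # Sketch only: foreign catalog over the connection; the database name "mydb" is an assumption
    resource "databricks_catalog" "sqlserver" {
      name            = "sqlserver_catalog"
      connection_name = databricks_connection.sqlserver.name
      options = {
        database = "mydb"
      }
    }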


Refer: Stack link answered by me