---
subcategory: "Databricks SQL"
---

databricks_sql_global_config Resource

This resource configures the security policy, databricks_instance_profile, and data access properties for all databricks_sql_endpoints of a workspace. Please note that changing parameters of this resource will restart all running databricks_sql_endpoints. To use this resource you need to be an administrator.

Example usage

AWS example

resource "databricks_sql_global_config" "this" {
  security_policy      = "DATA_ACCESS_CONTROL"
  instance_profile_arn = "arn:...."
  data_access_config = {
    "spark.sql.session.timeZone" : "UTC"
  }
}
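
If the instance profile is also managed by Terraform, the ARN can be taken from a databricks_instance_profile resource instead of being hard-coded. A minimal sketch, assuming a hypothetical ARN and resource name:

resource "databricks_instance_profile" "sql_access" {
  # hypothetical ARN used for illustration
  instance_profile_arn = "arn:aws:iam::123456789012:instance-profile/sql-access"
}

resource "databricks_sql_global_config" "this" {
  security_policy = "DATA_ACCESS_CONTROL"
  # reference the registered instance profile instead of repeating the ARN
  instance_profile_arn = databricks_instance_profile.sql_access.instance_profile_arn
  data_access_config = {
    "spark.sql.session.timeZone" : "UTC"
  }
}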

Azure example

For Azure, you should use data_access_config to provide the service principal configuration. You can use the Databricks SQL Admin Console UI to help you generate the right configuration values.

resource "databricks_sql_global_config" "this" {
  security_policy      = "DATA_ACCESS_CONTROL"
  data_access_config = {
    "spark.hadoop.fs.azure.account.auth.type" : "OAuth",
    "spark.hadoop.fs.azure.account.oauth.provider.type" : "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "spark.hadoop.fs.azure.account.oauth2.client.id" : "${var.tenant_id}",
    "spark.hadoop.fs.azure.account.oauth2.client.secret" : "{{secrets/${local.secret_scope}/${local.secret_key}}}",
    "spark.hadoop.fs.azure.account.oauth2.client.endpoint" : "https://login.microsoftonline.com/${var.tenant_id}/oauth2/token"
  }
}
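
The client secret above is read from a Databricks secret using the {{secrets/scope/key}} reference syntax. A minimal sketch of how the referenced scope and secret could be managed in the same configuration, assuming hypothetical names and a client_secret input variable:

variable "client_secret" {
  type      = string
  sensitive = true
}

locals {
  # hypothetical names; must match the reference used in data_access_config
  secret_scope = "sql-data-access"
  secret_key   = "service-principal-secret"
}

resource "databricks_secret_scope" "this" {
  name = local.secret_scope
}

resource "databricks_secret" "client_secret" {
  scope        = databricks_secret_scope.this.name
  key          = local.secret_key
  string_value = var.client_secret
}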

Argument Reference

The following arguments are supported (see documentation for more details):

  • security_policy (Optional, String) - The policy for controlling access to datasets. Default value: DATA_ACCESS_CONTROL; consult the documentation for the list of possible values.
  • data_access_config (Optional, Map) - Data access configuration for databricks_sql_endpoint, such as configuration for an external Hive metastore, Hadoop Filesystem configuration, etc. Please note that the list of supported configuration properties is limited, so refer to the documentation for the full list. Apply will fail if you specify a configuration property that is not permitted (see the sketch after this list).
  • instance_profile_arn (Optional, String) - databricks_instance_profile used to access storage from databricks_sql_endpoint. Please note that this parameter is AWS-only and will generate an error if used on other clouds.
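
As an illustration of data_access_config beyond a single Spark property, the sketch below wires up an external Hive metastore. The connection values are hypothetical, and the exact set of supported properties should be checked against the Databricks SQL documentation:

resource "databricks_sql_global_config" "this" {
  security_policy = "DATA_ACCESS_CONTROL"
  data_access_config = {
    # hypothetical external Hive metastore connection settings
    "spark.hadoop.javax.jdo.option.ConnectionDriverName" : "com.mysql.jdbc.Driver",
    "spark.hadoop.javax.jdo.option.ConnectionURL" : "jdbc:mysql://metastore.example.com:3306/metastore",
    "spark.hadoop.javax.jdo.option.ConnectionUserName" : "hive",
    "spark.hadoop.javax.jdo.option.ConnectionPassword" : "{{secrets/hive/metastore-password}}"
  }
}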

Import

You can import a databricks_sql_global_config resource with a command like the following (you need to use global as the ID):

$ terraform import databricks_sql_global_config.this global
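
On Terraform 1.5 or later, the same import can also be expressed declaratively with an import block; a minimal sketch:

import {
  to = databricks_sql_global_config.this
  id = "global"
}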

Related Resources

The following resources are often used in the same context: