---
subcategory: "Databricks SQL"
---
-> Note If you have a fully automated setup with workspaces created by databricks_mws_workspaces or azurerm_databricks_workspace, please make sure to add the depends_on attribute in order to prevent "default auth: cannot configure default credentials" errors.
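For example, a minimal sketch of that dependency, assuming an Azure workspace resource named azurerm_databricks_workspace.this (the resource name is illustrative):

```hcl
data "databricks_sql_warehouse" "all" {
  name = "Starter Warehouse"

  # Ensure the workspace is fully provisioned before the provider
  # tries to authenticate against it.
  depends_on = [azurerm_databricks_workspace.this]
}
```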
Retrieves information about a databricks_sql_warehouse using its id. The id can be retrieved programmatically using the databricks_sql_warehouses data source.
- Retrieve the attributes of each SQL warehouse in a workspace:
data "databricks_sql_warehouses" "all" {
}
data "databricks_sql_warehouse" "all" {
for_each = data.databricks_sql.warehouses.ids
id = each.value
}
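The per-warehouse results can then be collected, for example into an output map of warehouse names (a minimal sketch; the output name is illustrative):

```hcl
output "warehouse_names" {
  # Map of warehouse id => name, built from the per-warehouse lookups above.
  value = { for id, w in data.databricks_sql_warehouse.all : id => w.name }
}
```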
- Search for a specific SQL Warehouse by name:
data "databricks_sql_warehouse" "all" {
name = "Starter Warehouse"
}
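The attributes documented below can be referenced from the lookup, for example (a minimal sketch; the output name is illustrative):

```hcl
output "starter_warehouse_jdbc_url" {
  value = data.databricks_sql_warehouse.all.jdbc_url
}
```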
Exactly one of the following arguments is required:

- id - (Required, if name isn't specified) The ID of the SQL warehouse.
- name - (Required, if id isn't specified) Name of the SQL warehouse to search (case-sensitive).
This data source exports the following attributes:
- id - The ID of the SQL warehouse.
- name - Name of the SQL warehouse.
- cluster_size - The size of the clusters allocated to the warehouse: "2X-Small", "X-Small", "Small", "Medium", "Large", "X-Large", "2X-Large", "3X-Large", or "4X-Large".
- min_num_clusters - Minimum number of clusters available when a SQL warehouse is running.
- max_num_clusters - Maximum number of clusters available when a SQL warehouse is running.
- auto_stop_mins - Time in minutes until an idle SQL warehouse terminates all clusters and stops.
- tags - Tags used for SQL warehouse resources.
- spot_instance_policy - The spot policy used for allocating instances to clusters: COST_OPTIMIZED or RELIABILITY_OPTIMIZED.
- enable_photon - Whether Photon is enabled.
- enable_serverless_compute - Whether this SQL warehouse is a serverless SQL warehouse.
  - For AWS: if your account needs updated terms of use, workspace admins are prompted in the Databricks SQL UI. A workspace must meet the requirements and might require an update to its instance profile role to add a trust relationship.
  - For Azure: you must enable your workspace for serverless SQL warehouses.
- channel block, consisting of the following fields:
  - name - Name of the Databricks SQL release channel. Possible values are CHANNEL_NAME_PREVIEW and CHANNEL_NAME_CURRENT. Default is CHANNEL_NAME_CURRENT.
- jdbc_url - JDBC connection string.
- odbc_params - ODBC connection params: odbc_params.hostname, odbc_params.path, odbc_params.protocol, and odbc_params.port.
- data_source_id - ID of the data source for this warehouse. This is used to bind a Databricks SQL query to a warehouse.
- creator_name - The username of the user who created the warehouse.
- num_active_sessions - The current number of active sessions on the warehouse.
- num_clusters - The current number of clusters used by the warehouse.
- state - The current state of the warehouse.
- health - Health status of the warehouse.
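As a hedged usage sketch, the exported id can feed other resources, for example granting a group usage on the warehouse through databricks_permissions (this assumes the by-name lookup above; the group name is illustrative):

```hcl
resource "databricks_permissions" "starter_warehouse_usage" {
  sql_endpoint_id = data.databricks_sql_warehouse.all.id

  access_control {
    group_name       = "analysts" # illustrative group name
    permission_level = "CAN_USE"
  }
}
```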
The following resources are often used in the same context:
- End to end workspace management guide.
- databricks_instance_profile to manage AWS EC2 instance profiles that users can launch databricks_cluster with and access data, like databricks_mount.
- databricks_sql_dashboard to manage Databricks SQL Dashboards.
- databricks_sql_global_config to configure the security policy, databricks_instance_profile, and data access properties for all databricks_sql_warehouse resources in a workspace.
- databricks_sql_permissions to manage data object access control lists in Databricks workspaces for things like tables, views, databases, and more.