
Add more readability for errors #129

Merged — 33 commits, Jun 25, 2020
c42bb01
Add resource wait retry for workspace create
nfx Jun 19, 2020
4186055
Made vscode integration testing simpler
nfx Jun 19, 2020
4dff4d1
added read support for username/password for config files
nfx Jun 19, 2020
1278b62
Made errors concise and explainable
nfx Jun 19, 2020
c580ad7
Fix formatting issue
nfx Jun 19, 2020
c42d4c4
cleaned up token request structs
nfx Jun 21, 2020
457ef82
make fmt
nfx Jun 21, 2020
0e235c1
Added some documentation
nfx Jun 21, 2020
3fd44f7
keying composite literals
nfx Jun 21, 2020
97361c6
Add resource wait retry for workspace create
nfx Jun 19, 2020
285995a
Made vscode integration testing simpler
nfx Jun 19, 2020
9b09084
added read support for username/password for config files
nfx Jun 19, 2020
4b931c9
Made errors concise and explainable
nfx Jun 19, 2020
0ea8c51
Fix formatting issue
nfx Jun 19, 2020
8b17019
cleaned up token request structs
nfx Jun 21, 2020
a64ba56
make fmt
nfx Jun 21, 2020
94ac0e3
Added some documentation
nfx Jun 21, 2020
0f68309
keying composite literals
nfx Jun 21, 2020
a076e0d
Add missing resource check in resourceClusterPolicyRead
nfx Jun 22, 2020
e631e3c
Apply review comments
nfx Jun 22, 2020
2c063c6
Merge branch 'fixes' of github.com:databrickslabs/terraform-provider-…
nfx Jun 22, 2020
bbb9a7e
More correct implementation of 404-check
nfx Jun 22, 2020
05d59ec
Make README more user-friendly
nfx Jun 22, 2020
7e8f0ff
More links
nfx Jun 22, 2020
3a2d754
added integration test to verify that all apis can either handle 404s…
stikkireddy Jun 25, 2020
a8747e0
added skip for testMissingWorkspaceResources to run only if TF_ACC is…
stikkireddy Jun 25, 2020
94580cb
Merge branches 'fixes' and 'master' of github.com:databrickslabs/terr…
stikkireddy Jun 25, 2020
7cdaa55
cleaned up makefile
stikkireddy Jun 25, 2020
742ded6
refactored tokenexpirytime to the api client config so the client is …
stikkireddy Jun 25, 2020
bbb4c75
added a int test missing cluster policy
stikkireddy Jun 25, 2020
2a69c2d
adjusted the headers on the index and added the id attribute for clus…
stikkireddy Jun 25, 2020
69ad2fd
corrected typos
stikkireddy Jun 25, 2020
0ecd736
fix another credentials typo
stikkireddy Jun 25, 2020
56 changes: 32 additions & 24 deletions .vscode/launch.json
@@ -1,25 +1,33 @@
 {
-    // Use IntelliSense to learn about possible attributes.
-    // Hover to view descriptions of existing attributes.
-    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
-    "version": "0.2.0",
-    "configurations": [
-        {
-            "name": "Launch test function",
-            "type": "go",
-            "request": "launch",
-            "mode": "test",
-            "program": "${workspaceRoot}/databricks/resource_databricks_azure_adls_gen2_mount_test.go",
-            "args": [
-                "-test.v",
-                "-test.run",
-                "TestAccAzureAdlsGen2Mount_capture_error"
-            ],
-            "env": {
-                "TF_ACC" : "1"
-                // "TEST_RESOURCE_GROUP" : "${env:TEST_RESOURCE_GROUP}"
-            },
-            "showLog": true
-        }
-    ]
-}
+    // Use IntelliSense to learn about possible attributes.
+    // Hover to view descriptions of existing attributes.
+    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
+    "version": "0.2.0",
+    "configurations": [
+        {
+            "name": "Launch test function",
+            "type": "go",
+            "request": "launch",
+            "mode": "test",
+            "program": "${file}",
+            "args": [
+                "-test.v",
+                "-test.run",
+                "${selectedText}"
+            ],
+            "dlvLoadConfig": {
+                "followPointers": true,
+                "maxVariableRecurse": 1,
+                "maxStringLen": 64,
+                "maxArrayValues": 64,
+                "maxStructFields": -1
+            },
+            "env": {
+                "TF_ACC": "1",
+                "DATABRICKS_CONFIG_PROFILE": "sandbox"
+                // "TEST_RESOURCE_GROUP" : "${env:TEST_RESOURCE_GROUP}"
+            },
+            "showLog": true
+        }
+    ]
+}
1 change: 1 addition & 0 deletions .vscode/settings.json
@@ -1,4 +1,5 @@
 {
+    "go.testFlags": ["-v"],
     "go.delveConfig": {
         "dlvLoadConfig": {
             "followPointers": true,
209 changes: 208 additions & 1 deletion CONTRIBUTING.md
@@ -1 +1,208 @@
We happily welcome contributions to databricks-terraform. We use GitHub Issues to track community reported issues and GitHub Pull Requests for accepting changes.
Contributing to Databricks Terraform Provider
---

- [Contributing to Databricks Terraform Provider](#contributing-to-databricks-terraform-provider)
- [Install using go](#install-using-go)
- [Downloading the source code and installing the artifact](#downloading-the-source-code-and-installing-the-artifact)
- [Developing with Visual Studio Code Devcontainers](#developing-with-visual-studio-code-devcontainers)
- [Building and Installing with Docker](#building-and-installing-with-docker)
- [Testing](#testing)
- [Linting](#linting)
- [Integration Testing](#integration-testing)
- [Project Components](#project-components)
- [Databricks Terraform Provider Resources State](#databricks-terraform-provider-resources-state)
- [Databricks Terraform Data Sources State](#databricks-terraform-data-sources-state)

We happily welcome contributions to databricks-terraform. We use GitHub Issues to track community reported issues and GitHub Pull Requests for accepting changes.

## Install using go

Note that the repository's Makefile contains all the commands you need to work with this project.

Contributing to this code base requires the following software (all of which is preconfigured in the [Visual Studio Code Devcontainer](#developing-with-visual-studio-code-devcontainers)):

* [golang 1.13.X](https://golang.org/dl/)
* [terraform v0.12.x](https://www.terraform.io/downloads.html)
* make command

To verify that everything is installed correctly, run the following commands:

Testing go installation:
```bash
$ go version
go version go1.13.3 darwin/amd64
```

Testing terraform installation:
```bash
$ terraform --version
Terraform v0.12.19

Your version of Terraform is out of date! The latest version
is 0.12.24. You can update by downloading from https://www.terraform.io/downloads.html

```

Testing make installation:
```bash
$ make --version
GNU Make 3.81
Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.
There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE.

This program built for i386-apple-darwin11.3.0
```

## Downloading the source code and installing the artifact

* After installing `golang`, `terraform`, and `make`, download the source code and build the artifact:

```bash
$ go get -v -u github.com/databrickslabs/databricks-terraform && cd $GOPATH/src/github.com/databrickslabs/databricks-terraform
```

:warning: If you are fetching from a private repository please use the following command:

```bash
$ GOSUMDB=off GOPROXY=direct go get -v -u github.com/databrickslabs/databricks-terraform && cd $GOPATH/src/github.com/databrickslabs/databricks-terraform
```

* From the root directory of the repository, run:

```bash
$ make build
```

* Locate your [terraform plugins directory](https://www.terraform.io/docs/extend/how-terraform-works.html#plugin-locations)
or the root folder of your terraform code

* Copy the `terraform-provider-databricks` artifact to that terraform plugins location:

```bash
$ mkdir -p ~/.terraform.d/plugins/ && cp terraform-provider-databricks ~/.terraform.d/plugins/terraform-provider-databricks
```

Your Databricks Terraform provider plugin is now installed correctly, and you can start using the provider.
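With the plugin in place, a minimal Terraform configuration can reference the provider. The sketch below is illustrative only: the host and token values are placeholders, and the `databricks_token` arguments are an assumption about one of the simpler resources listed later in this document.

```hcl
# Illustrative sketch only — host/token are placeholders and can
# alternatively be supplied via DATABRICKS_HOST / DATABRICKS_TOKEN.
provider "databricks" {
  host  = "https://<your-workspace-host>"
  token = "<personal-access-token>"
}

# Hypothetical example resource to confirm the provider loads.
resource "databricks_token" "example" {
  comment          = "terraform-managed token"
  lifetime_seconds = 3600
}
```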

## Developing with Visual Studio Code Devcontainers

This project has configuration for working with [Visual Studio Code Devcontainers](https://code.visualstudio.com/docs/remote/containers) - this allows you to containerise your development prerequisites (e.g. golang, terraform). To use this you will need [Visual Studio Code](https://code.visualstudio.com/) and [Docker](https://www.docker.com/products/docker-desktop).

To get started, clone this repo and open the folder with Visual Studio Code. If you don't have the [Remote Development extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack) then you should be prompted to install it.

Once the folder is loaded and the extension is installed, you should be prompted to re-open the folder in a devcontainer. This will build and run the container image with the correct tools (and versions) ready for working on and building the code. The built-in terminal will launch a shell inside the container for running `make` commands and so on.

See the docs for more details on working with [devcontainers](https://code.visualstudio.com/docs/remote/containers).

## Building and Installing with Docker

If you don't want to install golang, terraform, and the other prerequisites, you can build and install the code with nothing but docker and git.

First, clone the repository and change into its directory.

Then build the docker image with this command:

```bash
$ docker build -t databricks-terraform .
```

Then execute the terraform binary through the following commands, which volume-mount your current directory into the container. Make sure you are in the directory with your terraform code; you can run the commands below, as well as any other terraform subcommands, in the same way:

```bash
$ docker run -it -v $(pwd):/workspace -w /workspace databricks-terraform init
$ docker run -it -v $(pwd):/workspace -w /workspace databricks-terraform plan
$ docker run -it -v $(pwd):/workspace -w /workspace databricks-terraform apply
```

## Testing

* [ ] Integration tests should be run at a client level against both azure and aws to maintain sdk parity against both apis **(currently only on one cloud)**
* [x] Terraform acceptance tests should be run against both aws and azure to maintain parity of provider between both cloud services **(currently only on one cloud)**

## Linting

Please use the Makefile for linting. Running `golangci-lint` by itself will fail because different build tags contain the same function names, so run `make lint` instead.

## Integration Testing

Databricks currently supports two cloud providers, `azure` and `aws`, so integration testing against the correct cloud service provider is crucial for making sure that the provider behaves as expected on all supported clouds. This separation is managed via build tags, which allow for duplicate method names and for environment variables to configure clients.
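As an illustration of the `CLOUD_ENV` convention (the helper below is hypothetical, not part of the provider's actual code), a test harness might validate the variable before deciding which cloud's tests to run:

```go
package main

import (
	"fmt"
	"os"
)

// cloudFromEnv validates a CLOUD_ENV value, following the convention
// described above. An empty or unrecognized value yields an error
// rather than a silent default.
func cloudFromEnv(value string) (string, error) {
	switch value {
	case "azure", "aws":
		return value, nil
	default:
		return "", fmt.Errorf("CLOUD_ENV must be azure or aws, got %q", value)
	}
}

func main() {
	cloud, err := cloudFromEnv(os.Getenv("CLOUD_ENV"))
	if err != nil {
		fmt.Println("skipping integration tests:", err)
		return
	}
	fmt.Println("running integration tests against", cloud)
}
```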

The current integration test implementation uses the `CLOUD_ENV` environment variable, which can be set to `azure` or `aws`. You can execute the acceptance tests with the make commands `make terraform-acc-azure` and `make terraform-acc-aws` for Azure and AWS respectively.

This involves bootstrapping the provider via `.env` configuration files. Without these files in the root directory, the tests will fail because the provider will not have an authorized token and host.

The configuration file for `aws` should be like the following and be named `.aws.env`:
```.env
DATABRICKS_HOST=<host>
DATABRICKS_TOKEN=<token>
```

The configuration file for `azure` should be like the following and be named `.azure.env`:
```.env
DATABRICKS_AZURE_CLIENT_ID=<enterprise app client id>
DATABRICKS_AZURE_CLIENT_SECRET=<enterprise app client secret>
DATABRICKS_AZURE_TENANT_ID=<azure ad tenant id>
DATABRICKS_AZURE_SUBSCRIPTION_ID=<azure subscription id>
DATABRICKS_AZURE_RESOURCE_GROUP=<resource group where the workspace is>
AZURE_REGION=<region where the workspace is>
DATABRICKS_AZURE_MANAGED_RESOURCE_GROUP=<azure databricks managed resource group for workspace>
DATABRICKS_AZURE_WORKSPACE_NAME=<azure databricks workspace name>
```

Note that the Azure integration tests use service-principal-based authentication. Even though a service principal is used, it still generates a personal access token to perform the creation of resources.


## Project Components

### Databricks Terraform Provider Resources State

| Resource | Implemented | Import Support | Acceptance Tests | Documentation | Reviewed | Finalize Schema |
|----------------------------------|--------------------|----------------------|----------------------|----------------------|----------------------|----------------------|
| databricks_token | :white_check_mark: | :white_large_square: | :white_check_mark: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_secret_scope | :white_check_mark: | :white_large_square: | :white_check_mark: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_secret | :white_check_mark: | :white_large_square: | :white_check_mark: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_secret_acl | :white_check_mark: | :white_large_square: | :white_check_mark: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_instance_pool | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_scim_user | :white_check_mark: | :white_large_square: | :white_check_mark: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_scim_group | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_notebook | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_cluster | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_job | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_dbfs_file | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_dbfs_file_sync | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_instance_profile | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_aws_s3_mount | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_azure_blob_mount | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_azure_adls_gen1_mount | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |
| databricks_azure_adls_gen2_mount | :white_check_mark: | :white_large_square: | :white_large_square: | :white_check_mark: | :white_large_square: | :white_large_square: |

### Databricks Terraform Data Sources State

| Data Source | Implemented | Acceptance Tests | Documentation | Reviewed |
|-----------------------------|----------------------|----------------------|----------------------|----------------------|
| databricks_notebook | :white_check_mark: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_notebook_paths | :white_check_mark: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_dbfs_file | :white_check_mark: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_dbfs_file_paths | :white_check_mark: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_zones | :white_large_square: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_runtimes | :white_large_square: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_instance_pool | :white_large_square: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_scim_user | :white_large_square: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_scim_group | :white_large_square: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_cluster | :white_large_square: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_job | :white_large_square: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_mount | :white_large_square: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_instance_profile | :white_large_square: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_database | :white_large_square: | :white_large_square: | :white_large_square: | :white_large_square: |
| databricks_table | :white_large_square: | :white_large_square: | :white_large_square: | :white_large_square: |
7 changes: 0 additions & 7 deletions Makefile
@@ -48,9 +48,6 @@ vendor:
 	@echo "==> Filling vendor folder with library code..."
 	@go mod vendor
 
-local-install: build
-	mv terraform-provider-databricks $(HOME)/.terraform.d/plugins/terraform-provider-databricks_v0.2.0
-
 # INTEGRATION TESTING WITH AZURE
 terraform-acc-azure: lint
 	@echo "==> Running Terraform Acceptance Tests for Azure..."
@@ -82,8 +79,4 @@ hugo:
 	@echo "==> Making Docs..."
 	@cd website && hugo -d ../docs/
 
-internal-docs-sync:
-	@echo "==> Uploading Website..."
-	@azcopy login --service-principal --application-id $(AZCOPY_SPA_CLIENT_ID) --tenant-id=$(AZCOPY_SPA_TENANT_ID) && azcopy sync './website/public' '$(AZCOPY_STORAGE_ACCT)' --recursive
-
 .PHONY: build fmt python-setup docs vendor terraform-local build fmt coverage test lint