Add BigLake API Table Resource (GoogleCloudPlatform#8754)
* Add BigLake Table Resource

* add: biglake table example

* add: biglake table update test

* update: hive_options

* fix: wrong directory

* trim: trailing whitespace

* fix: incorrect example location

* delete: incorrect file

* add: acc tests for biglake table

* Add SDK provider configuration tests, part 2 (GoogleCloudPlatform#7723)

* Add tests for `providerConfigure` testing `billing_project` config

* Fix `billing_project` test

* Add tests for `providerConfigure` testing `region` config

* Add tests for `providerConfigure` testing `zone` config

* Add tests for `providerConfigure` testing `user_project_override` config

* Remove old redundant `TestHandleSDKDefaults_*` tests

* Update code following creation of `transport` package

* Reposition `transport_tpg` in imports

* Add WIP of scopes test

* Fix defect in test case

* Fix scopes test

* Implement test cases for user config including zero values, and update tests to assert when a field should be unset but is found in config

* Make test error messages clearer w.r.t. provider config from user versus derived Config struct

* Fix import issue after rebase

* Add comments separating types of test case, add cases for empty strings overwritten by ENVs

* Rebase PR to pull in changes, including service package split

* Add test cases that show empty strings are ignored and ENVs are used instead

* Update text in test failure messages, add comments to signpost different types of test case

* Make lack of error more explicit in test case

* Fix import issues from rebasing branch

* update use of `ioutil` to `os`

* Make imports match what's on main

* Update `testFakeCredentialsPath` to `transport_tpg.TestFakeCredentialsPath`

* Add missing comma

* Move file into `provider_test` package, update imports and remove duplicate code

* Fix defect when trying to access `CredentialsFromJSON` function

* Add missing function calls to set ENVs used in tests

* Update tests for region field, add test case for use of self links

* Add test case for zone field; self links are not shortened and remain usable

* Remove test case; `ConflictsWith` on fields not testable at this level

* Update test case names, add comments

* TeamCity: Usability improvements: tag builds to distinguish nightly builds vs ad-hoc builds, add project descriptions (GoogleCloudPlatform#8685)

* Add ability to tag TeamCity builds based on whether they're automated or ad-hoc. Nightly builds tagged with the date.

* Add ability to set project descriptions using a context parameter

* Refactor how date is formatted, to avoid a problem where TeamCity interprets `%Y-%` as interpolating a `Y-` parameter

* Remove use of `TRIGGERED_BY`; value in build didn't match UI and isn't useful

* Update tag for nightly test builds to be static/consistent

* Add ignore read on reservedIpRange field for Filestore Instance (GoogleCloudPlatform#8520)

Co-authored-by: Shuya Ma <87669292+shuyama1@users.noreply.github.com>

* b-283271112 Add "additionalScopes" under webSsoConfig for the "groups" (GoogleCloudPlatform#8744)

* Made breaking change detector own its own setup and build process (GoogleCloudPlatform#8705)

* Made breaking change detector own its own setup and build process

* Moved breaking change detector unit tests to github action

* Corrected breaking change detector unit test setup

* Added back package name updates for tpgb new

* made unit tests use a shallow clone

* Limit breaking change detector unit tests to runs that modify the tool

Co-authored-by: Scott Suarez <ScottSuarez@google.com>

* Update .ci/scripts/go-plus/github-differ/generate_comment.sh

Co-authored-by: Scott Suarez <ScottSuarez@google.com>

* Minor Cleanup

* Intentionally broke breaking change detector

* Revert "Intentionally broke breaking change detector"

This reverts commit bcb6ba8.

* Intentionally caused panic in breaking change detector at runtime

* Made a breaking change

* Added additional logging

* Removed export in generate_comment.sh

`export` hides the exit code of the command being run; assignment on its own does not. `export` is not required in the context of a shell script.

* Made failure get set to 1 instead of $?

* Added bin/ cleanup

* Revert "Intentionally caused panic in breaking change detector at runtime"

This reverts commit a16c0cd.

* Fixed package name replacement for google-beta

* Re-added export of TPG/TPGB BREAKING

* Added comment explaining the export location

* Revert "Made a breaking change"

This reverts commit 2deecd7.

---------

Co-authored-by: Scott Suarez <ScottSuarez@google.com>

* Go changelog (GoogleCloudPlatform#8727)

* Add copy of go-changelog to tools

* Add changelog checker to pre-build validation

* Log errors instead of commenting

* Move check changelog to separate workflow

* Mark lines for removal

* Remove unused go-changelog tools

* Apply suggestions from code review

Co-authored-by: Scott Suarez <ScottSuarez@google.com>

* Make old changelog checker a no-op

---------

Co-authored-by: Scott Suarez <ScottSuarez@google.com>

* Fix acctest import specifying the beta version instead of GA (GoogleCloudPlatform#8736)

* Cleanup deprecated folders (GoogleCloudPlatform#8720)

* Remove folder resources and data_sources

* Remove utility files

* Move more utility files

* Fix tgc

* Import provider package in tgc

* Remove utils files

* Remove test lines from changelog checker (GoogleCloudPlatform#8763)

* Adds the synthetic monitor target type as an option for uptime checks. (GoogleCloudPlatform#8709)

* Adds synthetic_monitor type that lives alongside resource_group and monitored_resource
* removes requirement for either tcp_check or http_check to be provided, as neither is required when synthetic_monitor is provided
* Adds acceptance test, and example. A new test fixture w/ zip file is provided for these flows.

* Removed guard around operation WithResponse method & improved error handling (GoogleCloudPlatform#8762)

* Removed guard around operation WithResponse method

Resolved hashicorp/terraform-provider-google#15618

* Handle missing resource inside response more gracefully

* Add zli82016 to the vacation list (GoogleCloudPlatform#8761)

* Update example for google_compute_addresses data source (GoogleCloudPlatform#8765)

* Remove myself from vacation list (GoogleCloudPlatform#8770)

* Document database flags type change (GoogleCloudPlatform#8769)

* Added some missing what's next links (GoogleCloudPlatform#8733)

* Added some missing what's next links

* reverted changes to run-tests.md

* Add: BigLake API Catalog Resource (GoogleCloudPlatform#8751)

Co-authored-by: Shuya Ma <87669292+shuyama1@users.noreply.github.com>

* Add workflow for membership checker unit tests (GoogleCloudPlatform#8760)

* Add workflow for membership checker unit tests

* Apply suggestions from code review

Co-authored-by: Scott Suarez <ScottSuarez@google.com>

---------

Co-authored-by: Scott Suarez <ScottSuarez@google.com>

* Key upload (GoogleCloudPlatform#8714)

* add oidc key upload

* fix a bug

* fix a typo

* add example

* Update mmv1/products/iamworkforcepool/WorkforcePoolProvider.yaml

Co-authored-by: Stephen Lewis (Burrows) <stephen.r.burrows@gmail.com>

* remove unnecessary test

* add a new line

* fix a bug

* Update mmv1/products/iamworkforcepool/WorkforcePoolProvider.yaml

Co-authored-by: Stephen Lewis (Burrows) <stephen.r.burrows@gmail.com>

---------

Co-authored-by: Stephen Lewis (Burrows) <stephen.r.burrows@gmail.com>

* Marks `template.volumes.secret.items.mode` field as not required in Cloud Run V2 resources (GoogleCloudPlatform#8771)

* removed require

* remove required for job

* Removed MembershipRBACRoleBinding from ga provider (GoogleCloudPlatform#8776)

* make distribution_policy_target_shape updatable (GoogleCloudPlatform#8774)

Co-authored-by: Edward Sun <sunedward@google.com>

* feat(google_container_cluster): support fqdn network policy (GoogleCloudPlatform#8461)

Signed-off-by: Tsubasa Nagasawa <toversus2357@gmail.com>

* Redis cluster terraform support (GoogleCloudPlatform#8567)

* Redis cluster terraform support

* updating cluster resource

* updating cluster resource

* updating cluster resource

* updating cluster resource

* updating cluster resource

* updating cluster resource

* update

* update

* update

* update

* update

* update

* update

* update

* update SCPolicy

* Update Service connection policy

* Update Service connection policy

---------

Co-authored-by: Himani Khanduja <khimani@google.com>

* move: Table.yaml to `biglake`

* update: switch to parent id convention

* update: example to use parent id convention

* update: continue to update to parent convention

* add: biglake table custom import

* fix: correct usage of names

* fix: extra comma

* fix: parameter spec.

* fix: parameters spec

* delete: remove location param

* fix: build

* fix: correct url

* add: spec custom import

* update: switch to id_format over custom import

* fix: remove location from resource test.

* fix: test names

* add: serde_info to acc test

* fix: serde_info is a struct

* remove: serde_info due to api bug.

* delete: unsupported serdeinfo field

* fix: use example person names

* update: mark `database` immutable

* update: change more fields in update test

---------

Signed-off-by: Tsubasa Nagasawa <toversus2357@gmail.com>
Co-authored-by: Sarah French <15078782+SarahFrench@users.noreply.github.com>
Co-authored-by: Baruch Steinberg <baruch.steinberg@gmail.com>
Co-authored-by: Shuya Ma <87669292+shuyama1@users.noreply.github.com>
Co-authored-by: vaibhav-google <142835342+vaibhav-google@users.noreply.github.com>
Co-authored-by: Stephen Lewis (Burrows) <stephenrlewis@google.com>
Co-authored-by: Scott Suarez <ScottSuarez@google.com>
Co-authored-by: Thomas Rodgers <thomasrodgers@google.com>
Co-authored-by: Ryan Oaks <ryanoaks@google.com>
Co-authored-by: Zhenhua Li <zhenhuali@google.com>
Co-authored-by: Daniel Koss <66844903+dkoss@users.noreply.github.com>
Co-authored-by: Ryan White <4404175+alzabo@users.noreply.github.com>
Co-authored-by: Riley Karson <rileykarson@google.com>
Co-authored-by: Sean McGivern <27fv8yygye@snkmail.com>
Co-authored-by: bohengy <108434983+bohengy@users.noreply.github.com>
Co-authored-by: Stephen Lewis (Burrows) <stephen.r.burrows@gmail.com>
Co-authored-by: Yanwei Guo <yanweiguo@google.com>
Co-authored-by: Edward Sun <42220489+edwardmedia@users.noreply.github.com>
Co-authored-by: Edward Sun <sunedward@google.com>
Co-authored-by: Tsubasa Nagasawa <toversus2357@gmail.com>
Co-authored-by: himanikh <himani.arora78@gmail.com>
Co-authored-by: Himani Khanduja <khimani@google.com>
22 people authored and simonebruzzechesse committed Sep 7, 2023
1 parent d27905a commit 7ce36ca
Showing 3 changed files with 301 additions and 0 deletions.
134 changes: 134 additions & 0 deletions mmv1/products/biglake/Table.yaml
@@ -0,0 +1,134 @@
# Copyright 2023 Google Inc.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

--- !ruby/object:Api::Resource
name: "Table"
description: |
  Represents a table.
references: !ruby/object:Api::Resource::ReferenceLinks
  guides:
    "Manage open source metadata with BigLake Metastore": "https://cloud.google.com/bigquery/docs/manage-open-source-metadata#create_tables"
  api: "https://cloud.google.com/bigquery/docs/reference/biglake/rest/v1/projects.locations.catalogs.databases.tables"
base_url: "{{database}}/tables"
self_link: "{{database}}/tables/{{name}}"
create_url: "{{database}}/tables?tableId={{name}}"
id_format: "{{database}}/tables/{{name}}"
import_format: ["{{%database}}/tables/{{name}}"]
update_verb: :PATCH
update_mask: true
examples:
  - !ruby/object:Provider::Terraform::Examples
    name: "biglake_table"
    primary_resource_id: "table"
    vars:
      name: "my_table"
      catalog: "my_catalog"
      database: "my_database"
      bucket: "my_bucket"

parameters:
  - !ruby/object:Api::Type::String
    name: "name"
    required: true
    immutable: true
    url_param_only: true
    description: |
      Output only. The name of the Table. Format:
      projects/{project_id_or_number}/locations/{locationId}/catalogs/{catalogId}/databases/{databaseId}/tables/{tableId}
  - !ruby/object:Api::Type::String
    name: "database"
    immutable: true
    url_param_only: true
    description: |
      The id of the parent database.
properties:
  - !ruby/object:Api::Type::String
    name: "createTime"
    description: |
      Output only. The creation time of the table. A timestamp in RFC3339 UTC
      "Zulu" format, with nanosecond resolution and up to nine fractional
      digits. Examples: "2014-10-02T15:01:23Z" and
      "2014-10-02T15:01:23.045123456Z".
    output: true
  - !ruby/object:Api::Type::String
    name: "updateTime"
    description: |
      Output only. The last modification time of the table. A timestamp in
      RFC3339 UTC "Zulu" format, with nanosecond resolution and up to nine
      fractional digits. Examples: "2014-10-02T15:01:23Z" and
      "2014-10-02T15:01:23.045123456Z".
    output: true
  - !ruby/object:Api::Type::String
    name: "deleteTime"
    description: |
      Output only. The deletion time of the table. Only set after the
      table is deleted. A timestamp in RFC3339 UTC "Zulu" format, with
      nanosecond resolution and up to nine fractional digits. Examples:
      "2014-10-02T15:01:23Z" and "2014-10-02T15:01:23.045123456Z".
    output: true
  - !ruby/object:Api::Type::String
    name: "expireTime"
    description: |
      Output only. The time when this table is considered expired. Only set
      after the table is deleted. A timestamp in RFC3339 UTC "Zulu" format,
      with nanosecond resolution and up to nine fractional digits. Examples:
      "2014-10-02T15:01:23Z" and "2014-10-02T15:01:23.045123456Z".
    output: true
  - !ruby/object:Api::Type::String
    name: "etag"
    description: |
      The checksum of a table object computed by the server based on the value
      of other fields. It may be sent on update requests to ensure the client
      has an up-to-date value before proceeding. It is only checked for update
      table operations.
    output: true

  - !ruby/object:Api::Type::Enum
    name: "type"
    description: |
      The table type.
    values:
      - :HIVE
  - !ruby/object:Api::Type::NestedObject
    name: "hiveOptions"
    description: |
      Options of a Hive table.
    properties:
      - !ruby/object:Api::Type::KeyValuePairs
        name: "parameters"
        description: |
          Stores user supplied Hive table parameters. An object containing a
          list of "key": value pairs.
          Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }.
      - !ruby/object:Api::Type::String
        name: "tableType"
        description: |
          Hive table type. For example, MANAGED_TABLE, EXTERNAL_TABLE.
      - !ruby/object:Api::Type::NestedObject
        name: "storageDescriptor"
        description: |
          Stores physical storage information on the data.
        properties:
          - !ruby/object:Api::Type::String
            name: "locationUri"
            description: |
              Cloud Storage folder URI where the table data is stored, starting with "gs://".
          - !ruby/object:Api::Type::String
            name: "inputFormat"
            description: |
              The fully qualified Java class name of the input format.
          - !ruby/object:Api::Type::String
            name: "outputFormat"
            description: |
              The fully qualified Java class name of the output format.
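
Since `id_format` and `import_format` resolve to the parent database's full resource name plus `/tables/{{name}}`, an existing table should be importable by that path once the provider is generated. A minimal sketch using a Terraform 1.5+ import block; the project, location, catalog, database, and table names below are hypothetical placeholders:

import {
  to = google_biglake_table.table
  id = "projects/my-project/locations/US/catalogs/my_catalog/databases/my_database/tables/my_table"
}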
61 changes: 61 additions & 0 deletions mmv1/templates/terraform/examples/biglake_table.tf.erb
@@ -0,0 +1,61 @@
resource "google_biglake_catalog" "catalog" {
name = "<%= ctx[:vars]['catalog'] %>"
location = "US"
}

resource "google_storage_bucket" "bucket" {
name = "<%= ctx[:vars]['bucket'] %>"
location = "US"
force_destroy = true
uniform_bucket_level_access = true
}

resource "google_storage_bucket_object" "metadata_folder" {
name = "metadata/"
content = " "
bucket = google_storage_bucket.bucket.name
}


resource "google_storage_bucket_object" "data_folder" {
name = "data/"
content = " "
bucket = google_storage_bucket.bucket.name
}

resource "google_biglake_database" "database" {
name = "<%= ctx[:vars]['database'] %>"
catalog = google_biglake_catalog.catalog.id
type = "HIVE"
hive_options {
location_uri = "gs://${google_storage_bucket.bucket.name}/${google_storage_bucket_object.metadata_folder.name}"
parameters = {
"owner" = "Alex"
}
}
}

resource "google_biglake_table" "<%= ctx[:primary_resource_id] %>" {
name = "<%= ctx[:vars]['name'] %>"
database = google_biglake_database.database.id
type = "HIVE"
hive_options {
table_type = "MANAGED_TABLE"
storage_descriptor {
location_uri = "gs://${google_storage_bucket.bucket.name}/${google_storage_bucket_object.data_folder.name}"
input_format = "org.apache.hadoop.mapred.SequenceFileInputFormat"
output_format = "org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat"
}
# Some Example Parameters.
parameters = {
"spark.sql.create.version" = "3.1.3"
"spark.sql.sources.schema.numParts" = "1"
"transient_lastDdlTime" = "1680894197"
"spark.sql.partitionProvider" = "catalog"
"owner" = "John Doe"
"spark.sql.sources.schema.part.0"= "{\"type\":\"struct\",\"fields\":[{\"name\":\"id\",\"type\":\"integer\",\"nullable\":true,\"metadata\":{}},{\"name\":\"name\",\"type\":\"string\",\"nullable\":true,\"metadata\":{}},{\"name\":\"age\",\"type\":\"integer\",\"nullable\":true,\"metadata\":{}}]}"
"spark.sql.sources.provider" = "iceberg"
"provider" = "iceberg"
}
}
}
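
Because `createTime`, `updateTime`, and `etag` are marked `output: true` in Table.yaml, the generated resource should expose them as read-only attributes, presumably under the usual magic-modules snake_case naming. A hedged sketch of reading them back from the example's primary resource:

# Read-only attributes assumed to be generated from the `output: true`
# fields declared in Table.yaml.
output "biglake_table_create_time" {
  value = google_biglake_table.table.create_time
}

output "biglake_table_etag" {
  value = google_biglake_table.table.etag
}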
106 changes: 106 additions & 0 deletions
@@ -0,0 +1,106 @@
package biglake_test

import (
	"testing"

	"github.com/hashicorp/terraform-plugin-sdk/v2/helper/resource"

	"github.com/hashicorp/terraform-provider-google/google/acctest"
)

func TestAccBiglakeTable_biglakeTable_update(t *testing.T) {
	t.Parallel()

	context := map[string]interface{}{
		"random_suffix": acctest.RandString(t, 10),
	}

	acctest.VcrTest(t, resource.TestCase{
		PreCheck:                 func() { acctest.AccTestPreCheck(t) },
		ProtoV5ProviderFactories: acctest.ProtoV5ProviderFactories(t),
		CheckDestroy:             testAccCheckBiglakeTableDestroyProducer(t),
		Steps: []resource.TestStep{
			{
				Config: testAccBiglakeTable_biglakeTableExample(context),
			},
			{
				ResourceName:            "google_biglake_table.table",
				ImportState:             true,
				ImportStateVerify:       true,
				ImportStateVerifyIgnore: []string{"name", "database"},
			},
			{
				Config: testAccBiglakeTable_biglakeTable_update(context),
			},
			{
				ResourceName:            "google_biglake_table.table",
				ImportState:             true,
				ImportStateVerify:       true,
				ImportStateVerifyIgnore: []string{"name", "database"},
			},
		},
	})
}

func testAccBiglakeTable_biglakeTable_update(context map[string]interface{}) string {
	return acctest.Nprintf(`
resource "google_biglake_catalog" "catalog" {
  name     = "tf_test_my_catalog%{random_suffix}"
  location = "US"
}

resource "google_storage_bucket" "bucket" {
  name                        = "tf_test_my_bucket%{random_suffix}"
  location                    = "US"
  force_destroy               = true
  uniform_bucket_level_access = true
}

resource "google_storage_bucket_object" "metadata_folder" {
  name    = "metadata/"
  content = " "
  bucket  = google_storage_bucket.bucket.name
}

resource "google_storage_bucket_object" "data_folder" {
  name    = "data/"
  content = " "
  bucket  = google_storage_bucket.bucket.name
}

resource "google_biglake_database" "database" {
  name    = "tf_test_my_database%{random_suffix}"
  catalog = google_biglake_catalog.catalog.id
  type    = "HIVE"
  hive_options {
    location_uri = "gs://${google_storage_bucket.bucket.name}/${google_storage_bucket_object.metadata_folder.name}"
    parameters = {
      "owner" = "Alex"
    }
  }
}

resource "google_biglake_table" "table" {
  name     = "tf_test_my_table%{random_suffix}"
  database = google_biglake_database.database.id
  type     = "HIVE"
  hive_options {
    table_type = "EXTERNAL_TABLE"
    storage_descriptor {
      location_uri  = "gs://${google_storage_bucket.bucket.name}/${google_storage_bucket_object.data_folder.name}/data"
      input_format  = "org.apache.hadoop.mapred.SequenceFileInputFormat2"
      output_format = "org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat2"
    }
    # Some example parameters.
    parameters = {
      # Bump the version.
      "spark.sql.create.version"          = "3.1.7"
      "spark.sql.sources.schema.numParts" = "1"
      # Update the time.
      "transient_lastDdlTime"             = "1680895000"
      "spark.sql.partitionProvider"       = "catalog"
      # Change the name.
      "owner"                             = "Dana"
      "spark.sql.sources.schema.part.0"   = "{\"type\":\"struct\",\"fields\":[{\"name\":\"id\",\"type\":\"integer\",\"nullable\":true,\"metadata\":{}},{\"name\":\"name\",\"type\":\"string\",\"nullable\":true,\"metadata\":{}},{\"name\":\"age\",\"type\":\"integer\",\"nullable\":true,\"metadata\":{}}]}"
      "spark.sql.sources.provider"        = "iceberg"
      "provider"                          = "iceberg"
    }
  }
}
`, context)
}
