Fix more doc issues (#6072)

tuliren authored Sep 15, 2021
1 parent aca8c50 commit 2390b54
Showing 3 changed files with 6 additions and 6 deletions.
@@ -39,7 +39,7 @@
"title": "Databricks Personal Access Token",
"type": "string",
"description": "",
-    "examples": [""],
+    "examples": ["dapi0123456789abcdefghij0123456789AB"],
"airbyte_secret": true
},
"database_schema": {
@@ -1,7 +1,6 @@
# Lever Hiring Source

This is the repository for the Lever Hiring source connector, written in Python.
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/lever-hiring).

## Local development

9 changes: 5 additions & 4 deletions docs/integrations/destinations/databricks.md
@@ -17,9 +17,10 @@ Due to legal reasons, this is currently a private connector that is only availab
| Incremental - Dedupe Sync || |
| Namespaces || |

-## Configuration
+## Data Source
Databricks supports various cloud storage as the [data source](https://docs.databricks.com/data/data-sources/index.html). Currently, only Amazon S3 is supported.

-Databricks parameters
+## Configuration

| Category | Parameter | Type | Notes |
| :--- | :--- | :---: | :--- |
@@ -28,8 +29,8 @@ Databricks parameters
| | Port | string | Optional. Default to "443". See [documentation](https://docs.databricks.com/integrations/bi/jdbc-odbc-bi.html#get-server-hostname-port-http-path-and-jdbc-url). |
| | Personal Access Token | string | Required. See [documentation](https://docs.databricks.com/sql/user/security/personal-access-tokens.html). |
| General | Database schema | string | Optional. Default to "public". Each data stream will be written to a table under this database schema. |
-| | Purge Staging Files and Tables | The connector creates staging files and tables on S3. By default they will be purged when the data sync is complete. Set it to `false` for debugging purpose. |
-| S3 | Bucket Name | string | Name of the bucket to sync data into. |
+| | Purge Staging Data | boolean | The connector creates staging files and tables on S3. By default they will be purged when the data sync is complete. Set it to `false` for debugging purpose. |
+| Data Source - S3 | Bucket Name | string | Name of the bucket to sync data into. |
| | Bucket Path | string | Subdirectory under the above bucket to sync the data into. |
| | Region | string | See [documentation](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html#concepts-available-regions) for all region codes. |
| | Access Key ID | string | AWS/Minio credential. |
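For orientation, the parameters in the table above could combine into a destination configuration along the lines of the sketch below. The JSON key names here are illustrative assumptions (the connector's actual spec field names are not shown in this diff), and all values are placeholders except the documented defaults:

```json
{
  "serverHostname": "abc-12345678-wxyz.cloud.databricks.com",
  "port": "443",
  "personalAccessToken": "dapi0123456789abcdefghij0123456789AB",
  "databaseSchema": "public",
  "purgeStagingData": true,
  "s3": {
    "bucketName": "my-staging-bucket",
    "bucketPath": "airbyte-sync",
    "region": "us-east-1",
    "accessKeyId": "AKIAXXXXXXXXXXXXXXXX"
  }
}
```

Note that `purgeStagingData` defaults to `true` per the table; set it to `false` only when you need to inspect the staging files and tables for debugging.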