V1.8.0b1 #30

Open · wants to merge 26 commits into base: main

Commits (26):
e5b503e  1.8.0 upgrade (prdpsvs, Jun 24, 2024)
a16618c  Shortcuts feature (prdpsvs, Jun 27, 2024)
f9c5321  Shortcuts feature (prdpsvs, Jun 27, 2024)
a1d9675  Adding tests related to shortcuts (prdpsvs, Jul 2, 2024)
29ee663  Adding support of shortcuts (prdpsvs, Dec 8, 2024)
1ebedcc  Adding support of shortcuts (prdpsvs, Dec 8, 2024)
1c5e2a4  write permissions to id-token in the workflow (prdpsvs, Dec 8, 2024)
ebe52ad  Added unit testing functional tests, updated integration.yaml fle to … (prdpsvs, Jan 4, 2025)
8b871f3  renamed dev-requirements to dev_requirements.txt (prdpsvs, Jan 4, 2025)
849f64e  renamed dev-requirements to dev_requirements.txt (prdpsvs, Jan 4, 2025)
67dc512  updated logger statements (prdpsvs, Jan 4, 2025)
7eb2d88  updated logger statements (prdpsvs, Jan 4, 2025)
c5a23b6  updated logger statements (prdpsvs, Jan 4, 2025)
3d08e20  updated logger statements (prdpsvs, Jan 4, 2025)
c8a65c7  updated logger statements (prdpsvs, Jan 4, 2025)
8b1e8e4  updated logger statements (prdpsvs, Jan 4, 2025)
9b20cf0  updated logger statements (prdpsvs, Jan 4, 2025)
e77352e  updated logger statements (prdpsvs, Jan 4, 2025)
fefa2e5  updated logger statements (prdpsvs, Jan 4, 2025)
3e9078e  updated logger statements (prdpsvs, Jan 4, 2025)
b465b99  updated logger statements (prdpsvs, Jan 4, 2025)
d53f1f2  updated logger statements (prdpsvs, Jan 4, 2025)
a8073f8  updated logger statements (prdpsvs, Jan 4, 2025)
eafb0c1  updated logger statements (prdpsvs, Jan 4, 2025)
550798a  updated logger statements (prdpsvs, Jan 4, 2025)
c12e1ef  updated logger statements (prdpsvs, Jan 4, 2025)
43 changes: 0 additions & 43 deletions .github/workflows/docs-issues.yml

This file was deleted.

87 changes: 57 additions & 30 deletions .github/workflows/integration.yml
@@ -9,49 +9,76 @@
# branch, and when manually triggered.

name: Adapter Integration Tests
on: # yamllint disable-line rule:truthy
on:
workflow_dispatch:
pull_request:
branches:
- 'stale*' # Currently setting this inactive as Fabric Spark adapter does not support SPN auth.
- "main"
- "*.latest"
- "v*"

# explicitly turn off permissions for `GITHUB_TOKEN`
permissions: read-all

# will cancel previous workflows triggered by the same event and for the same ref for PRs or same SHA otherwise
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ contains(github.event_name, 'pull_request_target') && github.event.pull_request.head.ref || github.sha }}
cancel-in-progress: true

defaults:
run:
shell: bash
jobs:
test:
name: ${{ matrix.test }}
integration-tests-fabric-dw:
name: Regular
runs-on: ubuntu-latest
permissions:
contents: read # Required to access repository files
packages: read # Grant explicit read access to packages
id-token: write # Needed if using OIDC authentication
strategy:
fail-fast: false
max-parallel: 1
matrix:
test:
- "az_spn"
test_file:
- tests/functional/adapter/basic/test_base.py
- tests/functional/adapter/basic/test_empty.py

steps:
- name: AZ CLI login
run: az login --service-principal --username="${AZURE_CLIENT_ID}" --password="${AZURE_CLIENT_SECRET}" --tenant="${AZURE_TENANT_ID}"
env:
AZURE_CLIENT_ID: ${{ secrets.DBT_FABRIC_SPARK_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.DBT_FABRIC_SPARK_CLIENT_SECRET }}
AZURE_TENANT_ID: ${{ secrets.DBT_FABRIC_SPARK_TENANT_ID }}
- name: Azure login with OIDC
uses: azure/login@v2
with:
client-id: ${{ secrets.DBT_AZURE_SP_NAME }}
tenant-id: ${{ secrets.DBT_AZURE_TENANT }}
allow-no-subscriptions: true
federated-token: true

- name: Fetch Access Token
id: fetch_token
run: |
pip install azure-identity pyodbc azure-core

python - <<EOF
from azure.core.credentials import AccessToken
from azure.identity import DefaultAzureCredential
import os
try:
credential = DefaultAzureCredential()
token = credential.get_token("https://analysis.windows.net/powerbi/api/.default")
with open("token.txt", "w") as file:
file.write(token.token)
print(f"::set-output name=access_token::{token.token}")
except Exception as e:
raise RuntimeError(f"Failed to fetch token: {e}")
EOF

- name: Upload token.txt
uses: actions/upload-artifact@v4
with:
name: token-file
path: token.txt

- uses: actions/checkout@v4

- name: Install dependencies
run: pip install -r dev_requirements.txt

- name: Run functional tests
- name: Run Functional Test ${{ matrix.test_file }}
env:
DBT_FABRIC_SPARK_WORKSPACE_ID: ${{ secrets.DBT_FABRIC_SPARK_WORKSPACE_ID }}
DBT_FABRIC_SPARK_LAKEHOUSE_ID: ${{ secrets.DBT_FABRIC_SPARK_LAKEHOUSE_ID }}
DBT_FABRIC_SPARK_CLIENT_ID: ${{ secrets.DBT_FABRIC_SPARK_CLIENT_ID }}
DBT_FABRIC_SPARK_CLIENT_SECRET: ${{ secrets.DBT_FABRIC_SPARK_CLIENT_SECRET }}
DBT_FABRIC_SPARK_TENANT_ID: ${{ secrets.DBT_FABRIC_SPARK_TENANT_ID }}
run: pytest -ra -v tests/functional --profile "${{ matrix.test }}"
WORKSPACE_ID: ${{ secrets.WORKSPACE_ID }}
LAKEHOUSE_ID: ${{ secrets.LAKEHOUSE_ID }}
LAKEHOUSE_NAME: ${{ secrets.LAKEHOUSE_NAME }}
SCHEMA_NAME: ${{ secrets.LAKEHOUSE_NAME }}
CLIENT_ID: ${{ secrets.DBT_AZURE_SP_NAME }}
TENANT_ID: ${{ secrets.DBT_AZURE_TENANT }}
FABRIC_INTEGRATION_TESTS_TOKEN: ${{ steps.fetch_token.outputs.access_token }}
run: pytest -ra -v ${{ matrix.test_file }} --profile "int_tests"
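One nit in the token-fetch step above: `::set-output` has been deprecated by GitHub Actions in favor of appending `name=value` lines to the file named by `$GITHUB_OUTPUT`. A minimal Python sketch of that mechanism (the token value here is a placeholder; in the workflow it would come from `DefaultAzureCredential().get_token(...)`, and the temp file stands in for the runner-provided output file):

```python
import os
import tempfile


def publish_step_output(name: str, value: str) -> None:
    """Append a step output via the GITHUB_OUTPUT file, the supported
    replacement for the deprecated `::set-output` workflow command."""
    output_path = os.environ.get("GITHUB_OUTPUT")
    if not output_path:
        raise RuntimeError("GITHUB_OUTPUT is not set; not running under GitHub Actions?")
    with open(output_path, "a") as fh:
        fh.write(f"{name}={value}\n")


# Demo: a temporary file stands in for the runner-provided GITHUB_OUTPUT.
fd, path = tempfile.mkstemp()
os.close(fd)
os.environ["GITHUB_OUTPUT"] = path
publish_step_output("access_token", "dummy-token")
```

Downstream steps would then keep reading `${{ steps.fetch_token.outputs.access_token }}` unchanged.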
8 changes: 5 additions & 3 deletions .github/workflows/main.yml
@@ -23,7 +23,10 @@ on:
pull_request:
workflow_dispatch:

permissions: read-all
permissions:
contents: read # Required to access repository files
packages: read # Grant explicit read access to packages
id-token: write # Needed if using OIDC authentication

# will cancel previous workflows triggered by the same event and for the same ref for PRs or same SHA otherwise
concurrency:
@@ -63,8 +66,7 @@ jobs:
python -m pip install mypy==0.942
python -m pip install types-requests
mypy --version
python -m pip install -r requirements.txt
python -m pip install -r dev-requirements.txt
python -m pip install -r dev_requirements.txt
dbt --version

- name: Run pre-commit hooks
7 changes: 6 additions & 1 deletion .github/workflows/release.yml
@@ -21,6 +21,11 @@ on: # yamllint disable-line rule:truthy
tags:
- 'v*'

permissions:
contents: read # Required to access repository files
packages: read # Grant explicit read access to packages
id-token: write # Needed if using OIDC authentication

jobs:
release-version:
name: Release new version
@@ -33,7 +38,7 @@ jobs:
python-version: '3.11'

- name: Install dependencies
run: pip install -r dev-requirements.txt
run: pip install -r dev_requirements.txt

- name: Initialize .pypirc
run: |
44 changes: 22 additions & 22 deletions .pre-commit-config.yaml
@@ -38,26 +38,26 @@ repos:
- id: flake8
alias: flake8-check
stages: [manual]
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.2.0
hooks:
- id: mypy
# N.B.: Mypy is... a bit fragile.
#
# By using `language: system` we run this hook in the local
# environment instead of a pre-commit isolated one. This is needed
# to ensure mypy correctly parses the project.
# - repo: https://github.com/pre-commit/mirrors-mypy
# rev: v1.2.0
# hooks:
# - id: mypy
# # N.B.: Mypy is... a bit fragile.
# #
# # By using `language: system` we run this hook in the local
# # environment instead of a pre-commit isolated one. This is needed
# # to ensure mypy correctly parses the project.

# It may cause trouble in that it adds environmental variables out
# of our control to the mix. Unfortunately, there's nothing we can
# do about per pre-commit's author.
# See https://github.com/pre-commit/pre-commit/issues/730 for details.
args: [--show-error-codes, --ignore-missing-imports, --explicit-package-bases, --warn-unused-ignores, --disallow-untyped-defs]
files: ^dbt/adapters/.*
language: system
- id: mypy
alias: mypy-check
stages: [manual]
args: [--show-error-codes, --pretty, --ignore-missing-imports, --explicit-package-bases]
files: ^dbt/adapters
language: system
# # It may cause trouble in that it adds environmental variables out
# # of our control to the mix. Unfortunately, there's nothing we can
# # do about per pre-commit's author.
# # See https://github.com/pre-commit/pre-commit/issues/730 for details.
# args: [--show-error-codes, --ignore-missing-imports, --explicit-package-bases, --warn-unused-ignores, --disallow-untyped-defs]
# files: ^dbt/adapters/.*
# language: system
# - id: mypy
# alias: mypy-check
# stages: [manual]
# args: [--show-error-codes, --pretty, --ignore-missing-imports, --explicit-package-bases]
# files: ^dbt/adapters
# language: system
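If the goal is to keep mypy runnable without the fragile mirrored hook, one alternative (a sketch, not part of this PR) is a `repo: local` hook so pre-commit invokes the project environment's own mypy, with the args copied from the commented-out configuration:

```yaml
# Hypothetical local mypy hook; runs the mypy installed in the dev
# environment rather than pre-commit's isolated mirror.
- repo: local
  hooks:
    - id: mypy
      name: mypy
      entry: mypy
      args: [--show-error-codes, --ignore-missing-imports, --explicit-package-bases]
      files: ^dbt/adapters
      language: system
      types: [python]
```

This sidesteps the environment-variable issue noted in the original comment while preserving the manual-stage workflow.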
10 changes: 5 additions & 5 deletions Makefile
@@ -3,18 +3,18 @@
.PHONY: dev
dev: ## Installs adapter in develop mode along with development dependencies
@\
pip install -e . -r requirements.txt -r dev-requirements.txt && pre-commit install
pip install -e . -r dev_requirements.txt && pre-commit install

.PHONY: dev-uninstall
dev-uninstall: ## Uninstalls all packages while maintaining the virtual environment
## Useful when updating versions, or if you accidentally installed into the system interpreter
pip freeze | grep -v "^-e" | cut -d "@" -f1 | xargs pip uninstall -y
pip uninstall -y dbt-spark

.PHONY: mypy
mypy: ## Runs mypy against staged changes for static type checking.
@\
pre-commit run --hook-stage manual mypy-check | grep -v "INFO"
#.PHONY: mypy
#mypy: ## Runs mypy against staged changes for static type checking.
# @\
# pre-commit run --hook-stage manual mypy-check | grep -v "INFO"

.PHONY: flake8
flake8: ## Runs flake8 against staged changes to enforce style guide.
2 changes: 1 addition & 1 deletion dbt/adapters/fabricspark/__version__.py
@@ -1 +1 @@
version = "1.7.0rc1"
version = "1.8.0b1"
2 changes: 1 addition & 1 deletion dbt/adapters/fabricspark/column.py
@@ -1,7 +1,7 @@
from dataclasses import dataclass
from typing import Any, Dict, Optional, TypeVar, Union
from dbt.adapters.base.column import Column
from dbt.dataclass_schema import dbtClassMixin
from dbt_common.dataclass_schema import dbtClassMixin

Self = TypeVar("Self", bound="SparkColumn")

39 changes: 21 additions & 18 deletions dbt/adapters/fabricspark/connections.py
@@ -1,17 +1,20 @@
from contextlib import contextmanager
import os
import dbt.exceptions

from dbt_common.exceptions import DbtConfigError, DbtRuntimeError
from dbt.adapters.contracts.connection import (
AdapterResponse,
ConnectionState,
Connection,
)
from dbt.adapters.sql import SQLConnectionManager
from dbt.contracts.connection import ConnectionState, AdapterResponse
from dbt.events import AdapterLogger
from dbt.events.functions import fire_event
from dbt.events.types import ConnectionUsed, SQLQuery, SQLQueryStatus
from dbt.utils import DECIMALS
from dbt.adapters.events.logging import AdapterLogger
from dbt.adapters.exceptions import FailedToConnectError
from dbt.adapters.events.types import ConnectionUsed, SQLQuery, SQLQueryStatus
from dbt_common.events.functions import fire_event
from dbt_common.utils.encoding import DECIMALS
from dbt.adapters.fabricspark.livysession import LivySessionConnectionWrapper, LivySessionManager

from dbt.contracts.connection import Connection
from dbt.dataclass_schema import StrEnum
from dbt_common.dataclass_schema import StrEnum
from typing import Any, Optional, Union, Tuple, List, Generator, Iterable, Sequence
from abc import ABC, abstractmethod
import time
@@ -88,9 +91,9 @@ def exception_handler(self, sql: str) -> Generator[None, None, None]:
thrift_resp = exc.args[0]
if hasattr(thrift_resp, "status"):
msg = thrift_resp.status.errorMessage
raise dbt.exceptions.DbtRuntimeError(msg)
raise DbtRuntimeError(msg)
else:
raise dbt.exceptions.DbtRuntimeError(str(exc))
raise DbtRuntimeError(str(exc))

def cancel(self, connection: Connection) -> None:
connection.handle.cancel()
@@ -120,7 +123,7 @@ def validate_creds(cls, creds: Any, required: Iterable[str]) -> None:

for key in required:
if not hasattr(creds, key):
raise dbt.exceptions.DbtProfileError(
raise DbtConfigError(
"The config '{}' is required when using the {} method"
" to connect to Spark".format(key, method)
)
@@ -151,9 +154,7 @@ def open(cls, connection: Connection) -> Connection:
logger.debug("Connection error: {}".format(ex))
connection.state = ConnectionState.FAIL
else:
raise dbt.exceptions.DbtProfileError(
f"invalid credential method: {creds.method}"
)
raise DbtConfigError(f"invalid credential method: {creds.method}")
break
except Exception as e:
exc = e
@@ -163,7 +164,7 @@ def open(cls, connection: Connection) -> Connection:
msg = "Failed to connect"
if creds.token is not None:
msg += ", is your token valid?"
raise dbt.exceptions.FailedToConnectError(msg) from e
raise FailedToConnectError(msg) from e
retryable_message = _is_retryable_error(e)
if retryable_message and creds.connect_retries > 0:
msg = (
@@ -184,12 +185,14 @@ def open(cls, connection: Connection) -> Connection:
logger.warning(msg)
time.sleep(creds.connect_timeout)
else:
raise dbt.exceptions.FailedToConnectError("failed to connect") from e
raise FailedToConnectError("failed to connect") from e
else:
raise exc # type: ignore

if handle is None:
raise dbt.exceptions.FailedToConnectError("Failed to connect to Livy session. Common reasons for errors: \n1. Invalid/expired credentials (if using CLI authentication, re-run `az login` in your terminal) \n2. Invalid endpoint \n3. Invalid workspaceid or lakehouseid (do you have the correct permissions?) \n4. Invalid or non-existent shortcuts json path, or improperly formatted shortcuts")
raise FailedToConnectError(
"Failed to connect to Livy session. Common reasons for errors: \n1. Invalid/expired credentials (if using CLI authentication, re-run `az login` in your terminal) \n2. Invalid endpoint \n3. Invalid workspaceid or lakehouseid (do you have the correct permissions?) \n4. Invalid or non-existent shortcuts json path, or improperly formatted shortcuts"
)
connection.handle = handle
connection.state = ConnectionState.OPEN
return connection
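The retry logic in `open` above follows a standard connect-with-retries shape: attempt, classify the failure, sleep, and give up after the configured number of retries. A standalone sketch of that pattern (the `connect_retries`/`connect_timeout` names mirror the credential fields; the connector and retryability check are stubs, and `RuntimeError` stands in for `FailedToConnectError`):

```python
import time
from typing import Any, Callable, Optional


def connect_with_retries(
    connect: Callable[[], Any],
    is_retryable: Callable[[Exception], bool],
    connect_retries: int = 2,
    connect_timeout: float = 0.0,
) -> Any:
    """Attempt `connect`, retrying retryable errors up to `connect_retries`
    extra times, sleeping `connect_timeout` seconds between attempts."""
    last_exc: Optional[Exception] = None
    for attempt in range(1 + connect_retries):
        try:
            return connect()
        except Exception as exc:
            last_exc = exc
            # Give up on non-retryable errors or once retries are exhausted.
            if not is_retryable(exc) or attempt == connect_retries:
                raise RuntimeError("failed to connect") from exc
            time.sleep(connect_timeout)
    raise last_exc  # type: ignore[misc]  # unreachable; satisfies type checkers
```

The adapter's version additionally logs a warning before each sleep, which is where the updated logger statements in this PR's later commits apply.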