
writing with mode "append" to an existing table only rolls back faulty rows w/o "NO_DUPLICATES" #236

Open
m-freitag opened this issue Aug 10, 2023 · 0 comments


Using the latest beta version of the connector, an "append" write to an existing table only rolls back the faulty rows when an error occurs; rows written before the error remain committed in the table. Setting reliabilityLevel to "NO_DUPLICATES" rolls the write back correctly, but the error message raised in that case is hard to trace.
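For comparison, a minimal sketch of the same write with reliabilityLevel set to "NO_DUPLICATES", the variant that does roll back correctly ("url" and "table" are the same placeholders used in the snippet further down):

df.write \
    .format("com.microsoft.sqlserver.jdbc.spark") \
    .mode("append") \
    .option("url", "url") \
    .option("dbtable", "table") \
    .option("reliabilityLevel", "NO_DUPLICATES") \
    .save()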

Take an arbitrary HEAP table with a constraint (e.g. a primary key) or a non-nullable column: appending a DataFrame that contains a faulty row (e.g. a NULL in that column, as sketched below the write snippet) inserts all rows up to the faulty one:

df.write \
    .format("com.microsoft.sqlserver.jdbc.spark") \
    .mode("append") \
    .option("url", "url") \
    .option("dbtable", "table") \
    .option("schemaCheckEnabled", False) \
    .option("tableLock", True) \
    .option("BatchSize", 1000) \
    .save()
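To illustrate the faulty-row case, here is a minimal sketch of a DataFrame whose last row carries a NULL in the constrained column. The schema and column names are made up for illustration; only the trailing NULL matters:

from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, IntegerType, StringType

spark = SparkSession.builder.getOrCreate()

# Hypothetical schema: "id" stands in for the primary-key / NOT NULL column on the target table.
schema = StructType([
    StructField("id", IntegerType(), nullable=True),
    StructField("value", StringType(), nullable=True),
])

# The first two rows are valid; the third violates the constraint on the SQL Server side.
# Without reliabilityLevel = "NO_DUPLICATES", the first two rows remain committed after the failure.
df = spark.createDataFrame(
    [(1, "a"), (2, "b"), (None, "c")],
    schema=schema,
)

Writing this df with the append snippet above reproduces the partial insert.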