[BUG] Creating tables with all partition columns is disallowed, but dropping last non-partition column is allowed #1929

Open · Labels: bug
ebyhr (Contributor) commented Jul 24, 2023 · 0 comments

Bug

Which Delta project/connector is this regarding?

  • Spark
  • Standalone
  • Flink
  • Kernel
  • Other (fill in here)

Describe the problem

Steps to reproduce

Please include copy-pastable code snippets if possible.

  1. create table default.test (c1 int, c2 int) using delta partitioned by (c2) TBLPROPERTIES('delta.columnMapping.mode'='name');
  2. alter table default.test drop column c1;

Observed results

Creating a table whose columns are all partition columns (e.g. create table default.test2 (a int) using delta partitioned by (a)) throws a "Cannot use all columns for partition columns" error, but dropping the last non-partition column of an existing table succeeds.
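For comparison, the same contrast expressed through the Spark SQL API, as a minimal sketch: it assumes a `spark` session with the Delta extensions configured and the table created in the reproduction steps above.

```scala
// Rejected today: every column of the new table would be a partition column.
// Fails with "Cannot use all columns for partition columns".
spark.sql("CREATE TABLE default.test2 (a INT) USING delta PARTITIONED BY (a)")

// Accepted today, even though it leaves default.test with only its
// partition column c2.
spark.sql("ALTER TABLE default.test DROP COLUMN c1")
```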

Expected results

Throw an exception when the target column of DROP COLUMN is the last non-partition column, consistent with the CREATE TABLE behavior above.
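One way to express that check, as a minimal sketch: the helper name and its call site are hypothetical and this is not the existing Delta validation code, only the shape of the guard the expected behavior implies.

```scala
import org.apache.spark.sql.types.{IntegerType, StructField, StructType}

// Hypothetical guard: given the schema that would remain after the drop and
// the table's partition column names, reject the operation when no
// non-partition (data) column would be left.
def assertRemainingDataColumn(
    remainingSchema: StructType,
    partitionColumns: Seq[String]): Unit = {
  val partitionSet = partitionColumns.map(_.toLowerCase).toSet
  val dataColumns =
    remainingSchema.fieldNames.filterNot(name => partitionSet.contains(name.toLowerCase))
  if (dataColumns.isEmpty) {
    throw new IllegalArgumentException(
      "Cannot drop the last non-partition column of a Delta table")
  }
}

// The schema left after `ALTER TABLE default.test DROP COLUMN c1` is just the
// partition column c2, so a check like this would reject the statement.
assertRemainingDataColumn(
  StructType(Seq(StructField("c2", IntegerType))),
  partitionColumns = Seq("c2"))
```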

Further details

Environment information

  • Delta Lake version: 2.3.0
  • Spark version: 3.3.2
  • Scala version: 2.12

Willingness to contribute

The Delta Lake Community encourages bug fix contributions. Would you or another member of your organization be willing to contribute a fix for this bug to the Delta Lake code base?

  • No. I cannot contribute a bug fix at this time.