[BUG]: UsedTable incorrectly sets is_read for spark.range(10).saveAsTable("a.b") #2893
Labels
internal
Is there an existing issue for this?
Current Behavior
spark.range(10).saveAsTable("a.b")
records the table a.b with is_read=True, but it should be is_read=False, since saveAsTable only writes the table.
Expected Behavior
No response
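The expected classification can be sketched as follows. This is a minimal illustration of the intended behavior, not the actual UCX implementation: the UsedTable fields and the classify_usage helper are hypothetical, and the assumption is that write-only DataFrameWriter methods such as saveAsTable should never be recorded as reads.

```python
from dataclasses import dataclass


@dataclass
class UsedTable:
    """Hypothetical record of a table reference found in source code."""
    name: str
    is_read: bool
    is_write: bool


# DataFrameWriter methods that only write data; a read must not be
# inferred from them (assumed list, for illustration only).
WRITE_ONLY_METHODS = {"saveAsTable", "insertInto", "save"}


def classify_usage(method: str, table: str) -> UsedTable:
    """Classify a table usage as read or write based on the method called."""
    is_write = method in WRITE_ONLY_METHODS
    return UsedTable(name=table, is_read=not is_write, is_write=is_write)


# spark.range(10).saveAsTable("a.b") is a pure write:
usage = classify_usage("saveAsTable", "a.b")
assert usage.is_read is False   # the bug: UCX currently reports True
assert usage.is_write is True
```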
Steps To Reproduce
No response
Cloud
AWS
Operating System
macOS
Version
latest via Databricks CLI
Relevant log output
No response