I have a minor bug where the number of rows of a table is doubled in the Spark UI. To reproduce, create a Snowflake table with any number of rows, then, in a Spark session, run the following:
val dataframe = spark
  .read
  .format(SNOWFLAKE_SOURCE_NAME)
  .options(...)
  .load()

dataframe.orderBy(rand()).collect()
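For completeness, the elided `.options(...)` call usually receives a map of spark-snowflake connection parameters. A hypothetical sketch of such a map follows; every value is a placeholder, not taken from the original report:

```scala
// Hypothetical connector options for spark-snowflake.
// The keys are the connector's standard connection parameters;
// all values below are placeholders.
val sfOptions = Map(
  "sfURL"       -> "<account>.snowflakecomputing.com",
  "sfUser"      -> "<user>",
  "sfPassword"  -> "<password>",
  "sfDatabase"  -> "<database>",
  "sfSchema"    -> "<schema>",
  "sfWarehouse" -> "<warehouse>",
  "dbtable"     -> "<table>"
)
```

With a map like this, the reproduction above would pass it via `.options(sfOptions)`.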
If you look at the Spark UI in the SQL / DataFrame tab and open the latest collect query, you should see that the number of rows shown is double the actual row count of the table.