It would be great to support Count on a Struct of [ Struct of [ String, Map(String,String) ], Array(String), Map(String,String) ].
Example:
import org.apache.spark.sql.Row
import org.apache.spark.sql.types._

// Nested rows: each record holds a struct containing another struct, an array, and a map.
val data = Seq(
  Row("Adam",  Row(Row("address", Map("state" -> "CA")), List("Java", "Scala", "C++"), Map("hair" -> "black", "eye" -> "black"))),
  Row("Bob",   Row(Row("address", Map("state" -> "NY")), List("Java", "Python", "C++"), Map("hair" -> "red", "eye" -> "black"))),
  Row("Cathy", Row(Row("address", Map("state" -> "CO")), List("Java", "C++"), Map("hair" -> "yellow", "eye" -> "black")))
)

val mapType = DataTypes.createMapType(StringType, StringType)
val schema = new StructType()
  .add("name", StringType)
  .add("prop", new StructType()
    .add("prop2", new StructType()
      .add("propname", StringType)
      .add("propvalue", mapType))
    .add("language", ArrayType(StringType))
    .add("bio", mapType))

// Round-trip the nested data through Parquet, then run Count over the struct column.
val nestedDF = spark.createDataFrame(spark.sparkContext.parallelize(data), schema)
nestedDF.write.format("parquet").mode("overwrite").save("/tmp/testparquet")
val df1 = spark.read.parquet("/tmp/testparquet")
df1.createOrReplaceTempView("df1")
df1.printSchema
spark.sql("SELECT count(prop) FROM df1").collect()
Not-supported message:
!Expression <Count> count(prop#364) cannot run on GPU because input expression AttributeReference prop#364 (child StructType(StructField(propname,StringType,true), StructField(propvalue,MapType(StringType,StringType,true),true)) is not supported, child ArrayType(StringType,true) is not supported, child MapType(StringType,StringType,true) is not supported)
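Until Count over nested struct columns runs on the GPU, one possible workaround is to express the same aggregation without passing the struct into Count, since count(prop) is just the number of rows where prop is not null. This is only a sketch against the repro above; whether the null check on the struct itself stays on the GPU is an assumption I have not verified:

import org.apache.spark.sql.functions.col

// count(prop) == number of rows where prop IS NOT NULL, so the same result
// can be computed without feeding the nested struct into Count itself.
spark.sql("SELECT count(*) FROM df1 WHERE prop IS NOT NULL").collect()

// Equivalent DataFrame form:
df1.filter(col("prop").isNotNull).count()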