[SPARK-45502][BUILD] Upgrade Kafka to 3.6.0 #6173

Triggered via push October 12, 2023 15:47
Status: Failure
Total duration: 2h 35m 7s
Artifacts: 21

build_main.yml

on: push
Run / Check changes (39s)
Run / Base image build (59s)
Run / Breaking change detection with Buf (branch-3.5) (1m 5s)
Run / Run TPC-DS queries with SF=1 (1h 44m)
Run / Run Docker integration tests (54m 53s)
Run / Run Spark on Kubernetes Integration test (1h 22m)
Matrix: Run / build
Matrix: Run / java-other-versions
Run / Build modules: sparkr (40m 27s)
Run / Linters, licenses, dependencies and documentation generation (1h 46m)
Matrix: Run / pyspark

Annotations

14 errors and 1 warning
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-1885648b24b7f51d-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-aa93168b24b9253f-exec-1".
Run / Run Spark on Kubernetes Integration test
sleep interrupted
Run / Run Spark on Kubernetes Integration test
sleep interrupted
Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$679/0x00007f09505c6a90@7a886edf rejected from java.util.concurrent.ThreadPoolExecutor@49b8aa39[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 309]
Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$679/0x00007f09505c6a90@6b896770 rejected from java.util.concurrent.ThreadPoolExecutor@49b8aa39[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 310]
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-efa6f18b24d086bf-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-ded20d8b24d1b3df-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-0487b68b24d5dbd5-exec-1".
Run / Run Spark on Kubernetes Integration test
Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-3b5725fd3b8a4b8f9e757f2e85a63327-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-3b5725fd3b8a4b8f9e757f2e85a63327-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..
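The "rejected from java.util.concurrent.ThreadPoolExecutor[Shutting down, ...]" annotations above are the standard JDK behavior when a task is submitted to an executor that has already begun shutting down, which happens here during test teardown of the fabric8 Kubernetes client. A minimal plain-Java sketch of that failure mode (not Spark's or fabric8's code):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.RejectedExecutionException;

public class RejectedDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        pool.shutdown(); // pool is now "Shutting down": new submissions are refused
        try {
            pool.execute(() -> System.out.println("work"));
        } catch (RejectedExecutionException e) {
            // The default AbortPolicy throws, producing messages like
            // "Task ... rejected from java.util.concurrent.ThreadPoolExecutor
            // [Shutting down, pool size = 2, ...]" seen in the annotations.
            System.out.println("rejected");
        }
    }
}
```

Such rejections during teardown are usually a symptom of the test shutting down its client while background callbacks are still in flight, rather than a failure of the code under test itself.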
Run / Build modules: core, unsafe, kvstore, avro, network-common, network-shuffle, repl, launcher, examples, sketch, graphx
The runner has received a shutdown signal. This can happen when the runner service is stopped, or a manually started runner is canceled.
KafkaSourceStressSuite.stress test with multiple topics and partitions: KafkaSourceStressSuite#L2752
org.scalatest.exceptions.TestFailedException: Timed out waiting for stream: The code passed to failAfter did not complete within 30 seconds. java.base/java.lang.Thread.getStackTrace(Thread.java:1610) org.scalatest.concurrent.TimeLimits$.failAfterImpl(TimeLimits.scala:277) org.scalatest.concurrent.TimeLimits.failAfter(TimeLimits.scala:231) org.scalatest.concurrent.TimeLimits.failAfter$(TimeLimits.scala:230) org.apache.spark.SparkFunSuite.failAfter(SparkFunSuite.scala:69) org.apache.spark.sql.streaming.StreamTest.$anonfun$testStream$7(StreamTest.scala:481) org.apache.spark.sql.streaming.StreamTest.$anonfun$testStream$7$adapted(StreamTest.scala:480) scala.collection.mutable.HashMap$Node.foreach(HashMap.scala:642) scala.collection.mutable.HashMap.foreach(HashMap.scala:504) org.apache.spark.sql.streaming.StreamTest.fetchStreamAnswer$1(StreamTest.scala:480) Caused by: null java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:1764) org.apache.spark.sql.execution.streaming.StreamExecution.awaitOffset(StreamExecution.scala:481) org.apache.spark.sql.streaming.StreamTest.$anonfun$testStream$8(StreamTest.scala:482) scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) org.scalatest.enablers.Timed$$anon$1.timeoutAfter(Timed.scala:127) org.scalatest.concurrent.TimeLimits$.failAfterImpl(TimeLimits.scala:282) org.scalatest.concurrent.TimeLimits.failAfter(TimeLimits.scala:231) org.scalatest.concurrent.TimeLimits.failAfter$(TimeLimits.scala:230) org.apache.spark.SparkFunSuite.failAfter(SparkFunSuite.scala:69) org.apache.spark.sql.streaming.StreamTest.$anonfun$testStream$7(StreamTest.scala:481) == Progress == AssertOnQuery(<condition>, ) AddKafkaData(topics = HashSet(stress4, stress2, stress1, stress5, stress3), data = Range 0 until 4, message = ) CheckAnswer: [1],[2],[3],[4] AddKafkaData(topics = HashSet(stress4, stress2, stress1, stress5, stress3), data = Range 4 until 9, message = ) CheckAnswer: 
[1],[2],[3],[4],[5],[6],[7],[8],[9] AddKafkaData(topics = HashSet(stress4, stress2, stress1, stress5, stress3), data = empty Range 9 until 9, message = ) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9] CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9] StopStream AddKafkaData(topics = HashSet(stress4, stress2, stress1, stress5, stress3), data = Range 9 until 16, message = Add partition) AddKafkaData(topics = HashSet(stress4, stress2, stress1, stress5, stress3), data = Range 16 until 20, message = ) AddKafkaData(topics = HashSet(stress4, stress2, stress1, stress5, stress3), data = Range 20 until 21, message = Add partition) StartStream(ProcessingTimeTrigger(0),org.apache.spark.util.SystemClock@9e2fae3,Map(),null) AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress5, stress3), data = Range 21 until 25, message = Add topic stress6) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25] AddKafkaData(topics = HashSet(stress4, stress6, stress2, stress1, stress5, stress7, stress3), data = Range 25 until 31, message = Add topic stress7) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31] AddKafkaData(topics = HashSet(stress4, stress2, stress1, stress5, stress7, stress3), data = Range 31 until 40, message = Delete topic stress6) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40] AddKafkaData(topics = HashSet(stress4, stress2, stress1, stress5, stress7, stress3), data = Range 40 until 44, message = ) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44] 
CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44] StopStream AddKafkaData(topics = HashSet(stress4, stress2, stress8, stress1, stress5, stress7, stress3), data = Range 44 until 45, message = Add topic stress8) AddKafkaData(topics = HashSet(stress4, stress2, stress8, stress1, stress5, stress7, stress3), data = Range 45 until 54, message = Add partition) AddKafkaData(topics = HashSet(stress4, stress2, stress8, stress1, stress5, stress7, stress3), data = Range 54 until 61, message = ) AddKafkaData(topics = HashSet(stress9, stress4, stress2, stress8, stress1, stress5, stress7, stress3), data = empty Range 61 until 61, message = Add topic stress9) AddKafkaData(topics = HashSet(stress9, stress4, stress2, stress8, stress1, stress5, stress7, stress3, stress10), data = Range 61 until 67, message = Add topic stress10) AddKafkaData(topics = HashSet(stress9, stress4, stress2, stress8, stress1, stress5, stress7, stress3, stress10), data = Range 67 until 73, message = ) AddKafkaData(topics = HashSet(stress9, stress4, stress2, stress8, stress1, stress5, stress7, stress3, stress10), data = Range 73 until 76, message = ) AddKafkaData(topics = HashSet(stress9, stress4, stress2, stress8, stress1, stress5, stress7, stress3, stress10), data = Range 76 until 81, message = ) StartStream(ProcessingTimeTrigger(0),org.apache.spark.util.SystemClock@7a342371,Map(),null) AddKafkaData(topics = HashSet(stress9, stress4, stress2, stress8, stress1, stress5, stress7, stress3, stress10), data = Range 81 until 89, message = ) CheckAnswer: 
[1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89] CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89] StopStream AddKafkaData(topics = HashSet(stress9, stress4, stress2, stress8, stress1, stress5, stress7, stress3, stress10), data = empty Range 89 until 89, message = Add partition) AddKafkaData(topics = HashSet(stress9, stress4, stress2, stress8, stress1, stress5, stress7, stress3, stress10), data = Range 89 until 96, message = ) AddKafkaData(topics = HashSet(stress9, stress4, stress2, stress8, stress1, stress5, stress7, stress3, stress10), data = Range 96 until 100, message = Add partition) AddKafkaData(topics = HashSet(stress9, stress4, stress2, stress8, stress1, stress5, stress7, stress3, stress10), data = Range 100 until 108, message = ) AddKafkaData(topics = HashSet(stress9, stress4, stress2, stress8, stress1, stress5, stress7, stress3, stress10), data = empty Range 108 until 108, message = ) StartStream(ProcessingTimeTrigger(0),org.apache.spark.util.SystemClock@1e3dd4cf,Map(),null) AddKafkaData(topics = HashSet(stress9, stress4, stress8, stress1, stress5, stress7, stress3, stress10), data = empty Range 108 until 108, message = Delete topic stress2) CheckAnswer: 
[1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108] CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108] AddKafkaData(topics = HashSet(stress9, stress4, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 108 until 117, message = Add topic stress11) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117] AddKafkaData(topics = HashSet(stress9, stress4, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 
117 until 118, message = ) => CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118] AddKafkaData(topics = HashSet(stress9, stress4, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 118 until 121, message = ) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121] AddKafkaData(topics = HashSet(stress9, stress4, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 121 until 128, message = ) CheckAnswer: 
[1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128] AddKafkaData(topics = HashSet(stress9, stress4, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 128 until 129, message = ) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128],[129] AddKafkaData(topics = HashSet(stress9, stress4, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 129 until 133, message = ) CheckAnswer: 
[1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128],[129],[130],[131],[132],[133] AddKafkaData(topics = HashSet(stress9, stress4, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 133 until 135, message = Delete topic stress1) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128],[129],[130],[131],[132],[133],[134],[135] AddKafkaData(topics = HashSet(stress9, stress4, stress12, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 135 until 141, message = Add topic stress12) CheckAnswer: 
[1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128],[129],[130],[131],[132],[133],[134],[135],[136],[137],[138],[139],[140],[141] AddKafkaData(topics = HashSet(stress9, stress4, stress12, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 141 until 146, message = ) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128],[129],[130],[131],[132],[133],[134],[135],[136],[137],[138],[139],[140],[141],[142],[143],[144],[145],[146] AddKafkaData(topics = HashSet(stress9, stress4, stress12, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = empty Range 146 until 146, message = ) CheckAnswer: 
[1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128],[129],[130],[131],[132],[133],[134],[135],[136],[137],[138],[139],[140],[141],[142],[143],[144],[145],[146] AddKafkaData(topics = HashSet(stress9, stress4, stress12, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 146 until 151, message = Add partition) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128],[129],[130],[131],[132],[133],[134],[135],[136],[137],[138],[139],[140],[141],[142],[143],[144],[145],[146],[147],[148],[149],[150],[151] AddKafkaData(topics = HashSet(stress9, stress4, stress12, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 151 until 154, message = ) CheckAnswer: 
[1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128],[129],[130],[131],[132],[133],[134],[135],[136],[137],[138],[139],[140],[141],[142],[143],[144],[145],[146],[147],[148],[149],[150],[151],[152],[153],[154] AddKafkaData(topics = HashSet(stress9, stress4, stress12, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = empty Range 154 until 154, message = Add partition) CheckAnswer: [1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128],[129],[130],[131],[132],[133],[134],[135],[136],[137],[138],[139],[140],[141],[142],[143],[144],[145],[146],[147],[148],[149],[150],[151],[152],[153],[154] CheckAnswer: 
[1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128],[129],[130],[131],[132],[133],[134],[135],[136],[137],[138],[139],[140],[141],[142],[143],[144],[145],[146],[147],[148],[149],[150],[151],[152],[153],[154] StopStream AddKafkaData(topics = HashSet(stress9, stress4, stress12, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 154 until 161, message = ) AddKafkaData(topics = HashSet(stress9, stress4, stress12, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 161 until 166, message = ) AddKafkaData(topics = HashSet(stress9, stress4, stress12, stress13, stress8, stress1, stress5, stress11, stress7, stress3, stress10), data = Range 166 until 174, message = Add topic stress13) StartStream(ProcessingTimeTrigger(0),org.apache.spark.util.SystemClock@1be7381e,Map(),null) CheckAnswer: 
[1],[2],[3],[4],[5],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15],[16],[17],[18],[19],[20],[21],[22],[23],[24],[25],[26],[27],[28],[29],[30],[31],[32],[33],[34],[35],[36],[37],[38],[39],[40],[41],[42],[43],[44],[45],[46],[47],[48],[49],[50],[51],[52],[53],[54],[55],[56],[57],[58],[59],[60],[61],[62],[63],[64],[65],[66],[67],[68],[69],[70],[71],[72],[73],[74],[75],[76],[77],[78],[79],[80],[81],[82],[83],[84],[85],[86],[87],[88],[89],[90],[91],[92],[93],[94],[95],[96],[97],[98],[99],[100],[101],[102],[103],[104],[105],[106],[107],[108],[109],[110],[111],[112],[113],[114],[115],[116],[117],[118],[119],[120],[121],[122],[123],[124],[125],[126],[127],[128],[129],[130],[131],[132],[133],[134],[135],[136],[137],[138],[139],[140],[141],[142],[143],[144],[145],[146],[147],[148],[149],[150],[151],[152],[153],[154],[155],[156],[157],[158],[159],[160],[161],[162],[163],[164],[165],[166],[167],[168],[169],[170],[171],[172],[173],[174] == Stream == Output Mode: Append Stream state: {KafkaV2[SubscribePattern[stress.*]]: 
{"stress8":{"0":0,"1":0,"2":2,"3":0,"4":1,"5":2,"6":1,"7":2,"8":1,"9":0,"10":0,"11":1,"12":0,"13":0,"14":1,"15":0,"16":1,"17":1},"stress9":{"0":1,"1":0,"2":4,"3":1,"4":3,"5":1,"6":0,"7":1},"stress11":{"0":0,"1":0,"2":0},"stress4":{"0":5,"1":0,"2":1,"3":0,"4":1,"5":0,"6":0,"7":0,"8":0,"9":1,"10":2,"11":0,"12":0,"13":1,"14":0,"15":0,"16":0,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0,"24":1,"25":0,"26":0,"27":0,"28":0,"29":0},"stress10":{"0":2,"1":2,"2":3,"3":3,"4":3,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0},"stress5":{"0":4,"1":6,"2":3,"3":0,"4":0,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress7":{"0":4,"1":4,"2":6,"3":2,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"stress1":{"0":1,"1":5,"2":4,"3":4,"4":2,"5":1,"6":5,"7":1,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":0},"stress3":{"0":9,"1":0,"2":3,"3":0,"4":0,"5":1,"6":2,"7":1,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0}}} Thread state: alive Thread stack trace: java.base@17.0.8/java.lang.Thread.sleep(Native Method) app//org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:348) app//org.apache.spark.sql.execution.streaming.MicroBatchExecution$$Lambda$5644/0x00007f7b09490550.apply$mcZ$sp(Unknown Source) app//org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:67) app//org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:279) app//org.apache.spark.sql.execution.streaming.StreamExecution.$anonfun$runStream$1(StreamExecution.scala:311) app//org.apache.spark.sql.execution.streaming.StreamExecution$$Lambda$5631/0x00007f7b0948ab00.apply$mcV$sp(Unknown Source) app//scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) app//org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:901) 
app//org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:289) app//org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.$anonfun$run$1(StreamExecution.scala:211) app//org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1$$Lambda$5626/0x00007f7b094897e8.apply$mcV$sp(Unknown Source) app//scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) app//org.apache.spark.JobArtifactSet$.withActiveJobArtifactState(JobArtifactSet.scala:94) app//org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:211) == Sink == 0: 1: [1] [2] [3] 2: [4] 3: [6] [5] [8] [7] [9] 4: [15] [13] [17] [19] [11] [10] [12] [14] [21] [18] [16] [20] 5: [25] [22] [23] [24] 6: 7: [26] [27] [29] [31] [28] [30] 8: 9: [33] [37] [39] [32] [34] [36] [38] [35] [40] 10: [41] [43] [42] [44] 11: [78] [68] [60] [52] [63] [75] [57] [59] [61] [72] [77] [79] [81] [74] [45] [49] [53] [65] [55] [58] [69] [71] [47] [51] [64] [67] [73] [80] [76] [46] [48] [50] [54] [62] [66] [70] [56] 12: [83] [88] [86] [82] [85] [84] [87] 13: [89] 14: [100] [104] [94] [105] [106] [91] [93] [95] [98] [92] [103] [90] [102] [107] [101] [108] [96] [97] [99] 15: 16: 17: [117] [109] [116] [113] [111] [115] [112] [114] [110] 18: 19: [118] == Plan == == Parsed Logical Plan == WriteToMicroBatchDataSource MemorySink, e6389d0e-f7b4-4d4c-97e1-0090c018e97e, Append, 19 +- SerializeFromObject [input[0, int, false] AS value#38065] +- MapElements org.apache.spark.sql.kafka010.KafkaSourceStressSuite$$Lambda$8272/0x00007f7b099d20b8@796e1dbc, class scala.Tuple2, [StructField(_1,StringType,true), StructField(_2,StringType,true)], obj#38064: int +- DeserializeToObject newInstance(class scala.Tuple2), obj#38063: scala.Tuple2 +- Project [cast(key#38039 as string) AS key#38053, cast(value#38040 as string) AS value#38054] +- StreamingDataSourceV2Relation [key#38039, value#38040, topic#38041, 
partition#38042, offset#38043L, timestamp#38044, timestampType#38045], org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan@62e67e00, KafkaV2[SubscribePattern[stress.*]],
{"stress8":{"0":0,"1":0,"2":2,"3":0,"4":1,"5":2,"6":1,"7":2,"8":1,"9":0,"10":0,"11":1,"12":0,"13":0,"14":1,"15":0,"16":1,"17":1},"stress9":{"0":1,"1":0,"2":4,"3":1,"4":3,"5":1,"6":0,"7":1},"stress11":{"0":0,"1":0,"2":0},"stress4":{"0":5,"1":0,"2":1,"3":0,"4":1,"5":0,"6":0,"7":0,"8":0,"9":1,"10":2,"11":0,"12":0,"13":1,"14":0,"15":0,"16":0,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0,"24":1,"25":0,"26":0,"27":0,"28":0,"29":0},"stress10":{"0":2,"1":2,"2":3,"3":3,"4":3,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0},"stress5":{"0":4,"1":6,"2":3,"3":0,"4":0,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress7":{"0":4,"1":4,"2":6,"3":2,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"stress1":{"0":1,"1":5,"2":4,"3":4,"4":2,"5":1,"6":5,"7":1,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":0},"stress3":{"0":9,"1":0,"2":3,"3":0,"4":0,"5":1,"6":2,"7":0,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0}},
{"stress8":{"0":0,"1":0,"2":2,"3":0,"4":1,"5":2,"6":1,"7":2,"8":1,"9":0,"10":0,"11":1,"12":0,"13":0,"14":1,"15":0,"16":1,"17":1},"stress9":{"0":1,"1":0,"2":4,"3":1,"4":3,"5":1,"6":0,"7":1},"stress11":{"0":0,"1":0,"2":0},"stress4":{"0":5,"1":0,"2":1,"3":0,"4":1,"5":0,"6":0,"7":0,"8":0,"9":1,"10":2,"11":0,"12":0,"13":1,"14":0,"15":0,"16":0,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0,"24":1,"25":0,"26":0,"27":0,"28":0,"29":0},"stress10":{"0":2,"1":2,"2":3,"3":3,"4":3,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0},"stress5":{"0":4,"1":6,"2":3,"3":0,"4":0,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress7":{"0":4,"1":4,"2":6,"3":2,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"stress1":{"0":1,"1":5,"2":4,"3":4,"4":2,"5":1,"6":5,"7":1,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":0},"stress3":{"0":9,"1":0,"2":3,"3":0,"4":0,"5":1,"6":2,"7":1,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0}}

== Analyzed Logical Plan ==
WriteToMicroBatchDataSource MemorySink, e6389d0e-f7b4-4d4c-97e1-0090c018e97e, Append, 19
+- SerializeFromObject [input[0, int, false] AS value#38065]
   +- MapElements org.apache.spark.sql.kafka010.KafkaSourceStressSuite$$Lambda$8272/0x00007f7b099d20b8@796e1dbc, class scala.Tuple2, [StructField(_1,StringType,true), StructField(_2,StringType,true)], obj#38064: int
      +- DeserializeToObject newInstance(class scala.Tuple2), obj#38063: scala.Tuple2
         +- Project [cast(key#38039 as string) AS key#38053, cast(value#38040 as string) AS value#38054]
            +- StreamingDataSourceV2Relation [key#38039, value#38040, topic#38041, partition#38042, offset#38043L, timestamp#38044, timestampType#38045], org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan@62e67e00, KafkaV2[SubscribePattern[stress.*]],
               {"stress8":{"0":0,"1":0,"2":2,"3":0,"4":1,"5":2,"6":1,"7":2,"8":1,"9":0,"10":0,"11":1,"12":0,"13":0,"14":1,"15":0,"16":1,"17":1},"stress9":{"0":1,"1":0,"2":4,"3":1,"4":3,"5":1,"6":0,"7":1},"stress11":{"0":0,"1":0,"2":0},"stress4":{"0":5,"1":0,"2":1,"3":0,"4":1,"5":0,"6":0,"7":0,"8":0,"9":1,"10":2,"11":0,"12":0,"13":1,"14":0,"15":0,"16":0,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0,"24":1,"25":0,"26":0,"27":0,"28":0,"29":0},"stress10":{"0":2,"1":2,"2":3,"3":3,"4":3,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0},"stress5":{"0":4,"1":6,"2":3,"3":0,"4":0,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress7":{"0":4,"1":4,"2":6,"3":2,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"stress1":{"0":1,"1":5,"2":4,"3":4,"4":2,"5":1,"6":5,"7":1,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":0},"stress3":{"0":9,"1":0,"2":3,"3":0,"4":0,"5":1,"6":2,"7":0,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0}},
               {"stress8":{"0":0,"1":0,"2":2,"3":0,"4":1,"5":2,"6":1,"7":2,"8":1,"9":0,"10":0,"11":1,"12":0,"13":0,"14":1,"15":0,"16":1,"17":1},"stress9":{"0":1,"1":0,"2":4,"3":1,"4":3,"5":1,"6":0,"7":1},"stress11":{"0":0,"1":0,"2":0},"stress4":{"0":5,"1":0,"2":1,"3":0,"4":1,"5":0,"6":0,"7":0,"8":0,"9":1,"10":2,"11":0,"12":0,"13":1,"14":0,"15":0,"16":0,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0,"24":1,"25":0,"26":0,"27":0,"28":0,"29":0},"stress10":{"0":2,"1":2,"2":3,"3":3,"4":3,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0},"stress5":{"0":4,"1":6,"2":3,"3":0,"4":0,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress7":{"0":4,"1":4,"2":6,"3":2,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"stress1":{"0":1,"1":5,"2":4,"3":4,"4":2,"5":1,"6":5,"7":1,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":0},"stress3":{"0":9,"1":0,"2":3,"3":0,"4":0,"5":1,"6":2,"7":1,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0}}

== Optimized Logical Plan ==
WriteToDataSourceV2 MicroBatchWrite[epoch: 19, writer: org.apache.spark.sql.execution.streaming.sources.MemoryStreamingWrite@581a4c04]
+- SerializeFromObject [input[0, int, false] AS value#38065]
   +- MapElements org.apache.spark.sql.kafka010.KafkaSourceStressSuite$$Lambda$8272/0x00007f7b099d20b8@796e1dbc, class scala.Tuple2, [StructField(_1,StringType,true), StructField(_2,StringType,true)], obj#38064: int
      +- DeserializeToObject newInstance(class scala.Tuple2), obj#38063: scala.Tuple2
         +- Project [cast(key#38039 as string) AS key#38053, cast(value#38040 as string) AS value#38054]
            +- StreamingDataSourceV2Relation [key#38039, value#38040, topic#38041, partition#38042, offset#38043L, timestamp#38044, timestampType#38045], org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan@62e67e00, KafkaV2[SubscribePattern[stress.*]],
               {"stress8":{"0":0,"1":0,"2":2,"3":0,"4":1,"5":2,"6":1,"7":2,"8":1,"9":0,"10":0,"11":1,"12":0,"13":0,"14":1,"15":0,"16":1,"17":1},"stress9":{"0":1,"1":0,"2":4,"3":1,"4":3,"5":1,"6":0,"7":1},"stress11":{"0":0,"1":0,"2":0},"stress4":{"0":5,"1":0,"2":1,"3":0,"4":1,"5":0,"6":0,"7":0,"8":0,"9":1,"10":2,"11":0,"12":0,"13":1,"14":0,"15":0,"16":0,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0,"24":1,"25":0,"26":0,"27":0,"28":0,"29":0},"stress10":{"0":2,"1":2,"2":3,"3":3,"4":3,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0},"stress5":{"0":4,"1":6,"2":3,"3":0,"4":0,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress7":{"0":4,"1":4,"2":6,"3":2,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"stress1":{"0":1,"1":5,"2":4,"3":4,"4":2,"5":1,"6":5,"7":1,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":0},"stress3":{"0":9,"1":0,"2":3,"3":0,"4":0,"5":1,"6":2,"7":0,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0}},
               {"stress8":{"0":0,"1":0,"2":2,"3":0,"4":1,"5":2,"6":1,"7":2,"8":1,"9":0,"10":0,"11":1,"12":0,"13":0,"14":1,"15":0,"16":1,"17":1},"stress9":{"0":1,"1":0,"2":4,"3":1,"4":3,"5":1,"6":0,"7":1},"stress11":{"0":0,"1":0,"2":0},"stress4":{"0":5,"1":0,"2":1,"3":0,"4":1,"5":0,"6":0,"7":0,"8":0,"9":1,"10":2,"11":0,"12":0,"13":1,"14":0,"15":0,"16":0,"17":0,"18":1,"19":0,"20":0,"21":0,"22":1,"23":0,"24":1,"25":0,"26":0,"27":0,"28":0,"29":0},"stress10":{"0":2,"1":2,"2":3,"3":3,"4":3,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0},"stress5":{"0":4,"1":6,"2":3,"3":0,"4":0,"5":1,"6":0,"7":0,"8":0,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress7":{"0":4,"1":4,"2":6,"3":2,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"stress1":{"0":1,"1":5,"2":4,"3":4,"4":2,"5":1,"6":5,"7":1,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0},"stress2":{"0":0},"stress3":{"0":9,"1":0,"2":3,"3":0,"4":0,"5":1,"6":2,"7":1,"8":1,"9":0,"10":0,"11":0,"12":0,"13":0,"14":0,"15":0,"16":0,"17":0,"18":0,"19":0,"20":0,"21":0}}

== Physical Plan ==
WriteToDataSourceV2 MicroBatchWrite[epoch: 19, writer: org.apache.spark.sql.execution.streaming.sources.MemoryStreamingWrite@581a4c04], org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy$$Lambda$5748/0x00007f7b094bf130@6f6ffb03
+- *(1) SerializeFromObject [input[0, int, false] AS value#38065]
   +- *(1) MapElements org.apache.spark.sql.kafka010.KafkaSourceStressSuite$$Lambda$8272/0x00007f7b099d20b8@796e1dbc, obj#38064: int
      +- *(1) DeserializeToObject newInstance(class scala.Tuple2), obj#38063: scala.Tuple2
         +- *(1) Project [cast(key#38039 as string) AS key#38053, cast(value#38040 as string) AS value#38054]
            +- MicroBatchScan[key#38039, value#38040, topic#38041, partition#38042, offset#38043L, timestamp#38044, timestampType#38045] class org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan
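The paired JSON maps in the plan dump are the micro-batch's start and end Kafka offsets per topic-partition; they appear to differ only at `stress3` partition `7` (0 → 1), i.e. this batch consumed a single record. A small sketch (offsets abbreviated to the one topic that changed; the full maps cover `stress1` through `stress11`) of computing a batch's record count from such maps:

```python
import json

# Start and end offset maps as serialized in the plan above, abbreviated
# to topic "stress3" (the only topic whose offsets advanced in epoch 19).
start = json.loads('{"stress3":{"6":2,"7":0,"8":1}}')
end = json.loads('{"stress3":{"6":2,"7":1,"8":1}}')

def batch_record_count(start, end):
    """Sum of (end - start) offsets over all topic-partitions."""
    return sum(
        off - start.get(topic, {}).get(part, 0)
        for topic, parts in end.items()
        for part, off in parts.items()
    )

print(batch_record_count(start, end))  # prints 1
```

Each map value is the next offset to read, so the element-wise difference between the end and start maps gives exactly the number of records the batch processed.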
Run / Build modules: pyspark-errors
No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
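The warning comes from the upload step's recursive glob: `**` spans any number of directories, so `**/target/test-reports/*.xml` matches reports at any depth, and an empty match list triggers the warning. A minimal sketch of the equivalent matching, using Python's `pathlib` and hypothetical paths for illustration:

```python
import pathlib
import tempfile

# Build a throwaway tree with one JUnit-style report (hypothetical path).
root = pathlib.Path(tempfile.mkdtemp())
report = root / "sql" / "core" / "target" / "test-reports" / "TEST-ExampleSuite.xml"
report.parent.mkdir(parents=True)
report.write_text("<testsuite/>")

# '**' matches any number of intermediate directories, so the report is
# found wherever it sits; if no module produced reports, the list is empty.
matches = sorted(root.glob("**/target/test-reports/*.xml"))
print([m.name for m in matches])  # prints ['TEST-ExampleSuite.xml']
```

Here `pyspark-errors` produced no `target/test-reports` directory at all, so the glob matched nothing and no artifact was uploaded.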

Artifacts

Produced during runtime
Name | Size | Status
test-results-catalyst, hive-thriftserver--17-hadoop3-hive2.3 | 2.79 MB | Expired
test-results-docker-integration--17-hadoop3-hive2.3 | 119 KB | Expired
test-results-hive-- other tests-17-hadoop3-hive2.3 | 911 KB | Expired
test-results-hive-- slow tests-17-hadoop3-hive2.3 | 853 KB | Expired
test-results-pyspark-connect--17-hadoop3-hive2.3 | 412 KB | Expired
test-results-pyspark-core, pyspark-streaming--17-hadoop3-hive2.3 | 80.5 KB | Expired
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3 | 1.42 MB | Expired
test-results-pyspark-pandas--17-hadoop3-hive2.3 | 1.14 MB | Expired
test-results-pyspark-pandas-connect-part0--17-hadoop3-hive2.3 | 1.06 MB | Expired
test-results-pyspark-pandas-connect-part1--17-hadoop3-hive2.3 | 972 KB | Expired
test-results-pyspark-pandas-connect-part2--17-hadoop3-hive2.3 | 637 KB | Expired
test-results-pyspark-pandas-connect-part3--17-hadoop3-hive2.3 | 326 KB | Expired
test-results-pyspark-pandas-slow--17-hadoop3-hive2.3 | 1.85 MB | Expired
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3 | 392 KB | Expired
test-results-sparkr--17-hadoop3-hive2.3 | 280 KB | Expired
test-results-sql-- extended tests-17-hadoop3-hive2.3 | 2.96 MB | Expired
test-results-sql-- other tests-17-hadoop3-hive2.3 | 4.24 MB | Expired
test-results-sql-- slow tests-17-hadoop3-hive2.3 | 2.76 MB | Expired
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, mllib-local, mllib, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf--17-hadoop3-hive2.3 | 323 KB | Expired
test-results-tpcds--17-hadoop3-hive2.3 | 21.8 KB | Expired
unit-tests-log-streaming, sql-kafka-0-10, streaming-kafka-0-10, mllib-local, mllib, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf--17-hadoop3-hive2.3 | 230 MB | Expired