
Update core/src/main/scala/org/apache/spark/internal/config/ConfigEnt… #79

Re-run triggered: June 21, 2024 06:30
Status: Failure
Total duration: 57m 32s
Artifacts: 25

build_main.yml (on: push)
Run / Check changes (33s)
Run / Base image build (48s)
Run / Protobuf breaking change detection and Python CodeGen check (1m 50s)
Run / Run TPC-DS queries with SF=1 (1h 25m)
Run / Run Docker integration tests (41m 31s)
Run / Run Spark on Kubernetes Integration test (57m 22s)
Run / Run Spark UI tests (19s)
Matrix: Run / build
Run / Build modules: sparkr (27m 25s)
Run / Linters, licenses, and dependencies (30m 12s)
Run / Documentation generation (34m 46s)
Matrix: Run / pyspark

Annotations

12 errors and 1 warning
Run / Build modules: pyspark-mllib, pyspark-ml, pyspark-ml-connect
    The job running on runner GitHub Actions 7 has exceeded the maximum execution time of 180 minutes.
Run / Build modules: pyspark-mllib, pyspark-ml, pyspark-ml-connect
    The operation was canceled.
Run / Run Spark on Kubernetes Integration test
    HashSet() did not contain "decomtest-bd55e29039945f60-exec-1".
Run / Run Spark on Kubernetes Integration test
    HashSet() did not contain "decomtest-98cf789039953cba-exec-1".
Run / Run Spark on Kubernetes Integration test
    sleep interrupted
Run / Run Spark on Kubernetes Integration test
    Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$622/0x00007f168c4fa1a0@558b2204 rejected from java.util.concurrent.ThreadPoolExecutor@1440ea06[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 382]
Run / Run Spark on Kubernetes Integration test
    sleep interrupted
Run / Run Spark on Kubernetes Integration test
    Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$622/0x00007f168c4fa1a0@cd05438 rejected from java.util.concurrent.ThreadPoolExecutor@1440ea06[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 383]
Run / Run Spark on Kubernetes Integration test
    HashSet() did not contain "decomtest-25e8019039a7285e-exec-1".
Run / Run Spark on Kubernetes Integration test
    HashSet() did not contain "decomtest-06ca659039a809cc-exec-1".
Run / Run Spark on Kubernetes Integration test
    HashSet() did not contain "decomtest-ea86b09039ab9fa9-exec-1".
Run / Run Spark on Kubernetes Integration test
    Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-9aedf5ef308345ebb62d4540450eadca-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-9aedf5ef308345ebb62d4540450eadca-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..
Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming
    No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.