[SPARK-44916][DOCS][TESTS] Document Spark Driver Live Log UI
### What changes were proposed in this pull request?

This PR aims to document the `Spark Driver Live Log UI`. In addition, it fixes a test case to clean up its test directory properly.

### Why are the changes needed?

To help users use this feature easily, especially in Kubernetes environments.

**1. `Spark Configuration` page**
![Screenshot 2023-08-22 at 1 48 27 PM](https://github.com/apache/spark/assets/9700541/cccbbdb4-9bde-43bc-af0c-8b182436c4bb)

**2. `Running Spark on Kubernetes` page**
![Screenshot 2023-08-22 at 1 48 46 PM](https://github.com/apache/spark/assets/9700541/fb186cdb-95f7-4ce5-bb57-db6d5621a50a)

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Manual review.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#42615 from dongjoon-hyun/SPARK-44916.

Lead-authored-by: Dongjoon Hyun <dhyun@apple.com>
Co-authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
dongjoon-hyun committed Aug 22, 2023
1 parent ce50a56 commit 9b75768
Showing 3 changed files with 25 additions and 2 deletions.
```diff
@@ -97,7 +97,9 @@ class DriverLoggerSuite extends SparkFunSuite with LocalSparkContext {
   test("SPARK-44214: DriverLogger.apply returns None when only spark.driver.log.localDir exists") {
     val sparkConf = new SparkConf()
     assert(DriverLogger(sparkConf).isEmpty)
-    assert(DriverLogger(sparkConf.set(DRIVER_LOG_LOCAL_DIR, "file://tmp/")).isEmpty)
+    withTempDir { dir =>
+      assert(DriverLogger(sparkConf.set(DRIVER_LOG_LOCAL_DIR, dir.getCanonicalPath)).isEmpty)
+    }
   }

   private def getSparkContext(): SparkContext = {
```
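The test fix above swaps a literal `"file://tmp/"` path for a managed temporary directory, so the directory is always removed after the test runs. A minimal Python analogue of that `withTempDir` idiom (a hypothetical sketch of the pattern, not Spark's actual Scala helper):

```python
import os
import tempfile


def with_temp_dir(body):
    """Run `body` with a fresh temp directory, deleting the directory
    afterwards even if `body` raises -- the property the test fix relies on."""
    with tempfile.TemporaryDirectory(prefix="spark-test-") as d:
        body(d)
        return d  # path string only; the directory is gone once the `with` exits


# The directory exists inside the body and is cleaned up afterwards.
path = with_temp_dir(lambda d: print(os.path.isdir(d)))  # → True
print(os.path.isdir(path))  # → False
```

Writing into a guaranteed-fresh directory (rather than a hardcoded path like `/tmp`) also keeps parallel test runs from interfering with each other.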
10 changes: 9 additions & 1 deletion docs/configuration.md
```diff
@@ -430,6 +430,14 @@ of the most common options to set are:
   </td>
   <td>1.3.0</td>
 </tr>
+<tr>
+  <td><code>spark.driver.log.localDir</code></td>
+  <td>(none)</td>
+  <td>
+    Specifies a local directory to write driver logs and enable Driver Log UI Tab.
+  </td>
+  <td>4.0.0</td>
+</tr>
 <tr>
   <td><code>spark.driver.log.dfsDir</code></td>
   <td>(none)</td>
@@ -460,7 +468,7 @@ of the most common options to set are:
   <td><code>spark.driver.log.layout</code></td>
   <td>%d{yy/MM/dd HH:mm:ss.SSS} %t %p %c{1}: %m%n%ex</td>
   <td>
-    The layout for the driver logs that are synced to <code>spark.driver.log.dfsDir</code>. If this is not configured,
+    The layout for the driver logs that are synced to <code>spark.driver.log.localDir</code> and <code>spark.driver.log.dfsDir</code>. If this is not configured,
     it uses the layout for the first appender defined in log4j2.properties. If that is also not configured, driver logs
     use the default layout.
   </td>
```
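As a usage sketch, the new property pairs naturally with the existing layout option in `spark-defaults.conf` (the directory path and layout pattern below are hypothetical examples, not defaults):

```
# Enable the Driver Log UI by writing driver logs to a local directory
spark.driver.log.localDir  /tmp/spark-driver-logs

# Optional: custom layout applied to the synced driver logs
spark.driver.log.layout    %d{yy/MM/dd HH:mm:ss.SSS} %p %c{1}: %m%n%ex
```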
13 changes: 13 additions & 0 deletions docs/running-on-kubernetes.md
````diff
@@ -441,6 +441,19 @@ $ kubectl port-forward <driver-pod-name> 4040:4040
 
 Then, the Spark driver UI can be accessed on `http://localhost:4040`.
 
+Since Apache Spark 4.0.0, Driver UI provides a way to see driver logs via a new configuration.
+
+```
+spark.driver.log.localDir=/tmp
+```
+
+Then, the Spark driver UI can be accessed on `http://localhost:4040/logs/`.
+Optionally, the layout of log is configured by the following.
+
+```
+spark.driver.log.layout="%m%n%ex"
+```
+
 ### Debugging
 
 There may be several kinds of failures. If the Kubernetes API server rejects the request made from spark-submit, or the
````
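For Kubernetes users, the flow the new docs describe can be sketched as the following command sequence (the pod name is a placeholder, and this assumes the driver was started with `spark.driver.log.localDir` set):

```
# Forward the driver UI port from the driver pod to the local machine
$ kubectl port-forward <driver-pod-name> 4040:4040

# Fetch the live driver logs from the Driver Log UI endpoint
$ curl http://localhost:4040/logs/
```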
