[SPARK-28881][PYTHON][TESTS][FOLLOW-UP] Use SparkSession(SparkContext(...)) to prevent Spark conf from affecting other tests

### What changes were proposed in this pull request?

This PR proposes to make the test match branch-2.4. See #25593 (comment).

It seems that using `SparkSession.builder` with a Spark conf can affect other tests.
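
For context, a minimal sketch of the difference (illustrative only; the app name, the `try`/`finally` teardown, and the surrounding structure are assumptions, while the conf keys come from the diff below). The builder's `getOrCreate()` may reuse or leave behind a shared session carrying the extra conf, whereas building the session from a dedicated `SparkContext` keeps the conf scoped to that context and lets the test tear it down cleanly:

```python
from pyspark import SparkConf, SparkContext
from pyspark.sql import SparkSession

# Builder-based setup: getOrCreate() may return an already-running session,
# so "spark.driver.maxResultSize" can end up on state shared with other tests.
#
#   spark = SparkSession.builder \
#       .master("local[4]") \
#       .config("spark.driver.maxResultSize", "10k") \
#       .getOrCreate()

# Setup from a dedicated SparkContext: the conf lives and dies with this
# context, so it cannot bleed into tests that run afterwards.
conf = SparkConf().set("spark.driver.maxResultSize", "10k")
sc = SparkContext("local[4]", "MaxResultArrowTests", conf=conf)
spark = SparkSession(sc)
try:
    spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")
    # ... assertions that rely on the small maxResultSize would go here ...
finally:
    sc.stop()  # stopping the context tears the conf down with it
```

In the test itself the setup side of this happens in `setUpClass`, as shown in the diff below.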

### Why are the changes needed?
To match branch-2.4 and to make backporting easier.

### Does this PR introduce any user-facing change?
No.

### How was this patch tested?
Test was fixed.

Closes #25603 from HyukjinKwon/SPARK-28881-followup.

Authored-by: HyukjinKwon <gurwls223@apache.org>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>
HyukjinKwon committed Aug 28, 2019
1 parent 90b10b4 commit 8848af2
Showing 1 changed file with 3 additions and 5 deletions: python/pyspark/sql/tests/test_arrow.py
@@ -22,6 +22,7 @@
 import unittest
 import warnings
 
+from pyspark import SparkContext, SparkConf
 from pyspark.sql import Row, SparkSession
 from pyspark.sql.functions import udf
 from pyspark.sql.types import *
@@ -430,11 +431,8 @@ class MaxResultArrowTests(unittest.TestCase):
 
     @classmethod
     def setUpClass(cls):
-        cls.spark = SparkSession.builder \
-            .master("local[4]") \
-            .appName(cls.__name__) \
-            .config("spark.driver.maxResultSize", "10k") \
-            .getOrCreate()
+        cls.spark = SparkSession(SparkContext(
+            'local[4]', cls.__name__, conf=SparkConf().set("spark.driver.maxResultSize", "10k")))
 
         # Explicitly enable Arrow and disable fallback.
         cls.spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")
