[SPARK-1550] Fixed - Successive creation of spark context fails in pyspark, if the previous initialization of spark context had failed. #478

Closed · wants to merge 1 commit
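For context, the failure this patch addresses can be reproduced along the following lines. It is a minimal sketch, assuming a local pyspark installation from before the fix: a constructor call that fails validation leaves `SparkContext._active_spark_context` pointing at the half-built instance, so every later attempt to create a context is rejected.

```python
# Minimal reproduction sketch (pre-fix behaviour, assumes a local pyspark install).
from pyspark.context import SparkContext

try:
    # Missing application name -> __init__ raises, but before this patch it
    # left SparkContext._active_spark_context pointing at the broken instance.
    SparkContext('local')
except Exception as e:
    print("first attempt failed as expected:", e)

# Before the fix this second, perfectly valid call also failed with
# "Cannot run multiple SparkContexts at once"; with the patch it succeeds.
sc = SparkContext('local', 'test')
sc.stop()
```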
9 changes: 9 additions & 0 deletions python/pyspark/context.py
@@ -79,6 +79,11 @@ def __init__(self, master=None, appName=None, sparkHome=None, pyFiles=None,
 
 
         >>> from pyspark.context import SparkContext
+        >>> s1 = SparkContext('local') # doctest: +IGNORE_EXCEPTION_DETAIL
+        Traceback (most recent call last):
+            ...
+        Exception:...
+
         >>> sc = SparkContext('local', 'test')
 
         >>> sc2 = SparkContext('local', 'test2') # doctest: +IGNORE_EXCEPTION_DETAIL
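The new doctest uses the standard `# doctest: +IGNORE_EXCEPTION_DETAIL` directive, which makes doctest compare only the exception type and ignore the message, so `Exception:...` matches whichever wording the constructor raises. A small standalone illustration, independent of Spark:

```python
import doctest

def must_fail():
    """
    >>> must_fail()  # doctest: +IGNORE_EXCEPTION_DETAIL
    Traceback (most recent call last):
        ...
    ValueError: the expected message here is ignored
    """
    raise ValueError("any message passes; only the exception type is checked")

if __name__ == "__main__":
    doctest.testmod()
```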
@@ -116,8 +121,12 @@ def __init__(self, master=None, appName=None, sparkHome=None, pyFiles=None,
 
         # Check that we have at least the required parameters
         if not self._conf.contains("spark.master"):
+            with SparkContext._lock:
+                SparkContext._active_spark_context = None
             raise Exception("A master URL must be set in your configuration")
         if not self._conf.contains("spark.app.name"):
+            with SparkContext._lock:
+                SparkContext._active_spark_context = None
             raise Exception("An application name must be set in your configuration")
 
         # Read back our properties from the conf in case we loaded some of them from
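Why resetting `SparkContext._active_spark_context` matters: `__init__` first registers the instance through the class-level singleton guard and only afterwards validates the configuration, so raising during validation used to leave a stale registration behind. The following is a simplified, illustrative sketch of that guard (not the verbatim pyspark code); with the stale reference cleared before raising, as this patch does, later constructor calls no longer hit the `ValueError`.

```python
import threading

class SparkContextSketch(object):
    """Simplified model of pyspark's class-level singleton guard (illustrative only)."""

    _lock = threading.Lock()
    _active_spark_context = None

    def __init__(self, master=None, app_name=None):
        # Step 1: register this instance as the active context (roughly what
        # SparkContext._ensure_initialized does under SparkContext._lock).
        with SparkContextSketch._lock:
            if SparkContextSketch._active_spark_context is not None:
                raise ValueError("Cannot run multiple SparkContexts at once")
            SparkContextSketch._active_spark_context = self

        # Step 2: validate required settings. Without the patch, raising here
        # left the stale registration behind and blocked all later attempts.
        if master is None:
            with SparkContextSketch._lock:
                SparkContextSketch._active_spark_context = None  # the fix
            raise Exception("A master URL must be set in your configuration")
        if app_name is None:
            with SparkContextSketch._lock:
                SparkContextSketch._active_spark_context = None  # the fix
            raise Exception("An application name must be set in your configuration")
```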