

@HyukjinKwon (Member) commented Dec 13, 2018

What changes were proposed in this pull request?

Multiple SparkContexts are discouraged, and Spark has been warning about them for the last 4 years; see SPARK-4180. They can cause arbitrary and mysterious error cases; see SPARK-2243.

Honestly, I didn't even know Spark still allowed this; it looks like it was never officially supported, see SPARK-2243.

I believe now is a good time to remove this configuration.

How was this patch tested?

The docs were manually checked, and the behavior was manually tested:

```
$ ./bin/spark-shell --conf=spark.driver.allowMultipleContexts=true
...
scala> new SparkContext()
org.apache.spark.SparkException: Only one SparkContext should be running in this JVM (see SPARK-2243).The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:939)
...
org.apache.spark.SparkContext$.$anonfun$assertNoOtherContextIsRunning$2(SparkContext.scala:2435)
  at scala.Option.foreach(Option.scala:274)
  at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2432)
  at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2509)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:112)
  ... 49 elided
```
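
For context (and not part of the patch itself), the supported pattern when code may run in a JVM that already has a context is to reuse the single instance rather than constructing a second one. A minimal sketch using the existing SparkContext.getOrCreate and SparkSession.builder APIs; the app name and master below are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

object SingleContextExample {
  def main(args: Array[String]): Unit = {
    // Placeholder app name and master, for illustration only.
    val conf = new SparkConf().setAppName("single-context-example").setMaster("local[*]")

    // Returns the SparkContext already running in this JVM if there is one,
    // otherwise creates it; it never constructs a second context.
    val sc = SparkContext.getOrCreate(conf)

    // The SparkSession-level equivalent reuses the same underlying SparkContext.
    val spark = SparkSession.builder().config(conf).getOrCreate()

    println(sc.parallelize(1 to 10).sum())
    spark.stop()
  }
}
```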

@HyukjinKwon (Member, Author)

adding @srowen, @JoshRosen, @rxin


@srowen (Member) commented Dec 13, 2018

Honestly, I think we can remove this. It's been bad practice for years, and keeping the support means it stays in Spark for years. This mode doesn't really work.

@HyukjinKwon (Member, Author)

Yeah, I actually wanted to remove this but made it deprecated in case some people had a different view. +1 for just removing it. Let me update it tomorrow if there are no comments against removal.

@rxin (Contributor) commented Dec 13, 2018 via email

apache deleted a comment from AmplabJenkins Dec 14, 2018

@HyukjinKwon (Member, Author)

```
[error]  * method setActiveContext(org.apache.spark.SparkContext,Boolean)Unit in object org.apache.spark.SparkContext does not have a correspondent in current version
[error]    filter with: ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SparkContext.setActiveContext")
[error]  * method markPartiallyConstructed(org.apache.spark.SparkContext,Boolean)Unit in object org.apache.spark.SparkContext does not have a correspondent in current version
[error]    filter with: ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SparkContext.markPartiallyConstructed")
```

Neither is a public API. I'm going to exclude these in MiMa.
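
For reference, the filters suggested by the MiMa output above go into project/MimaExcludes.scala. A minimal sketch, assuming the usual shape of that file (the enclosing object and value name here are illustrative, not the exact ones in the repository):

```scala
import com.typesafe.tools.mima.core._

object MimaExcludes {
  // Removing 'spark.driver.allowMultipleContexts' changed the signatures of these
  // private[spark] helpers, so the old methods no longer exist in the new version
  // and have to be excluded from the binary-compatibility check.
  lazy val v30excludes: Seq[ProblemFilter] = Seq(
    ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SparkContext.setActiveContext"),
    ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SparkContext.markPartiallyConstructed")
  )
}
```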

@gatorsmile (Member)

cc @jiangxb1987

HyukjinKwon changed the title from "[SPARK-26362][CORE] Deprecate 'spark.driver.allowMultipleContexts' to discourage multiple creation of SparkContexts" to "[SPARK-26362][CORE] Remove 'spark.driver.allowMultipleContexts' to discourage multiple creation of SparkContexts" Dec 14, 2018

@HeartSaVioR (Contributor)

+1 on removing it, and IMHO it might be better to also deprecate it in the active 2.x version line.
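
Not something this PR does, but as a rough sketch of that suggestion: on the 2.x line a deprecation could be as small as a warning logged when the flag is set. The helper below is illustrative only; in practice the check would live inside SparkContext itself, and the wording is an assumption:

```scala
import org.apache.spark.SparkConf
import org.slf4j.LoggerFactory

object AllowMultipleContextsDeprecation {
  private val log = LoggerFactory.getLogger(getClass)

  // Warn if the soon-to-be-removed flag is explicitly enabled.
  def warnIfEnabled(conf: SparkConf): Unit = {
    if (conf.getBoolean("spark.driver.allowMultipleContexts", false)) {
      log.warn(
        "spark.driver.allowMultipleContexts is deprecated and will be removed; " +
          "only one SparkContext should be running in this JVM (see SPARK-2243).")
    }
  }
}
```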

HyukjinKwon changed the title from "[SPARK-26362][CORE] Remove 'spark.driver.allowMultipleContexts' to discourage multiple creation of SparkContexts" to "[SPARK-26362][CORE] Remove 'spark.driver.allowMultipleContexts' to disallow multiple creation of SparkContexts" Dec 14, 2018

@HyukjinKwon (Member, Author)

retest this please


SparkQA commented Dec 14, 2018

Test build #100126 has finished for PR 23311 at commit 6482e14.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.


SparkQA commented Dec 14, 2018

Test build #100131 has finished for PR 23311 at commit 281cc38.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon (Member, Author)

retest this please


SparkQA commented Dec 14, 2018

Test build #100142 has finished for PR 23311 at commit 281cc38.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon (Member, Author)

Merged to master.

asfgit closed this in 9ccae0c Dec 15, 2018
@jiangxb1987 (Contributor)

late LGTM :)

holdenk pushed a commit to holdenk/spark that referenced this pull request Jan 5, 2019: [SPARK-26362][CORE] Remove 'spark.driver.allowMultipleContexts' to disallow multiple creation of SparkContexts

jackylee-ch pushed a commit to jackylee-ch/spark that referenced this pull request Feb 18, 2019: [SPARK-26362][CORE] Remove 'spark.driver.allowMultipleContexts' to disallow multiple creation of SparkContexts

HyukjinKwon deleted the SPARK-26362 branch March 3, 2020 01:20
Ennosigaeon pushed a commit to Ennosigaeon/spark-jobserver that referenced this pull request Apr 8, 2021
…ingle JVM

In Spark 3, the option to run multiple Spark contexts in a single JVM is no longer supported (see apache/spark#23311). As this approach had not been recommended for a long time (see https://issues.apache.org/jira/browse/SPARK-2243), I removed the option for Spark 2.4 as well. The InProcessContextSupervisor (previously LocalContextSupervisor) still exists, but it should only be used for testing and local development, not in a production environment.