Conversation

@adrian-wang
Contributor

Since JVM 1.8.0, MaxPermSize is no longer supported. Spark's stderr output then contains a line like:

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0

@SparkQA

SparkQA commented Aug 15, 2014

QA tests have started for PR 1967 at commit 9c32941.

  • This patch merges cleanly.

@SparkQA

SparkQA commented Aug 15, 2014

QA tests have finished for PR 1967 at commit 9c32941.

  • This patch fails unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Contributor

Is all this code and munging in case JAVA_HOME is not the same Java as the Worker was started with? Otherwise it seems we could just do sys.env("java.version").

Contributor Author

We now launch executors with the same Java the worker was started with, so whether to set the perm size should be decided from that specific Java's version.
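The version check being discussed could be sketched as follows. This is a minimal illustration, not the actual Spark patch; the class and method names are hypothetical. It parses the `java.version` string (both the legacy `1.7.0_55` scheme and the post-JEP-223 `11.0.2` scheme) and only emits `-XX:MaxPermSize` for JVMs older than 8:

```java
import java.util.ArrayList;
import java.util.List;

public class PermGenCheck {
    // Returns true when the version string denotes a JVM older than 8,
    // i.e. one that still honors -XX:MaxPermSize.
    static boolean supportsMaxPermSize(String javaVersion) {
        String[] parts = javaVersion.split("\\.");
        // Strip any non-numeric suffix, e.g. "0_55" -> "0".
        int major = Integer.parseInt(parts[0].replaceAll("[^0-9].*$", ""));
        if (major == 1 && parts.length > 1) {
            // Legacy "1.x" scheme: the second component is the real major version.
            major = Integer.parseInt(parts[1].replaceAll("[^0-9].*$", ""));
        }
        return major < 8;
    }

    public static void main(String[] args) {
        List<String> jvmOpts = new ArrayList<>();
        // Only pass the flag when the target JVM still supports it.
        if (supportsMaxPermSize(System.getProperty("java.version"))) {
            jvmOpts.add("-XX:MaxPermSize=128m");
        }
        System.out.println(supportsMaxPermSize("1.7.0_55")); // true
        System.out.println(supportsMaxPermSize("1.8.0_20")); // false
        System.out.println(supportsMaxPermSize("11.0.2"));   // false
    }
}
```

Note that the check has to run against the Java binary the worker will actually use to launch executors, not necessarily the one named by `JAVA_HOME`, which is the point of the discussion above.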

@adrian-wang adrian-wang deleted the maxpermsize branch August 18, 2014 05:09
@adrian-wang
Contributor Author

Hi @aarondav, could you please help review this in PR #2011?
