[SPARK-2678][Core] Added "--" to prevent spark-submit from shadowing application options #1715
**sbin/spark-sql**

```diff
@@ -26,11 +26,16 @@ set -o posix
 # Figure out where Spark is installed
 FWDIR="$(cd `dirname $0`/..; pwd)"
 
-if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
-  echo "Usage: ./sbin/spark-sql [options]"
-  $FWDIR/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
+CLASS="org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver"
+
+if [[ "$@" = --help ]] || [[ "$@" = -h ]]; then
+  echo "Usage: ./sbin/spark-sql [options] [--] [CLI options]"
+  exec "$FWDIR"/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
+  echo
+  echo "CLI options:"
+  exec "$FWDIR"/bin/spark-submit spark-internal --class $CLASS -- -H 2>&1 | tail -n +3
+  echo
   exit 0
 fi
 
-CLASS="org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver"
 exec "$FWDIR"/bin/spark-submit --class $CLASS spark-internal $@
```

> **Contributor (author), on the `exec "$FWDIR"/bin/spark-submit --help` line:** Actually, if redirection is used, … But indeed, I didn't notice this issue while adding this change. Should use …
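The old wrapper check `[[ "$@" = *--help ]]` fires whenever the argument list merely ends with `--help`, so a help flag intended for the CLI was swallowed by the wrapper. A toy reproduction of that shadowing (function names hypothetical; POSIX `case` patterns stand in for bash's `[[ ]]` glob match):

```shell
# Toy reproduction of the "shadowing" problem this PR fixes (function names
# hypothetical; POSIX case patterns stand in for bash's [[ ]] glob match).
wants_help_old() {
  # Old check: suffix match over the whole argument list, so an application's
  # trailing --help also triggers the wrapper's help path.
  case "$*" in
    *--help) echo yes ;;
    *) echo no ;;
  esac
}

wants_help_new() {
  # New check: exact match, i.e. --help must be the only argument.
  case "$*" in
    --help) echo yes ;;
    *) echo no ;;
  esac
}

wants_help_old query.sql --help   # yes: the CLI's --help is shadowed
wants_help_new query.sql --help   # no: forwarded to the application
wants_help_new --help             # yes: help really was requested
```

With the exact match, the wrapper's help path only triggers when `--help` is the sole argument, leaving the application's own `--help` free to travel past the `--` separator.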
**SparkSubmitArguments.scala**

```diff
@@ -311,6 +311,15 @@ private[spark] class SparkSubmitArguments(args: Seq[String]) {
         verbose = true
         parse(tail)
 
+      case "--" :: tail =>
+        if (inSparkOpts) {
+          SparkSubmit.printErrorAndExit(
+            "Application option separator \"--\" must be after the primary resource " +
+            "(i.e., application jar file or Python file).")
+        } else {
+          childArgs ++= tail.filter(_.nonEmpty)
+        }
+
       case value :: tail =>
         if (inSparkOpts) {
           value match {
```

> **Contributor, on the `printErrorAndExit` call:** Couldn't you just omit this and just delegate to the existing check in …
>
> **Contributor (author):** Fair, thanks.

> **Contributor, on `childArgs ++= tail.filter(_.nonEmpty)`:** I wouldn't filter. If a user has gone through the trouble of quoting things to explicitly include an empty argument, it should probably stay there.
>
> **Contributor (author):** This behavior is actually inherited from the current master code. But I agree with you, since …
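On the `filter(_.nonEmpty)` point above, a quick shell illustration (helper name hypothetical) of why an explicitly quoted empty argument is worth preserving when scripts forward arguments:

```shell
# A deliberately empty, quoted argument survives a quoted "$@" forward but is
# dropped by unquoted $@ (or, analogously, by a nonEmpty filter).
count_args() { echo $#; }

set -- first "" last    # three positional parameters, one explicitly empty
count_args "$@"         # prints 3: the empty argument is preserved
count_args $@           # prints 2: word splitting discards it
```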
```diff
@@ -377,8 +386,11 @@ private[spark] class SparkSubmitArguments(args: Seq[String]) {
        |
        |  --executor-memory MEM      Memory per executor (e.g. 1000M, 2G) (Default: 1G).
        |
-       |  --help, -h                 Show this help message and exit
-       |  --verbose, -v              Print additional debug output
+       |  --help, -h                 Show this help message and exit.
+       |  --verbose, -v              Print additional debug output.
+       |
+       |  --                         A "--" signals the end of spark-submit options, all command
+       |                             line arguments after "--" are passed to the application.
        |
        | Spark standalone with cluster deploy mode only:
        |  --driver-cores NUM         Cores for driver (Default: 1).
```
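The documented split can be sketched as a small shell routine (name and structure hypothetical; the real parsing lives in `SparkSubmitArguments.parse`):

```shell
# Hypothetical sketch of the documented "--" semantics: arguments before the
# separator are spark-submit options, everything after it goes to the app.
split_at_separator() {
  submit_opts=""
  app_args=""
  seen=false
  for arg in "$@"; do
    if [ "$seen" = true ]; then
      app_args="$app_args $arg"         # after "--": belongs to the application
    elif [ "$arg" = "--" ]; then
      seen=true                         # the separator itself is consumed
    else
      submit_opts="$submit_opts $arg"   # before "--": a spark-submit option
    fi
  done
  echo "submit:$submit_opts"
  echo "app:$app_args"
}

split_at_separator --verbose app.jar -- --help
# submit: --verbose app.jar
# app: --help
```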
**sbin/start-thriftserver.sh**

```diff
@@ -26,11 +26,16 @@ set -o posix
 # Figure out where Spark is installed
 FWDIR="$(cd `dirname $0`/..; pwd)"
 
-if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
-  echo "Usage: ./sbin/start-thriftserver [options]"
-  $FWDIR/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
+CLASS="org.apache.spark.sql.hive.thriftserver.HiveThriftServer2"
+
+if [[ "$@" = --help ]] || [[ "$@" = -h ]]; then
+  echo "Usage: ./sbin/start-thriftserver.sh [options] [--] [thrift server options]"
+  exec "$FWDIR"/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
+  echo
+  echo "Thrift server options:"
+  exec "$FWDIR"/bin/spark-submit spark-internal --class $CLASS -- -H 2>&1 | tail -n +3
+  echo
   exit 0
 fi
 
-CLASS="org.apache.spark.sql.hive.thriftserver.HiveThriftServer2"
 exec "$FWDIR"/bin/spark-submit --class $CLASS spark-internal $@
```

> **Contributor, on the `exec "$FWDIR"/bin/spark-submit --help` line:** Same deal with …
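The `tail -n +3` in both wrappers' help paths starts printing at the third line, dropping the two leading lines of the forwarded `-H` output. A minimal demonstration (sample text hypothetical):

```shell
# tail -n +3 starts output at line 3, which is how the help path strips the
# two leading lines from the forwarded "-H" output.
printf 'usage header\nblank line\noption line 1\noption line 2\n' | tail -n +3
# option line 1
# option line 2
```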
> **Contributor:** These changes seem unrelated. Is there a bug you can mention? Otherwise, could you call them out explicitly in the PR description?
>
> **Contributor (author):** My bad, you're right. I should update the PR description. While working on #1620, the `beeline` script had once been implemented with `spark-submit` to avoid a duplicated `java` check and classpath computation, but was then reverted because of the issue this PR is trying to fix (`beeline --help` shows the `spark-submit` usage message). And while working on this PR, I realized that `beeline` is only a JDBC client, unrelated to Spark, so I can just start it with `spark-class`. That's why this change appears here.