@@ -34,26 +34,25 @@ private[history] class HistoryServerArguments(conf: SparkConf, args: Array[String])

   @tailrec
   private def parse(args: List[String]): Unit = {
-    if (args.length == 1) {
-      setLogDirectory(args.head)
-    } else {
-      args match {
-        case ("--dir" | "-d") :: value :: tail =>
-          setLogDirectory(value)
-          parse(tail)
-
-        case ("--help" | "-h") :: tail =>
-          printUsageAndExit(0)
-
-        case ("--properties-file") :: value :: tail =>
-          propertiesFile = value
-          parse(tail)
-
-        case Nil =>
-
-        case _ =>
-          printUsageAndExit(1)
-      }
-    }
+    args match {
+      case ("--dir" | "-d") :: value :: tail =>
+        setLogDirectory(value)
+        parse(tail)
+
+      case ("--help" | "-h") :: tail =>
+        printUsageAndExit(0)
+
+      case ("--properties-file") :: value :: tail =>
+        propertiesFile = value
+        parse(tail)
+
+      case dir :: Nil =>
Contributor:
IIUC this is not related to the PR description?

@gengliangwang (Member, Author), Oct 11, 2018:
No, it is related. The last single argument is treated as the event log directory.
See the deleted code:

if (args.length == 1) {
  setLogDirectory(args.head)
}

I prefer to keep the behavior.

Contributor:
sounds good.

@gengliangwang (Member, Author):

I see. I have updated the description.

+        setLogDirectory(value = dir)
+
+      case Nil =>
+
+      case _ =>
+        printUsageAndExit(1)
+    }
   }
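The `@tailrec` list-pattern technique in the hunk above can be sketched in isolation. This is a minimal, hypothetical reconstruction (the object name `ParseSketch` and the `logDir`/`propertiesFile` fields are illustrative, not Spark's actual API), showing how a single trailing bare argument is still accepted as the log directory, preserving the old `args.length == 1` behavior:

```scala
import scala.annotation.tailrec

// Hypothetical sketch of the argument-parsing technique from the diff:
// a tail-recursive pattern match over the argument list.
object ParseSketch {
  var logDir: Option[String] = None
  var propertiesFile: Option[String] = None

  @tailrec
  def parse(args: List[String]): Unit = args match {
    case ("--dir" | "-d") :: value :: tail =>
      logDir = Some(value)
      parse(tail)

    case "--properties-file" :: value :: tail =>
      propertiesFile = Some(value)
      parse(tail)

    // A single remaining bare argument is treated as the log directory,
    // matching the deleted `if (args.length == 1)` branch.
    case dir :: Nil =>
      logDir = Some(dir)

    case Nil => // all arguments consumed

    case _ =>
      throw new IllegalArgumentException("unrecognized arguments: " + args)
  }
}
```

Because each `case` either terminates or calls `parse(tail)` in tail position, the `@tailrec` annotation is satisfied and the compiler rewrites the recursion into a loop. For example, `ParseSketch.parse(List("--properties-file", "my.conf", "/tmp/spark-events"))` consumes the flag and its value, then binds the trailing `/tmp/spark-events` as the log directory.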

17 changes: 16 additions & 1 deletion sbin/start-history-server.sh
@@ -28,7 +28,22 @@ if [ -z "${SPARK_HOME}" ]; then
export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"
fi

+# NOTE: This exact class name is matched downstream by SparkSubmit.
+# Any changes need to be reflected there.
+CLASS="org.apache.spark.deploy.history.HistoryServer"
+
+if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
+  echo "Usage: ./sbin/start-history-server.sh [options]"
Contributor:
nit: why not have a separated usage() function?

@gengliangwang (Member, Author):

Well this is short, and I am following what start-master.sh and start-slave.sh did.

Contributor:

Well, I also saw similar code in start-thriftserver.sh; it uses a usage() function. Both are fine to me, just a heads up to make sure we've taken that into consideration.

+  pattern="Usage:"
+  pattern+="\|Using Spark's default log4j profile:"
+  pattern+="\|Started daemon with process name"
+  pattern+="\|Registered signal handler for"
+
+  "${SPARK_HOME}"/bin/spark-class $CLASS --help 2>&1 | grep -v "$pattern" 1>&2
+  exit 1
+fi

. "${SPARK_HOME}/sbin/spark-config.sh"
. "${SPARK_HOME}/bin/load-spark-env.sh"

-exec "${SPARK_HOME}/sbin"/spark-daemon.sh start org.apache.spark.deploy.history.HistoryServer 1 "$@"
+exec "${SPARK_HOME}/sbin"/spark-daemon.sh start $CLASS 1 "$@"