@@ -521,7 +521,9 @@ object SparkSubmit {
sysProps.put("spark.yarn.isPython", "true")
}
if (args.principal != null) {
-  require(args.keytab != null, "Keytab must be specified when the keytab is specified")
+  require(args.keytab != null, "Keytab must be specified when principal is specified")
+  sysProps.put("spark.yarn.keytab", args.keytab)
+  sysProps.put("spark.yarn.principal", args.principal)
Contributor:
@harishreedharan I see you changed this part of the code last time. If we want to pass these two arguments to Spark SQL, what is the recommended way?

Contributor Author:
@harishreedharan Hi Hari, could you let us know the preferred way to pass the principal and keytab parameters from spark-submit to Spark SQL? We are waiting for your response to proceed. Thank you!

Contributor:
We might want to look at setupCredentials() in the YARN Client.scala, since it looks like it's doing something pretty similar.

UserGroupInformation.loginUserFromKeytab(args.principal, args.keytab)
}
}
@@ -21,6 +21,8 @@ import java.io.{File, PrintStream}
import java.util.{Map => JMap}
import javax.annotation.concurrent.GuardedBy

+import org.apache.hadoop.security.UserGroupInformation
Contributor:
Let's move this import down to where we have the other Hadoop-related imports. https://cwiki.apache.org/confluence/display/SPARK/Spark+Code+Style+Guide#SparkCodeStyleGuide-Imports is the doc about import ordering.


import scala.collection.JavaConverters._
import scala.language.reflectiveCalls

@@ -35,7 +37,7 @@ import org.apache.hadoop.hive.ql.{Driver, metadata}
import org.apache.hadoop.hive.shims.{HadoopShims, ShimLoader}
import org.apache.hadoop.util.VersionInfo

-import org.apache.spark.Logging
+import org.apache.spark.{SparkConf, Logging}
import org.apache.spark.sql.catalyst.expressions.Expression
import org.apache.spark.sql.execution.QueryExecutionException
import org.apache.spark.util.{CircularBuffer, Utils}
@@ -150,6 +152,14 @@ private[hive] class ClientWrapper(
val original = Thread.currentThread().getContextClassLoader
// Switch to the initClassLoader.
Thread.currentThread().setContextClassLoader(initClassLoader)

+val sparkConf = new SparkConf
Contributor:
Instead of creating a new SparkConf, can we use SparkEnv.get.conf to get the conf associated with the current SparkContext?

Contributor Author:
@yhuai Sorry for the late response. I tested with SparkEnv.get.conf, but it didn't work. The reason is that the YARN Client.scala resets the spark.yarn.keytab property during setupCredentials(), appending a random string to the keytab file name so it can serve as the link name in the distributed cache. I think the value used for the link name should really be kept separate from the original keytab setting, e.g. under a different property name.
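For readers of this thread, the renaming described above looks roughly like the sketch below. This is a paraphrase of the yarn Client.setupCredentials() behavior under discussion, not the actual Spark source; the object and method names are illustrative.

```scala
import java.io.File
import java.util.UUID

import org.apache.hadoop.security.UserGroupInformation
import org.apache.spark.SparkConf

object SetupCredentialsSketch {
  // Rough paraphrase of the yarn Client.setupCredentials() behavior described
  // above (illustrative, not the actual Spark source): log in from the keytab,
  // then overwrite spark.yarn.keytab with a randomized file name that is later
  // used as the link name in the distributed cache. After this runs, the conf
  // held by the live context no longer carries the original keytab path.
  def setupCredentials(conf: SparkConf, principal: String, keytab: String): Unit = {
    UserGroupInformation.loginUserFromKeytab(principal, keytab)
    val linkName = new File(keytab).getName + "-" + UUID.randomUUID().toString
    conf.set("spark.yarn.keytab", linkName) // the original path is lost here
  }
}
```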

Contributor:
@yolandagao Thanks for the explanation. Can you add comments to your code (including why we need to put those confs into sysProps and why we need to create a new SparkConf here)? Basically, we need to document the flow of how these confs get propagated. Otherwise it is not obvious why we need this change. Thanks!

Contributor Author:
@yhuai Sure. I should have done this earlier to make everything clearer :) Added some comments there; please help review. Thank you!
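As a reading aid, here is a minimal sketch of the propagation the comments describe; the property values are illustrative, not taken from the patch.

```scala
import org.apache.spark.SparkConf

object ConfPropagationSketch extends App {
  // SparkSubmit side: --principal/--keytab are copied into sysProps, which
  // become JVM system properties in the driver process (values illustrative).
  sys.props("spark.yarn.principal") = "user@EXAMPLE.COM"
  sys.props("spark.yarn.keytab") = "/etc/security/keytabs/user.keytab"

  // ClientWrapper side: a fresh SparkConf (loadDefaults = true) re-reads the
  // spark.* system properties, so it still sees the ORIGINAL keytab path even
  // after yarn.Client has rewritten spark.yarn.keytab in the live context's
  // conf (see the sketch above).
  val conf = new SparkConf(loadDefaults = true)
  assert(conf.get("spark.yarn.keytab") == "/etc/security/keytabs/user.keytab")
}
```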

+if (sparkConf.contains("spark.yarn.principal") && sparkConf.contains("spark.yarn.keytab")) {
Contributor:
Let's make it clear that we set these two settings in SparkSubmit.

+  UserGroupInformation.loginUserFromKeytab(
Contributor:
Before calling this, actually verify that the keytab file exists, and fail with a message that includes the property name, to help people debug the problem. UGI's internal exceptions are rarely informative enough.

sparkConf.get("spark.yarn.principal"),
Contributor:
Actually, you should call SparkHadoopUtil.get.loginUserFromKeytab(principalName, keytabFilename); maybe the check could be added there.

Contributor Author:
@steveloughran Good point. Better to check for the existence of the keytab file before making the login call: if the keytab doesn't exist, the UGI call will definitely fail, but with an indirect message like "login failed... no keys found...", etc. Added the check.

However, calling SparkHadoopUtil.get.loginUserFromKeytab instead of UserGroupInformation.loginUserFromKeytab in ClientWrapper will not solve the problem: SparkHadoopUtil is shared, and the UserGroupInformation class it uses is not the same one used by SessionState.start in ClientWrapper, so the program still fails with a no-TGT exception when connecting to the metastore. We are also unable to replace the UGI call in SparkSubmit, since the wrong type of SparkHadoopUtil instance might get created there: the YARN mode flag is not yet set in the system properties until control flows to the YARN Client.scala.

Contributor:
OK, just include the check (actually, UGI itself should do that check, shouldn't it? Lazy). A sketch of the check appears at the end of this section.

sparkConf.get("spark.yarn.keytab"))
}

val ret = try {
val initialConf = new HiveConf(classOf[SessionState])
// HiveConf is a Hadoop Configuration, which has a field of classLoader and
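Finally, a minimal sketch of the keytab-existence check discussed in the thread above; the object name, helper name, and error wording are illustrative, not the final committed code.

```scala
import java.io.File

import org.apache.hadoop.security.UserGroupInformation

object KeytabLoginSketch {
  // Verify the keytab file exists before handing it to UGI, and name the
  // property in the error message so failures are easy to debug; UGI's own
  // failure ("login failed... no keys found...") does not say which setting
  // pointed at the missing file.
  def loginFromKeytabWithCheck(principal: String, keytab: String): Unit = {
    require(new File(keytab).exists(),
      s"Keytab file '$keytab' (from spark.yarn.keytab) does not exist")
    UserGroupInformation.loginUserFromKeytab(principal, keytab)
  }
}
```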