Commits (22)
55a1c31
SPARK-1832 added color to tasks on executor page
ajbozarth Dec 2, 2015
d9f2b82
SPARK-12149 Cleaned up color UI code, fixed tests, added color UI for…
ajbozarth Dec 4, 2015
1da5f1a
Addressed comments
ajbozarth Dec 7, 2015
20c7b48
added MiMa exclude
ajbozarth Dec 8, 2015
1652274
Comments followup
ajbozarth Dec 11, 2015
13d97de
Merge branch 'master' into spark12149
ajbozarth Dec 21, 2015
190a033
Switched green and blue and only color completed when either active o…
ajbozarth Dec 21, 2015
31d6a1e
Updated MiMa exclude for new version
ajbozarth Dec 21, 2015
3194f7d
Merge branch 'master' into spark12149
ajbozarth Jan 4, 2016
34435a5
Merge branch 'master' into spark12149
ajbozarth Jan 5, 2016
5e27bf0
Merge branch 'master' into spark12149
ajbozarth Jan 5, 2016
96a3899
Merge branch 'master' into spark12149
ajbozarth Jan 6, 2016
5bcc298
Reverted completed to always colored when greater than zero
ajbozarth Jan 6, 2016
70ab748
Addressed comments - added GC Time and a Task Time tool tip
ajbozarth Jan 13, 2016
4294d3f
Replaced totalCores with maxTasks to include spark.task.cpus conf use…
ajbozarth Jan 14, 2016
cfb3589
Merge branch 'master' into spark12149
ajbozarth Jan 14, 2016
07ba26a
Fixed style issue
ajbozarth Jan 15, 2016
b7a1c09
Merge branch 'master' into spark12149
ajbozarth Jan 15, 2016
bbe1133
Merge branch 'master' into spark12149
ajbozarth Jan 15, 2016
05f957a
Added color and GC time to new totals table
ajbozarth Jan 16, 2016
2f54a33
Removed Completed Tasks coloring
ajbozarth Jan 19, 2016
9293c41
Fixed some style issues
ajbozarth Jan 25, 2016
2 changes: 2 additions & 0 deletions core/src/main/scala/org/apache/spark/status/api/v1/api.scala
@@ -55,11 +55,13 @@ class ExecutorSummary private[spark](
val rddBlocks: Int,
val memoryUsed: Long,
val diskUsed: Long,
val totalCores: Int,
val activeTasks: Int,
val failedTasks: Int,
val completedTasks: Int,
val totalTasks: Int,
val totalDuration: Long,
val totalGCTime: Long,
val totalInputBytes: Long,
val totalShuffleRead: Long,
val totalShuffleWrite: Long,
65 changes: 61 additions & 4 deletions core/src/main/scala/org/apache/spark/ui/exec/ExecutorsPage.scala
@@ -33,11 +33,13 @@ private[ui] case class ExecutorSummaryInfo(
rddBlocks: Int,
memoryUsed: Long,
diskUsed: Long,
totalCores: Int,
Contributor:
So the comment for this case class says it isn't used anymore - do we really need to update it?

Member Author:
Did some checks: the MiMa exclude is still needed even without this change, and this change did not affect any code, so I've removed it.

activeTasks: Int,
failedTasks: Int,
completedTasks: Int,
totalTasks: Int,
totalDuration: Long,
totalGCTime: Long,
totalInputBytes: Long,
totalShuffleRead: Long,
totalShuffleWrite: Long,
@@ -117,6 +119,32 @@ private[ui] class ExecutorsPage(
val maximumMemory = info.maxMemory
val memoryUsed = info.memoryUsed
val diskUsed = info.diskUsed

// Determine color opacity, ranging from 0.5 to 1
// activeTasks ranges from 0 up to the total number of cores
val activeTasksAlpha =
if (info.totalCores > 0) {
(info.activeTasks.toDouble / info.totalCores) * 0.5 + 0.5
Contributor:
So this isn't taking into account that a task could use more than one core via the spark.task.cpus configuration.

Member Author:
I didn't know about that; I'm currently testing a change that replaces totalCores with maxTasks (= totalCores / spark.task.cpus) in my implementation. We also have the option of not shading the color on active tasks at all.

Member Author:
Just updated it to look at spark.task.cpus

} else {
1
}
// failedTasks alpha saturates (at 1) once the failure rate reaches 10%
// completedTasks alpha ignores the first 90% of tasks
val (failedTasksAlpha, completedTasksAlpha) =
if (info.totalTasks > 0) {
(math.min(10 * info.failedTasks.toDouble / info.totalTasks, 1) * 0.5 + 0.5,
math.max(((10 * info.completedTasks.toDouble / info.totalTasks) - 9) * 0.5 + 0.5, 0.5))
} else {
(1, 1)
}
// totalDuration alpha scales with GC time, saturating at 1 when GC reaches 50% of the time
val totalDurationAlpha =
if (info.totalDuration > 0) {
math.min(info.totalGCTime.toDouble / info.totalDuration + 0.5, 1)
} else {
1
}
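Pulled out of the page rendering, the four alpha formulas above can be checked in isolation. A minimal sketch for illustration, with the `info.*` fields turned into plain parameters:

```scala
// Sketch of the opacity math above, with the ExecutorSummaryInfo fields
// passed as plain parameters. Each alpha lands in [0.5, 1.0] so that a
// colored cell never becomes fully transparent.
object AlphaMath {
  // Scales linearly from 0.5 (idle) to 1.0 (all cores busy).
  def activeTasksAlpha(activeTasks: Int, totalCores: Int): Double =
    if (totalCores > 0) (activeTasks.toDouble / totalCores) * 0.5 + 0.5 else 1.0

  // Saturates at a 10% failure rate.
  def failedTasksAlpha(failedTasks: Int, totalTasks: Int): Double =
    if (totalTasks > 0) {
      math.min(10 * failedTasks.toDouble / totalTasks, 1) * 0.5 + 0.5
    } else 1.0

  // Stays at the 0.5 floor until more than 90% of tasks have completed.
  def completedTasksAlpha(completedTasks: Int, totalTasks: Int): Double =
    if (totalTasks > 0) {
      math.max((10 * completedTasks.toDouble / totalTasks - 9) * 0.5 + 0.5, 0.5)
    } else 1.0

  // Tracks GC time as a fraction of total duration, saturating at 50% GC.
  def totalDurationAlpha(totalGCTime: Long, totalDuration: Long): Double =
    if (totalDuration > 0) {
      math.min(totalGCTime.toDouble / totalDuration + 0.5, 1)
    } else 1.0
}
```

For example, an executor running tasks on 2 of its 4 cores gets an active-tasks alpha of 0.75, and one spending half its time in GC hits the maximum duration alpha of 1.0.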

<tr>
<td>{info.id}</td>
<td>{info.hostPort}</td>
@@ -128,11 +156,36 @@
<td sorttable_customkey={diskUsed.toString}>
{Utils.bytesToString(diskUsed)}
</td>
<td>{info.activeTasks}</td>
<td>{info.failedTasks}</td>
<td>{info.completedTasks}</td>
<td style={
if (info.activeTasks > 0) {
"background:hsla(120, 100%, 25%, " + activeTasksAlpha + ");color:white"
Member:
Can the style="..." text go inside the conditional so we don't end up with style="" or does that not work in this syntax?

Member Author:
You are correct, it does not work in this syntax; I tried.

} else {
""
}
}>{info.activeTasks}</td>
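On the style="" question above: the attribute name itself indeed cannot move inside the braces, but Scala's XML literals drop an attribute whose embedded expression evaluates to null, so returning null instead of "" would omit the attribute entirely. A sketch, assuming scala-xml's null-attribute behavior and the library on the classpath:

```scala
import scala.xml.Elem

// Sketch: scala-xml omits an attribute whose interpolated value is null,
// so the else branch can return null rather than "" to avoid style="".
def activeTasksCell(activeTasks: Int, alpha: Double): Elem =
  <td style={
    if (activeTasks > 0) {
      "background:hsla(120, 100%, 25%, " + alpha + ");color:white"
    } else {
      null
    }
  }>{activeTasks}</td>
```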
<td style={
if (info.failedTasks > 0) {
"background:hsla(0, 100%, 50%, " + failedTasksAlpha + ");color:white"
Member:
Another nit, but if alpha is in 0.0 - 1.0, would it be clearer to write "1.0" instead of "100%" for HSL?

Member Author:
That is actually how HSLa is designed; it seemed to be an area of contention when it was first defined.

Member:
Oh, so you have to express some as % and some as a fraction? That seems odd. If consistency is at all possible, that'd be clearer, I think.

Member Author:
I agree, but that's just how HSLa is:
http://www.w3.org/TR/css3-color/#hsla-color
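Per the CSS3 spec linked above, hsla() takes hue as a bare number, saturation and lightness as percentages, and alpha as a plain fraction. A hypothetical helper (not part of this patch) that centralizes the inline style strings could look like:

```scala
// Hypothetical helper, not in the actual patch: builds the hsla() style
// strings used inline above. Hue is a bare number, saturation and
// lightness are percentages, and alpha is a fraction, per CSS3.
def taskCellStyle(hue: Int, lightness: Int, alpha: Double): String =
  s"background:hsla($hue, 100%, $lightness%, $alpha);color:white"
```

For instance, taskCellStyle(0, 50, 0.75) reproduces the failed-tasks string at a 0.75 alpha.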

} else {
""
}
}>{info.failedTasks}</td>
<td style={
if (info.completedTasks > 0) {
"background:hsla(240, 100%, 50%, " + completedTasksAlpha + ");color:white"
} else {
""
}
}>{info.completedTasks}</td>
<td>{info.totalTasks}</td>
<td sorttable_customkey={info.totalDuration.toString}>
<td sorttable_customkey={info.totalDuration.toString} style={
// Red if GC time is over 10% of total task time
if (10 * info.totalGCTime > info.totalDuration) {
"background:hsla(0, 100%, 50%, " + totalDurationAlpha + ");color:white"
} else {
""
}
}>
{Utils.msDurationToString(info.totalDuration)}
</td>
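The threshold check above stays in integer arithmetic: for a positive duration, 10 * totalGCTime > totalDuration is equivalent to totalGCTime / totalDuration > 10% without a floating-point division. A minimal sketch of the predicate:

```scala
// For totalDuration > 0, equivalent to (gcTime.toDouble / duration) > 0.1
// but without division; returns false when both values are 0.
def gcHeavy(totalGCTime: Long, totalDuration: Long): Boolean =
  10 * totalGCTime > totalDuration
```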
<td sorttable_customkey={info.totalInputBytes.toString}>
@@ -184,11 +237,13 @@ private[spark] object ExecutorsPage {
val memUsed = status.memUsed
val maxMem = status.maxMem
val diskUsed = status.diskUsed
val totalCores = listener.executorToTotalCores.getOrElse(execId, 0)
val activeTasks = listener.executorToTasksActive.getOrElse(execId, 0)
val failedTasks = listener.executorToTasksFailed.getOrElse(execId, 0)
val completedTasks = listener.executorToTasksComplete.getOrElse(execId, 0)
val totalTasks = activeTasks + failedTasks + completedTasks
val totalDuration = listener.executorToDuration.getOrElse(execId, 0L)
val totalGCTime = listener.executorToJvmGCTime.getOrElse(execId, 0L)
val totalInputBytes = listener.executorToInputBytes.getOrElse(execId, 0L)
val totalShuffleRead = listener.executorToShuffleRead.getOrElse(execId, 0L)
val totalShuffleWrite = listener.executorToShuffleWrite.getOrElse(execId, 0L)
@@ -200,11 +255,13 @@
rddBlocks,
memUsed,
diskUsed,
totalCores,
activeTasks,
failedTasks,
completedTasks,
totalTasks,
totalDuration,
totalGCTime,
totalInputBytes,
totalShuffleRead,
totalShuffleWrite,
@@ -44,10 +44,12 @@ private[ui] class ExecutorsTab(parent: SparkUI) extends SparkUITab(parent, "exec
*/
@DeveloperApi
class ExecutorsListener(storageStatusListener: StorageStatusListener) extends SparkListener {
val executorToTotalCores = HashMap[String, Int]()
val executorToTasksActive = HashMap[String, Int]()
val executorToTasksComplete = HashMap[String, Int]()
val executorToTasksFailed = HashMap[String, Int]()
val executorToDuration = HashMap[String, Long]()
val executorToJvmGCTime = HashMap[String, Long]()
val executorToInputBytes = HashMap[String, Long]()
val executorToInputRecords = HashMap[String, Long]()
val executorToOutputBytes = HashMap[String, Long]()
@@ -62,6 +64,7 @@ class ExecutorsListener(storageStatusListener: StorageStatusListener) extends Sp
override def onExecutorAdded(executorAdded: SparkListenerExecutorAdded): Unit = synchronized {
val eid = executorAdded.executorId
executorToLogUrls(eid) = executorAdded.executorInfo.logUrlMap
executorToTotalCores(eid) = executorAdded.executorInfo.totalCores
Contributor:
So this might have issues on cluster managers other than YARN; see the discussion at #2980.

Member Author:
I looked into the differences between that case and this one, and the values come from different places. Here, the totalCores value is passed as a parameter to ExecutorInfo, which is constructed directly by the Standalone (local) and Mesos backends.

executorIdToData(eid) = ExecutorUIData(executorAdded.time)
}

@@ -131,6 +134,7 @@ class ExecutorsListener(storageStatusListener: StorageStatusListener) extends Sp
executorToShuffleWrite(eid) =
executorToShuffleWrite.getOrElse(eid, 0L) + shuffleWrite.shuffleBytesWritten
}
executorToJvmGCTime(eid) = executorToJvmGCTime.getOrElse(eid, 0L) + metrics.jvmGCTime
}
}
}
@@ -4,11 +4,13 @@
"rddBlocks" : 8,
"memoryUsed" : 28000128,
"diskUsed" : 0,
"totalCores" : 0,
Contributor:
The expected number of cores is 0?

Member Author:
I updated that file according to the javadocs for HistoryServerSuite, the test that uses it. I would assume it's 0 since it's the driver, and the driver has no cores assigned to run tasks.

"activeTasks" : 0,
"failedTasks" : 1,
"completedTasks" : 31,
"totalTasks" : 32,
"totalDuration" : 8820,
"totalGCTime" : 352,
"totalInputBytes" : 28000288,
"totalShuffleRead" : 0,
"totalShuffleWrite" : 13180,
3 changes: 3 additions & 0 deletions project/MimaExcludes.scala
@@ -161,6 +161,9 @@ object MimaExcludes {
// SPARK-3580 Add getNumPartitions method to JavaRDD
ProblemFilters.exclude[MissingMethodProblem](
"org.apache.spark.api.java.JavaRDDLike.getNumPartitions")
) ++ Seq(
// SPARK-12149 Added new fields to ExecutorSummary
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.status.api.v1.ExecutorSummary.this")
) ++
// SPARK-11314: YARN backend moved to yarn sub-module and MiMA complains even though it's a
// private class.
Expand Down