[SPARK-41246][core] Solve the problem of RddId negative #38781
```diff
@@ -2570,10 +2570,23 @@ class SparkContext(config: SparkConf) extends Logging {

   private[spark] def newShuffleId(): Int = nextShuffleId.getAndIncrement()

-  private val nextRddId = new AtomicInteger(0)
+  private var nextRddId = new AtomicInteger(0)

   /** Register a new RDD, returning its RDD ID */
-  private[spark] def newRddId(): Int = nextRddId.getAndIncrement()
+  private[spark] def newRddId(): Int = {
+    var id = nextRddId.getAndIncrement()
+    if (id >= 0) {
+      return id
+    }
+    this.synchronized {
+      id = nextRddId.getAndIncrement()
+      if (id < 0) {
+        nextRddId = new AtomicInteger(0)
+        id = nextRddId.getAndIncrement()
+      }
+    }
+    id
+  }

   /**
    * Registers listeners specified in spark.extraListeners, then starts the listener bus.
```

**Contributor** (on the `this.synchronized` block): Can we avoid the duplicate call of `nextRddId.getAndIncrement()`?
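The review comments above suggest the same simplification: the counter can stay lock-free if the wraparound happens inside a single atomic update instead of a reset-under-lock. Below is a hypothetical sketch (not the PR's code, and the class name `RddIdGenerator` is made up for illustration) using `AtomicInteger.getAndUpdate`, the same JVM class the Scala code uses. It keeps the field `final` (no `var`), needs no `synchronized` block, and calls the atomic only once per ID:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch: keep the counter a plain final field and perform the
// wraparound atomically inside getAndUpdate, so no 'var', no synchronized
// block, and no duplicate getAndIncrement call.
public class RddIdGenerator {
    private final AtomicInteger nextRddId;

    public RddIdGenerator() {
        this(0);
    }

    // The start value is exposed only so the overflow path is easy to exercise.
    public RddIdGenerator(int start) {
        nextRddId = new AtomicInteger(start);
    }

    /** Returns the next ID; wraps from Integer.MAX_VALUE back to 0, never going negative. */
    public int newRddId() {
        // getAndUpdate applies the function atomically and returns the previous
        // value, matching getAndIncrement's return-then-advance contract.
        return nextRddId.getAndUpdate(i -> i == Integer.MAX_VALUE ? 0 : i + 1);
    }

    public static void main(String[] args) {
        RddIdGenerator fresh = new RddIdGenerator();
        System.out.println(fresh.newRddId()); // 0
        System.out.println(fresh.newRddId()); // 1

        RddIdGenerator nearOverflow = new RddIdGenerator(Integer.MAX_VALUE);
        System.out.println(nearOverflow.newRddId()); // 2147483647 (still non-negative)
        System.out.println(nearOverflow.newRddId()); // 0 (wrapped instead of going negative)
    }
}
```

Note that wrapping reuses old IDs, just as the PR's reset does; this sketch only removes the lock and the second increment, it does not change that trade-off.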
**Reviewer:** What happens when the RDD ID overflows within a SparkContext? Are there tests that cover these cases? Curious if it would be better to switch to an `AtomicLong` and just modulo max int?
**Reply:**

> What happens when the RDD ID overflows within a SparkContext?

When `BlockManager` generates a `BlockId`, `BlockId` only supports a positive rddId, so `BlockId` generation fails.

> Switch to an `AtomicLong`?

The scope of influence is very broad, which may require extensive discussion.
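For reference, the `AtomicLong` idea from this exchange can be sketched as follows (a hypothetical illustration, not the PR's code; the class name `LongBackedRddIds` is made up). A 64-bit counter will not overflow at any realistic RDD-creation rate, and the modulo keeps the returned `int` in `[0, Integer.MAX_VALUE)`:

```java
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical sketch of the AtomicLong-with-modulo suggestion: the long
// counter effectively never overflows, and the remainder is always a
// non-negative int.
public class LongBackedRddIds {
    private final AtomicLong nextRddId = new AtomicLong(0);

    public int newRddId() {
        // Safe narrowing cast: for a non-negative counter the remainder is
        // always strictly less than Integer.MAX_VALUE.
        return (int) (nextRddId.getAndIncrement() % Integer.MAX_VALUE);
    }

    public static void main(String[] args) {
        LongBackedRddIds gen = new LongBackedRddIds();
        System.out.println(gen.newRddId()); // 0
        System.out.println(gen.newRddId()); // 1
    }
}
```

The drawback the reply alludes to: `newRddId()` would change type-compatible behavior across every caller that assumes IDs never repeat, which is why the author considers the change too broad for this fix.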
**Reviewer:** You don't need to make this a `var`; see below.