Remove redundant Set operation from broadcast when no broadcasting occurs #5507
```diff
@@ -667,6 +667,50 @@ def fusion_func_3(fd: FusionDefinition):
         )
         self.assertEqual(eager_out, nvf_out[0])

+    def test_broadcast_in_dim_no_redundant_set(self):
+        """
+        Test that broadcast_in_dim doesn't introduce redundant Set operations
+        when all input dimensions are in broadcast_dims (i.e., no actual broadcast).
+
+        This verifies the fix for the issue where broadcast_in_dim would create
+        a redundant float-to-float cast operation via Set when the input already
+        had the correct shape.
+        """
+        inputs = [
+            torch.ones(1, 4, device="cuda"),
+            torch.randn(2, 4, device="cuda"),
+        ]
+
+        def fusion_with_broadcast_in_dim(fd: FusionDefinition):
+            t0 = fd.define_tensor(shape=[1, -1], contiguity=[None, True])
+            t1 = fd.define_tensor(shape=[-1, -1], contiguity=[True, True])
+            # broadcast_in_dim with broadcast_dims=[0, 1] means no new dims are added
```

Suggested change:

```diff
-            # broadcast_in_dim with broadcast_dims=[0, 1] means no new dims are added
+            # broadcast_in_dim with all input dims in broadcast_dims means no broadcasting operation occurs
```
The suggested comment is not correct: a broadcasting operation does occur here, and it involves expanding the 1-sized dimension.
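The reviewer's distinction can be illustrated with a small sketch of `broadcast_in_dim` semantics in pure Python (the helper name and return convention are hypothetical, not part of nvFuser's API): `broadcast_dims[i]` maps input axis `i` to an output axis, so axes of the output not listed in `broadcast_dims` are newly inserted broadcast dims, while a 1-sized input axis mapped to a larger output extent still requires an expand even when no new axis is added.

```python
def dims_needing_work(input_shape, output_shape, broadcast_dims):
    """Hypothetical helper mirroring broadcast_in_dim semantics.

    Returns (new_axes, expanded_axes):
      - new_axes: output axes not present in the input (true broadcasts)
      - expanded_axes: output axes where a 1-sized input dim is expanded
    """
    new_axes = [d for d in range(len(output_shape)) if d not in broadcast_dims]
    expanded_axes = [
        broadcast_dims[i]
        for i, s in enumerate(input_shape)
        if s == 1 and output_shape[broadcast_dims[i]] != 1
    ]
    return new_axes, expanded_axes

# [1, 4] -> [2, 4] with broadcast_dims=[0, 1]: no new axes are inserted,
# but axis 0 is still expanded from extent 1 to 2 (the reviewer's point).
print(dims_needing_work([1, 4], [2, 4], [0, 1]))   # ([], [0])

# [4] -> [2, 4] with broadcast_dims=[1]: axis 0 is a genuinely new broadcast dim.
print(dims_needing_work([4], [2, 4], [1]))         # ([0], [])
```

This is why "all input dims are in broadcast_dims" does not imply "no broadcasting occurs": the first case adds no dimensions yet still expands one.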
These checks (and inputs) are probably not necessary since the point of the test is just that the IR should match exactly whether we use broadcast_in_dim or expand whenever there is no new broadcast.
The change seems reasonable to me.