Support output_padding in XNNPACK transposed convolution (#18185)#18185

Merged
GregoryComer merged 1 commit into pytorch:main from GregoryComer:export-D96603677
Mar 16, 2026

Conversation

@GregoryComer (Member) commented Mar 14, 2026

Summary:

XNNPACK added support for output padding in transposed convolutions at some point. This change wires it up through ExecuTorch (ET).

Differential Revision: D96603677
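For context on what `output_padding` does (a sketch, not ET/XNNPACK code): a strided transposed convolution has an output-size ambiguity, since several input sizes of the forward convolution map to the same output size, and `output_padding` selects among them. The per-dimension formula below matches the one documented for `torch.nn.ConvTranspose2d`:

```python
def conv_transpose_out_size(in_size, kernel, stride=1, padding=0,
                            output_padding=0, dilation=1):
    # Per-dimension output size of a transposed convolution:
    #   out = (in - 1)*stride - 2*padding + dilation*(kernel - 1)
    #         + output_padding + 1
    return ((in_size - 1) * stride - 2 * padding
            + dilation * (kernel - 1) + output_padding + 1)

# With stride 2, output_padding disambiguates between two candidate sizes:
print(conv_transpose_out_size(5, kernel=3, stride=2, padding=1))                    # 9
print(conv_transpose_out_size(5, kernel=3, stride=2, padding=1, output_padding=1))  # 10
```

Backends that lower transposed convolutions need the `output_padding` value to reproduce exactly this shape, which is why it has to be plumbed through the delegate rather than dropped.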

pytorch-bot bot commented Mar 14, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/18185

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 8e61e41 with merge base 5545395:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla bot added the "CLA Signed" label (authors must sign the CLA before a PR can be reviewed) Mar 14, 2026
meta-codesync bot (Contributor) commented Mar 14, 2026

@GregoryComer has exported this pull request. If you are a Meta employee, you can view the originating Diff in D96603677.

@GregoryComer GregoryComer marked this pull request as draft March 14, 2026 21:34
@GregoryComer added the "module: xnnpack" (issues related to xnnpack delegation and the code under backends/xnnpack/) and "release notes: xnnpack" (changes to the XNNPACK backend delegate) labels Mar 14, 2026
@meta-codesync meta-codesync bot changed the title Support output_padding in XNNPACK transposed convolution Support output_padding in XNNPACK transposed convolution (#18185) Mar 15, 2026
GregoryComer added a commit to GregoryComer/executorch that referenced this pull request Mar 15, 2026
Summary:

XNNPACK got support for transposed convolutions with output padding at some point. Wire it up through ET.

Differential Revision: D96603677
meta-codesync bot (Contributor) commented Mar 15, 2026

@GregoryComer has imported this pull request. If you are a Meta employee, you can view this in D96603677.

@GregoryComer GregoryComer marked this pull request as ready for review March 15, 2026 21:12
@SS-JIA (Contributor) left a comment:

Changes LGTM!

@GregoryComer GregoryComer merged commit 69ae8c5 into pytorch:main Mar 16, 2026
162 checks passed
