Fix "same" padding torch issue #20270
base: master
Conversation
Codecov Report
Additional details and impacted files:
@@            Coverage Diff             @@
##           master    #20270      +/-   ##
==========================================
+ Coverage   66.34%    75.78%    +9.44%
==========================================
  Files         508       527       +19
  Lines       48264     60532    +12268
  Branches     8882     12130     +3248
==========================================
+ Hits        32020     45874    +13854
+ Misses      14425     12609     -1816
- Partials     1819      2049      +230
Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
Thanks, Sachin. It looks like the relevant torch tests are failing; can you take a look? https://github.com/keras-team/keras/actions/runs/10931265787/job/30345878233?pr=20270
@sachinprasadhs Can you please take a look at the torch test failure?
This fixes the issue for all torch layers used with padding="same" and data_format="channels_first".
It was not caught by any existing test case because it only occurs under that combination, and only when the padding returned by
_compute_padding_length
is something like ((0, 0), (1, 1)).
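For reference, here is a minimal sketch of the kind of configuration that hits this path. The kernel size (1, 3) is an assumed example, chosen only so that the per-dimension padding comes out asymmetric across dimensions (e.g. ((0, 0), (1, 1))); it is not taken from the original report.

```python
# Illustrative sketch (not part of the fix): a torch-backend Conv2D with
# padding="same" and data_format="channels_first".
import os
os.environ["KERAS_BACKEND"] = "torch"  # select the torch backend before importing keras

import numpy as np
import keras

# kernel_size=(1, 3) is an assumed example: the first spatial dim needs no
# padding while the second needs one element on each side, i.e. ((0, 0), (1, 1)).
layer = keras.layers.Conv2D(
    filters=4,
    kernel_size=(1, 3),
    padding="same",
    data_format="channels_first",
)

x = np.random.rand(1, 3, 5, 5).astype("float32")  # (batch, channels, height, width)
y = layer(x)

# With padding="same" and the default stride of 1, spatial dimensions should
# be preserved: expected output shape (1, 4, 5, 5).
print(y.shape)
```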
Fixes: #20235