
I think there might be some problems in your padding_same_conv.py #1

Open
Gasoonjia opened this issue Jun 22, 2018 · 1 comment

Gasoonjia commented Jun 22, 2018

Your padding_same_conv.py is a brilliant implementation for PyTorch users who need padding=same. It has really helped me, and I'm very grateful for your work.

However, I think there might be some mistakes in the implementation, especially in the conv2d_same_padding function.
You only calculate the amount of padding for the columns of the input and then copy it to the rows, which gives the wrong result when stride[0] != stride[1] or dilation[0] != dilation[1].
Also, judging from the code that follows, the default values of stride and dilation should be lists (or tuples) rather than the ints they are currently set to.
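
For illustration, here is a minimal sketch of how "same" padding could be computed independently for each spatial dimension instead of reusing the column padding for the rows. The function name mirrors conv2d_same_padding, but the signature (stride and dilation as 2-tuples) is an assumption, not the repository's actual code:

import math
import torch.nn.functional as F

def conv2d_same_padding(input, weight, bias=None, stride=(1, 1), dilation=(1, 1), groups=1):
    # Compute TensorFlow-style "same" padding separately for height (dim 2)
    # and width (dim 3) of an (N, C, H, W) input.
    pad = []
    for dim in (2, 3):
        in_size = input.size(dim)
        k = weight.size(dim)            # kernel size along this dim
        s = stride[dim - 2]
        d = dilation[dim - 2]
        out_size = math.ceil(in_size / s)
        effective_k = (k - 1) * d + 1   # kernel extent after dilation
        pad.append(max(0, (out_size - 1) * s + effective_k - in_size))
    pad_h, pad_w = pad
    # F.pad takes (left, right, top, bottom); put the extra element of any
    # odd padding on the right/bottom, as TensorFlow does.
    input = F.pad(input, [pad_w // 2, pad_w - pad_w // 2,
                          pad_h // 2, pad_h - pad_h // 2])
    return F.conv2d(input, weight, bias, stride,
                    padding=0, dilation=dilation, groups=groups)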

@xigua314

Something is wrong in your code!

import torch as t
import torch.nn as nn
from padding_same_conv import Conv2d  # the "same"-padding layer from this repo

x = t.randn(2, 128, 100, 1)
nc2 = Conv2d(128, 128, (2, 1))     # your code
nc4 = nn.Conv2d(128, 128, (2, 1))  # plain PyTorch Conv2d for comparison
nc2_re = nc2(x)
nc4_re = nc4(x)
print(nc2_re.size())  # (2, 128, 100, 2) -- "same" padding should give (2, 128, 100, 1)
print(nc4_re.size())  # (2, 128, 99, 1)
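
For reference, PyTorch 1.9 and later accept padding='same' directly in nn.Conv2d for stride-1 convolutions; run on the same x as above, it shows the size a correct "same" implementation should produce:

nc5 = nn.Conv2d(128, 128, (2, 1), padding='same')
print(nc5(x).size())  # torch.Size([2, 128, 100, 1])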
