The sequence of operations needs to be reversed on line 256 of network.py. According to the paper, "Downsampling is performed via average pooling. Then, after applying ReLU activation function to the output tensor, we perform sum-pooling over spatial dimensions."
However, in the code, sum-pooling is applied first, followed by ReLU.
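A small plain-Python sketch (hypothetical values, no PyTorch) of why the two orders are not interchangeable when sum-pooling is involved — a negative activation is clipped before the sum in one order but cancels positive activations in the other:

```python
def relu(xs):
    """Element-wise ReLU over a list of activations."""
    return [max(0.0, x) for x in xs]

# Hypothetical activations at the spatial positions of one channel
spatial = [3.0, -5.0, 2.0]

# Paper's quoted order: ReLU first, then global sum-pooling
relu_then_sum = sum(relu(spatial))      # 3.0 + 0.0 + 2.0 = 5.0

# Code's order: global sum-pooling first, then ReLU
sum_then_relu = max(0.0, sum(spatial))  # ReLU(0.0) = 0.0

print(relu_then_sum, sum_then_relu)     # the two orders disagree
```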
@RohitSaha Also note that Sec. 3.4 of the paper says, "To obtain the vectorized outputs in both networks, we perform global sum pooling over spatial dimensions followed by ReLU." The two statements in the paper seem to conflict.
Another question for me: in the code, the pooling is implemented as self.pooling = nn.AdaptiveMaxPool2d((1, 1)). Is this the same as the "sum-pooling" described in the paper?
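For what it's worth, nn.AdaptiveMaxPool2d((1, 1)) computes a global max over the spatial dimensions, not a sum; a sum-pooling equivalent would be something like x.sum(dim=(2, 3)). One related observation (plain-Python sketch, hypothetical values): because max and ReLU are both monotone, max-pooling commutes with ReLU, so with the code's max-pool the ordering ambiguity above would be harmless, whereas with true sum-pooling it is not:

```python
def relu(xs):
    """Element-wise ReLU over a list of activations."""
    return [max(0.0, x) for x in xs]

# Hypothetical activations at the spatial positions of one channel
spatial = [3.0, -5.0, 2.0]

# Max-pooling: both orders agree (max and ReLU are monotone)
max_then_relu = max(0.0, max(spatial))  # 3.0
relu_then_max = max(relu(spatial))      # 3.0

# Sum-pooling: the orders disagree, so the sequence actually matters
sum_then_relu = max(0.0, sum(spatial))  # 0.0
relu_then_sum = sum(relu(spatial))      # 5.0
```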