Transformation matrix is shared among the set of capsule types #21
Comments
Hello @cheneeheng,

The transformation matrices are shared across child capsule types but are unique to each parent capsule type. Each incoming capsule type therefore has the same set of transformations applied to it (one per parent capsule type). In the original paper no such sharing is described; from Sabour et al.: "In convolutional capsule layers, each capsule outputs a local grid of vectors to each type of capsule in the layer above using different transformation matrices for each member of the grid as well as for each type of capsule."

I hope this clears things up, and thank you for the kind words.
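The sharing described above can be sketched in plain Python. The sizes below are toy values, not the SegCaps defaults, and `transform` is a hypothetical stand-in for the conv-based matrix multiply: there is one matrix per parent type, and every child type reuses the same set.

```python
import random

random.seed(0)
num_child, num_parent, child_dim, parent_dim = 3, 2, 4, 2

# One transformation matrix per PARENT type; every CHILD type reuses them.
W = [[[random.gauss(0, 1) for _ in range(parent_dim)]
      for _ in range(child_dim)]
     for _ in range(num_parent)]

def transform(u):
    """Predict one parent vector per parent type from a child vector u."""
    return [[sum(u[i] * W[p][i][d] for i in range(child_dim))
             for d in range(parent_dim)]
            for p in range(num_parent)]

children = [[random.gauss(0, 1) for _ in range(child_dim)]
            for _ in range(num_child)]
preds = [transform(u) for u in children]  # num_child x num_parent x parent_dim

# Feeding a basis vector recovers row 0 of each parent matrix, identically
# for whichever child type it came from: the matrices are shared.
e0 = [1.0, 0.0, 0.0, 0.0]
assert transform(e0) == [[W[p][0][d] for d in range(parent_dim)]
                         for p in range(num_parent)]
```

Under Sabour et al.'s scheme, `W` would instead be indexed by both child and parent type (and grid position), i.e. a distinct matrix per (child, parent) pair.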
When I compare your implementation of the transformation matrices: Sabour has also added the
I have a question regarding this part :
SegCaps/capsule_layers.py
Lines 128 to 139 in c6b3f9e
So you reshaped the 5-D input tensor into 4-D and used a normal conv2d to perform the dimension transformation.
During the reshape you merged "batch_size" and "input_num_capsule" together, which means the same conv weights are used for every input capsule type.
However, in the paper (Section 3.1, contribution 2.(ii)) you state that the transformation matrices are shared within a capsule type. Do correct me if I am wrong.
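The shape bookkeeping behind that reshape-then-conv2d trick can be written out with illustrative sizes (hypothetical values, not the actual SegCaps hyperparameters):

```python
# Illustrative sizes only -- not the SegCaps defaults.
batch, height, width = 8, 32, 32
num_child, child_dim = 4, 16     # input capsule types / atoms per capsule
num_parent, parent_dim = 6, 16   # output capsule types / atoms per capsule

x_5d = (batch, height, width, num_child, child_dim)

# Merging batch_size with input_num_capsule turns the 5-D tensor into a
# 4-D one that an ordinary conv2d can consume:
x_4d = (batch * num_child, height, width, child_dim)

# One conv kernel produces all parent types at once. Because num_child was
# folded into the batch axis, the SAME kernel weights are applied to every
# child capsule type (unique per parent type, shared across child types).
kernel = (5, 5, child_dim, num_parent * parent_dim)
y_4d = (x_4d[0], height, width, kernel[3])
assert y_4d == (32, 32, 32, 96)
```

Per-child-type matrices as in Sabour et al. would instead require a kernel (or grouped conv) indexed by the child type, e.g. an output depth of `num_child * num_parent * parent_dim` with each child consuming only its own slice.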
Thanks and good work btw.
Chen.