Question about Lang attn in RLA module? #19

Open
nero1342 opened this issue Nov 27, 2023 · 0 comments

Hi,
I have a question related to RLA module.

```python
lang_feat_att = self.lang_proj(lang_feat_att)
lang_feat_att = self.RLA_lang_att(output, lang_feat_att.permute(1, 0, 2)) * F.sigmoid(self.lang_weight)
output = output + lang_feat_att * self.rla_weight
```

It seems that RLA_lang_att does not contribute much: I tried removing these lines of code and the results stayed the same.
Moreover, since self.rla_weight = 0.1 and the module is only applied in the first layer, lang_feat_att may barely affect the output. However, the paper reports that it improves performance by ~1%. Is there a mistake somewhere, or have I misunderstood something?
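
For reference, here is a minimal, self-contained sketch of how I read the residual update above. It assumes RLA_lang_att behaves like standard cross-attention (query = decoder output, key/value = projected language tokens), that output is sequence-first with shape (L, B, C), and that lang_weight is a learnable scalar gate initialized at 0; the shapes, the nn.MultiheadAttention stand-in, and the initial values are my assumptions, not confirmed by the repo — the sketch is only meant to show how small the injected term becomes with rla_weight = 0.1.

```python
import torch
import torch.nn as nn

# Hypothetical shapes: output is (L, B, C) (sequence-first, DETR-style),
# lang_feat_att is (B, N_words, C) before the permute in the snippet above.
L, B, C, N_words = 100, 2, 256, 20
output = torch.randn(L, B, C)
lang_feat_att = torch.randn(B, N_words, C)

# Stand-ins for the module's components (names taken from the snippet, values assumed).
lang_proj = nn.Linear(C, C)
RLA_lang_att = nn.MultiheadAttention(C, num_heads=8)  # assumed cross-attention
lang_weight = nn.Parameter(torch.tensor(0.0))         # learnable scalar gate (init assumed)
rla_weight = 0.1                                       # fixed scale mentioned above

lang = lang_proj(lang_feat_att)            # (B, N_words, C)
lang = lang.permute(1, 0, 2)               # (N_words, B, C)
att, _ = RLA_lang_att(output, lang, lang)  # query = output, key = value = language tokens
att = att * torch.sigmoid(lang_weight)     # gate in (0, 1); 0.5 at this assumed init
output = output + att * rla_weight         # residual scaled by 0.1

# Rough magnitude of the injected language term relative to the visual features.
print((att * rla_weight).norm() / output.norm())
```

With these assumed values the language term is scaled by roughly sigmoid(lang_weight) * 0.1 ≈ 0.05 before the residual add, which is why I would not expect removing it to change the metrics much. I mainly want to confirm whether this matches the configuration that produced the ~1% gain reported in the paper.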
