[Bugfix] Use .requires_grad instead of .require_grad in embedding module #743

Open
wants to merge 1 commit into
base: main
Conversation

anasashb
This pull request fixes a bug/typo in layers/Embed.py on lines 13 and 50.

Torch tensors do not have an attribute called require_grad, so the pe.require_grad = False and w.require_grad = False calls simply attached a new require_grad attribute to the instantiated tensors and had no effect on autograd (see the sketch below).
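A minimal sketch of the silent failure, assuming a plain tensor created with torch.zeros; the variable name mirrors the PR description, and the size is illustrative rather than taken from layers/Embed.py:

```python
import torch

pe = torch.zeros(5000, 512)

# Misspelled attribute: this just attaches a new Python attribute to the
# tensor object; autograd never looks at it.
pe.require_grad = False

print(hasattr(pe, "require_grad"))  # True  -- the stray attribute now exists
print(pe.requires_grad)             # False -- but only because torch.zeros
                                    #          defaults to requires_grad=False
```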

The correct attribute for gradient tracking in Torch is called requires_grad, and this PR changes both lines (13 and 50) accordingly.

The existing bug/typo is not expected to have affected any code behavior, since a tensor created with torch.zeros comes with requires_grad set to False by default. It is still worth fixing for clarity and to prevent future confusion. The corrected pattern is sketched below.
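A minimal sketch of the corrected pattern, assuming the buffers are built with torch.zeros as described above; the sizes are illustrative and not taken from layers/Embed.py:

```python
import torch

d_model, max_len, c_in = 512, 5000, 32  # illustrative sizes only

# Positional encoding buffer (corresponds to line 13 of layers/Embed.py)
pe = torch.zeros(max_len, d_model).float()
pe.requires_grad = False  # correctly spelled; redundant with the default, but explicit

# Fixed embedding weight (corresponds to line 50 of layers/Embed.py)
w = torch.zeros(c_in, d_model).float()
w.requires_grad = False
```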
