# Work items: Contributions Welcome #4423
Another work item: we should improve the separation between interface and implementation in ONNX, which would give us more agility when implementation details need to change. Currently it is often unclear whether changing something will break an external user, because there is no clear distinction between interface files and implementation files, and too many details are exposed in include files (which may or may not be external interfaces).
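To make the intent concrete, here is a hypothetical sketch of the kind of interface/implementation split being proposed (this is not actual ONNX code; the file and function names are invented for illustration): a public header exposes only the stable declaration, while everything else stays in the `.cc` file.

```cpp
// string_utils.h -- hypothetical public interface header (names invented).
// External users may include this; only the declaration is exposed, so the
// implementation can change freely without breaking them.
#pragma once
#include <string>

namespace onnx_sketch {
std::string NormalizeOpName(const std::string& name);
}  // namespace onnx_sketch
```

```cpp
// string_utils.cc -- implementation detail, not part of the interface.
#include "string_utils.h"
#include <algorithm>
#include <cctype>

namespace onnx_sketch {
std::string NormalizeOpName(const std::string& name) {
  std::string out = name;
  std::transform(out.begin(), out.end(), out.begin(),
                 [](unsigned char c) { return std::tolower(c); });
  return out;
}
}  // namespace onnx_sketch
```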
Contributions are also welcome for the unassigned to-do items among the ONNX triaged work items.
### Description

These changes add support for the GELU operator as a function op.

### Motivation and Context

Adds support for the [GELU (Gaussian Error Linear Unit)](https://paperswithcode.com/method/gelu) activation function, which was requested in #4933. #4423 also lists it under the "New Ops" section of `Contributions Welcome`.

As discussed in #4933, GELU is implemented as a context-dependent function op whose `approximate` attribute selects one of two possible function-body definitions.

The first is the exact GELU:

`GELU(x) = x * Φ(x) = 0.5 * x * (1 + erf(x / sqrt(2)))`

The second is the fast approximation based on `tanh`:

`GELU(x) = 0.5 * x * (1 + tanh(sqrt(2/π) * (x + 0.044715 * x^3)))`

This implementation uses the [PyTorch docs for GELU](https://pytorch.org/docs/stable/generated/torch.nn.GELU.html?highlight=gelu#torch.nn.GELU) as a reference.

PS: I also refactored `onnx/defs/math/defs.cc` to move the operator implementation of `mish` next to its doc string.

---

Signed-off-by: pranshupant <pranshupant@gmail.com>
Co-authored-by: G. Ramalingam <grama@microsoft.com>
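For reference, here is a minimal standalone sketch of the two formulas above (plain C++ using `std::erf` and `std::tanh`; this is not the ONNX function body itself), useful for comparing the exact and approximate forms:

```cpp
#include <cmath>
#include <cstdio>

// Exact GELU: x * Phi(x) = 0.5 * x * (1 + erf(x / sqrt(2)))
double gelu_exact(double x) {
  return 0.5 * x * (1.0 + std::erf(x / std::sqrt(2.0)));
}

// Tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
double gelu_tanh(double x) {
  const double pi = 3.14159265358979323846;
  const double k = std::sqrt(2.0 / pi);
  return 0.5 * x * (1.0 + std::tanh(k * (x + 0.044715 * x * x * x)));
}

int main() {
  // The two forms agree closely but not exactly; the gap grows away from 0.
  for (double x = -3.0; x <= 3.0; x += 1.0)
    std::printf("x = %+.1f  exact = %+.8f  tanh = %+.8f\n",
                x, gelu_exact(x), gelu_tanh(x));
  return 0;
}
```

Following the PyTorch convention the PR cites, `gelu_exact` corresponds to the default `approximate` setting and `gelu_tanh` to the `tanh` setting.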
> **Note**
> New: use this query to find all issues with the "contributions welcome" tag.
Creating this pinned issue to track work items, especially those where contributions would be welcome.
Features:

- Improving function-definition infrastructure: Issue 3139
- Reference implementation of ops: see Issue 4432
- Add support for model-local functions in the version-converter (Issue 4370)
- An ONNX sanitizer tool: Issue 4476
- Shape inference testing infrastructure: see Issue 4160
Cleanup:

Minor:

- Handle Constant op variants in shape-inference: see Issue 3985 and PR 4194

Documentation:

New Ops: