I do realtime learning with libtorch where the gradient is calculated elsewhere, and I would like to translate that to ExecuTorch: https://github.com/berndporr/dnf_torch. That project is realtime noise cancellation, but I also have other applications (for example RL) where the gradient comes from further afield, so it is really not associated with a simple subtraction operation. In libtorch I can directly inject the gradient and then run the backward pass: https://github.com/berndporr/dnf_torch/blob/main/dnf_torch.cpp#L99
However, in ExecuTorch forward and backward run in one go. How do I go about this?
Looking at the XOR example, could I inject the gradient as an additional input? Roughly:
...and then lower that and pass the gradient in as the first input. Would that be the way to go?
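For illustration, here is a minimal PyTorch sketch of that idea (the `GradInjectNet` wrapper and the surrogate loss are my own assumptions, not an ExecuTorch API). Treating the injected gradient as a constant, the autograd gradient of `sum(y * grad_in)` with respect to the parameters equals the injected gradient back-propagated through the model, so the wrapper's forward/backward could in principle be exported together the same way the XOR training example is:

```python
import torch
import torch.nn as nn

class GradInjectNet(nn.Module):
    """Hypothetical wrapper: the externally computed gradient arrives
    as a normal forward input instead of via y.backward(gradient=...).

    forward() returns a scalar surrogate loss whose gradient w.r.t. the
    parameters is grad_in^T @ d(model(x))/d(params), i.e. exactly the
    injected gradient pushed through the model (chain rule).
    """
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, grad_in, x):
        y = self.model(x)
        # detach() treats grad_in as a constant, so differentiating the
        # dot product reproduces the injected-gradient backward pass.
        return (y * grad_in.detach()).sum()

# Sanity check against the direct libtorch-style injection.
torch.manual_seed(0)
model = nn.Linear(3, 2)
x = torch.randn(4, 3)
g = torch.randn(4, 2)

# Reference: inject the gradient directly into backward().
model.zero_grad()
model(x).backward(gradient=g)
ref = [p.grad.clone() for p in model.parameters()]

# Wrapped: the gradient is just another forward input.
wrapped = GradInjectNet(model)
model.zero_grad()
wrapped(g, x).backward()
for p, r in zip(model.parameters(), ref):
    assert torch.allclose(p.grad, r)
```

Since the surrogate loss is an ordinary graph from inputs to a scalar, the joint forward/backward export should be able to trace it with `grad_in` as a regular input; whether the lowering passes accept this shape is something I have not verified.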