Rename autodiff L_op and R_op methods to pull_back and push_forward #1983
ricardoV94 wants to merge 4 commits into pymc-devs:v3
Conversation
Force-pushed from 680c62d to dfacf05
Force-pushed from 08e6493 to a7b636a
Did you ever see this one @jessegrabowski?

No, but it looks like an overly strict check somewhere? Like it's doing …

Maybe a new-version-of-scipy thing?

I haven't seen it fail in other CI. Does it have random numbers in it?
Force-pushed from a7200ac to 2409f4a
You are sometimes adding/renaming a fourth argument with different names:

If this is otherwise good to go, then I would merge and gladly take your cleanup after.

I'm 0/2 tonight on commenting before I read the code.
```python
def all_equal(x, y):
    return (
        x.shape == y.shape
        and len(x.data) == len(y.data)
```
split this into a separate PR?
pytensor/graph/op.py (Outdated)

```
Parameters
----------
inputs
```
pytensor/graph/op.py (Outdated)

```
Returns
-------
list of Variable
```
nit: always name the return variable in the docstring even if the variable is never actually created
pytensor/graph/op.py (Outdated)

```
References
----------
.. [1] Giles, Mike. 2008. "An Extended Collection of Matrix Derivative
```
Is this really the best reference for explaining what a VJP is? This is an autodiff cookbook paper.
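For readers landing here: `pull_back` computes a vector-Jacobian product (VJP) and `push_forward` a Jacobian-vector product (JVP). A minimal NumPy sketch of the two operations (the function and all names are illustrative, not the PyTensor API):

```python
import numpy as np

def f(x):
    # Example function R^2 -> R^2
    return np.array([x[0] * x[1], x[0] + x[1]])

def f_jacobian(x):
    # Jacobian of f evaluated at x
    return np.array([[x[1], x[0]],
                     [1.0, 1.0]])

def pull_back(x, output_cotangent):
    # VJP: v^T J -- propagates output sensitivities back to the inputs
    return output_cotangent @ f_jacobian(x)

def push_forward(x, input_tangent):
    # JVP: J v -- propagates input perturbations forward to the outputs
    return f_jacobian(x) @ input_tangent

x = np.array([2.0, 3.0])
print(pull_back(x, np.array([1.0, 0.0])))     # a row of J: [3. 2.]
print(push_forward(x, np.array([1.0, 0.0])))  # a column of J: [3. 1.]
```

The pull-back is the workhorse of reverse-mode autodiff (gradients of scalar losses); the push-forward is forward mode.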
```
-------
list of Variable
    The tangent vectors w.r.t. each output. One `Variable` per output.
```
poor pushforward doesn't even get a reference :(
```
`R_op` method, `push_forward` must never use ``None`` to indicate
disconnected outputs.
```

```
Parameters
```
same type/naming complaints
```python
try:
    y = x.type.filter_variable(y)
except TypeError:
    # This is a hack
```
are we blessing this hack now?
```diff
 )
 warnings.warn(
-    "grad_overrides is deprecated in favor of lop_overrides. Using it will lead to an error in the future.",
+    "grad_overrides is deprecated in favor of pull_back_overrides.",
```
This warning has already been here for a while right? We could just remove this one.
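As context for the deprecation discussion above, the usual Python pattern for a renamed keyword looks roughly like this (`make_op` and the specific keyword handling are illustrative, not the actual builders.py code):

```python
import warnings

def make_op(pull_back_overrides=None, lop_overrides=None):
    # Accept the old keyword for now, but steer users to the new name.
    if lop_overrides is not None:
        warnings.warn(
            "lop_overrides is deprecated in favor of pull_back_overrides.",
            FutureWarning,
            stacklevel=2,  # point the warning at the caller, not this frame
        )
        if pull_back_overrides is None:
            pull_back_overrides = lop_overrides
    return pull_back_overrides
```

Using `FutureWarning` (or `DeprecationWarning`) with `stacklevel=2` makes the warning show the user's call site; after a release or two the old keyword can be removed outright, as suggested here.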
pytensor/compile/builders.py (Outdated)

```python
self._lop_op_cache[disconnected_output_grads] = lop_overrides
lop_overrides.kwargs["on_unused_input"] = "ignore"
return lop_overrides
if isinstance(pb_overrides, OpFromGraph):
```
nit: I don't love the use of pb/pf in this PR. I'd prefer we just write the words out to be as clear as possible. This code is conceptual enough as it is, and people have wide screens.
Force-pushed from 2409f4a to 4b4c419
Closes #182

- `L_op` -> `pull_back`
- `R_op` -> `push_forward`
- `push_forward` now takes outputs as well (consistent signature with `L_op`)
- `push_forward` uses `DisconnectedType` instead of `None` (this used to match `L_op`, but they diverged after a while)
Claude generated
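A minimal illustration of why an explicit disconnected marker beats `None`, as the PR description advocates (the `Disconnected` class here is a hypothetical stand-in, not PyTensor's actual `DisconnectedType`):

```python
class Disconnected:
    """Sentinel marking an output with no gradient connection.

    An explicit type makes disconnection unambiguous: None could also
    mean "not computed" or leak in from a buggy implementation.
    """

    _instance = None

    def __new__(cls):
        # Singleton, so all disconnected markers compare identical.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

def materialize_tangents(tangents):
    # Downstream code can replace disconnected markers with explicit
    # zeros in one place, instead of special-casing None everywhere.
    return [0.0 if isinstance(t, Disconnected) else t for t in tangents]

print(materialize_tangents([1.5, Disconnected(), 2.0]))  # [1.5, 0.0, 2.0]
```

The same idea applies symmetrically to `pull_back`, which is why unifying the two conventions makes the signatures consistent.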