Improving tv_tensors wrap API for extensibility #9398

gabrielfruet wants to merge 16 commits into pytorch:main from
Conversation
@gabrielfruet Thanks a lot for the PR. I left two comments.

It seems that there are some test failures which need to be addressed. If you are still working on this PR, feel free to convert it to a draft. When the PR is ready to be reviewed, just convert it back.
@zy1git I think the PR is done. CI is red, but to my understanding the failures are unrelated to my changes.
zy1git left a comment
@gabrielfruet Thanks a lot for this PR! I reviewed and left some comments. Feel free to take a look and address them.
Co-authored-by: zy1git <zycoding1@gmail.com>
@zy1git Thanks for the comments. I was able to address them all.
Addresses #9333

Adopt method-based wrapping, enabling users to extend functionality in subclasses of TVTensor. This is the Pythonic approach, since many built-ins already rely on this pattern (e.g. `len`, `iter`, `next`, ...). This does not break backwards compatibility.
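To illustrate the idea, here is a minimal, self-contained sketch of the method-based wrapping pattern described above. The class names and the `_wrap_like` hook below are hypothetical stand-ins, not the actual torchvision `tv_tensors` implementation; the point is only the delegation pattern, where a module-level `wrap()` dispatches to an overridable instance method, just as `len()` delegates to `__len__()`.

```python
class TVTensor:
    """Minimal stand-in for a tv_tensors-style base class (illustrative only)."""

    def __init__(self, data):
        self.data = data

    def _wrap_like(self, data):
        # Hypothetical hook: subclasses override this to control how raw
        # data is re-wrapped into their own type, analogous to how
        # built-ins such as len() delegate to __len__().
        return type(self)(data)


def wrap(data, *, like):
    # Module-level entry point delegates to the instance method, so
    # subclasses extend behavior by overriding the method rather than
    # wrap() special-casing each subclass internally.
    return like._wrap_like(data)


class BoundingBoxes(TVTensor):
    """Example subclass carrying extra metadata (format)."""

    def __init__(self, data, format="XYXY"):
        super().__init__(data)
        self.format = format

    def _wrap_like(self, data):
        # Preserve subclass-specific metadata when wrapping new data.
        return BoundingBoxes(data, format=self.format)


boxes = BoundingBoxes([0, 0, 10, 10], format="XYWH")
wrapped = wrap([1, 1, 5, 5], like=boxes)
assert isinstance(wrapped, BoundingBoxes)
assert wrapped.format == "XYWH"
```

Because dispatch happens through the method, third-party subclasses get correct wrapping behavior without any change to `wrap()` itself, which is why this approach does not break backwards compatibility.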