# Add ndarray.dot #72
```diff
@@ -362,6 +362,7 @@ def reshape(self, *shape, order="C"):
     diagonal = _funcs.diagonal
     trace = _funcs.trace
+    dot = _funcs.dot
```
**Comment:** `vdot` missing also

**Comment:** Missing in numpy, yes :-)

```
In [23]: hasattr(np.array([1, 2, 3]), 'vdot')
```

**Comment:** lol. Let's still add it to be forward-looking.

**Comment:** OK, can do.
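The `hasattr` check quoted above can be reproduced directly. A small demonstration (assuming a current NumPy) that `vdot` exists only as a module-level function, not as an `ndarray` method:

```python
import numpy as np

a = np.array([1, 2, 3])

# ndarray has no vdot method, as the comment above notes:
print(hasattr(a, 'vdot'))   # False

# but the module-level function exists and works:
print(np.vdot(a, a))        # 14  (1*1 + 2*2 + 3*3)
```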
**Comment:** If we can do it "for free" it may be a fine thing to do? WDYT @rgommers?

**Comment:** There is no guiding principle, it's an accident of history. It's highly unlikely though that we'll add more methods to `ndarray`.
```diff
     ### sorting ###
```
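The diff above binds free functions from `_funcs` as class attributes. As a hypothetical, NumPy-free sketch (the `_dot` helper and the toy `ndarray` class here are illustrative, not the actual implementation), this works because a plain function assigned on a class becomes a method:

```python
def _dot(self, other):
    # hypothetical free function taking the array-like as its first argument
    return sum(x * y for x, y in zip(self.data, other.data))

class ndarray:
    def __init__(self, data):
        self.data = list(data)

    # binding the free function as a class attribute turns it into a
    # method, just like `dot = _funcs.dot` in the diff above
    dot = _dot

print(ndarray([1, 2, 3]).dot(ndarray([4, 5, 6])))   # 32
```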
**Comment:** conditional cast.

Also, `torch.matmul` is pretty much an alias for `np.dot` (and I believe `np.matmul`), so you should just need to implement one of them.

**Comment:** Devil's in the details (edge cases). I'll see if implementations can be merged, after all wrinkles are ironed out. For now there are xfails still.

From https://numpy.org/doc/stable/reference/generated/numpy.matmul.html
**Comment:** wow, ok, then `torch.matmul` is `np.dot`. You'll need to implement `np.matmul`, implementing that weird broadcasting behaviour by hand.
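The "weird broadcasting" difference described in the linked `numpy.matmul` docs shows up as soon as the operands are stacks of matrices: `matmul` broadcasts the leading dimensions, while `dot` takes a sum-product over the last axis of the first argument and the second-to-last axis of the second, outer-stacking everything else:

```python
import numpy as np

a = np.ones((2, 3, 4))
b = np.ones((2, 4, 5))

# matmul treats leading dimensions as a broadcast stack of matrices:
print(np.matmul(a, b).shape)   # (2, 3, 5)

# dot contracts last axis of a with second-to-last axis of b and
# outer-stacks the remaining axes:
print(np.dot(a, b).shape)      # (2, 3, 2, 5)
```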
**Comment:** Yes (sigh). Am going to postpone this a bit in favor of gh-70. Once that stabilizes, will turn back to matmul.

BTW, this 'signature' thing is gufuncs; I wonder if pytorch has an equivalent or we're facing the need to mirror the full gufunc machinery.
**Comment:** I don't quite understand what you mean by "this signature thing", but I happen to know that PyTorch doesn't have generalised ufuncs, so we'll need to replicate that machinery at some point.
**Comment:** "This signature thing": also referred to as that weird broadcasting a couple of messages above :-).

So yes, am going to postpone this to some point in the future.
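For reference, the "signature" under discussion is `np.matmul`'s gufunc signature, `(n?,k),(k,m?)->(n?,m?)` (from the NumPy docs): the `?`-marked core dimensions are optional, which lets 1-D operands participate as row or column vectors, with the promoted axis removed from the result. A quick illustration:

```python
import numpy as np

A = np.ones((3, 4))
v = np.ones(4)

# a 1-D second operand is treated as a column vector and the
# appended axis is removed from the result:
print(np.matmul(A, v).shape)   # (3,)

# two 1-D operands give a scalar inner product:
print(np.matmul(v, v))         # 4.0
```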