
Not overloading Base functions #19


Closed

mcabbott opened this issue Nov 13, 2019 · 15 comments

@mcabbott
Contributor

mcabbott commented Nov 13, 2019

This package currently overloads functions like Base.exp for the allocating version, while defining VML.exp! for the in-place version. Which is piracy, right?
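For illustration, here is a minimal sketch of the pattern being described (the names and the loop body are assumptions, not the package's actual code):

module VML

# In-place version lives under the package's own name:
function exp!(out::Vector{Float64}, a::Vector{Float64})
    @inbounds for i in eachindex(a)
        out[i] = Base.exp(a[i])   # stand-in for the actual MKL/VML call
    end
    return out
end

# The allocating version is attached to Base.exp, which is the piracy:
# this package owns neither the function (Base.exp) nor the type (Vector{Float64}).
Base.exp(a::Vector{Float64}) = exp!(similar(a), a)

end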

I'd like to suggest not doing this. Other packages don't do so by default: you call Yeppp.log and AppleAccelerate.exp instead. (And it's trivial to re-define the overloads for whichever functions you are actually using, in some script.)

Now might be a good time to fix this, since presumably few people are using this on Julia >= 1.0 right now; the update is in progress in #17.

@aminya
Member

aminya commented Nov 13, 2019

I prefer that we merge #17 first so we can develop it more easily. A PR to a PR isn't the best thing.

@mcabbott
Contributor Author

Sure, such a change shouldn't be part of something else. But perhaps it should happen before tagging for 1.0.

@mcabbott
Contributor Author

mcabbott commented Nov 13, 2019

Originally posted by @Crown421 in #17 (comment)

It does; when I updated the test suite I ran into some issues verifying that a given call actually belongs to Base or to VML, to make sure that I am not comparing a function to itself when checking output accuracy.
Ideally I would like the option that using VML extends Base and import VML doesn't, like one would expect, but unfortunately there seem to be some restrictions in the way Julia itself works here.
I think that is appropriate for its own issue.

I don't think this can be controlled by using vs. import. You can export a VML.exp, but then to use it unqualified relies on not having called Base.exp yet when you say using VML, which seems fragile. Better IMHO not to export this.

You could mess with convenience things. AppleAccelerate has a macro @replaceBase which acts on code. You could also have a macro which just adds the method for you, like @overload exp log.
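For concreteness, a rough sketch of what such an @overload macro could expand to; this is hypothetical, not the code in AppleAccelerate or in any branch here:

# For each name, add a Base method for arrays that forwards to the VML version.
# The user opts in explicitly, so no method is pirated by merely loading the package.
macro overload(funcs...)
    defs = map(funcs) do f
        :(Base.$f(a::Array) = VML.$f(a))
    end
    return esc(Expr(:block, defs...))
end

# Usage, in the user's own code:
#   @overload exp log
# after which exp([1.0, 2.0]) dispatches to VML.exp.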

@Crown421
Collaborator

Crown421 commented Nov 13, 2019

I don't think this can be controlled by using vs. import. You can export a VML.exp, but then to use it unqualified relies on not having called Base.exp yet when you say using VML, which seems fragile. Better IMHO not to export this.

Yes, precisely, that is what I ran into as well and decided not to mess with it at the moment.

You could mess with convenience things. AppleAccelerate has a macro @replaceBase which acts on code. You could also have a macro which just adds the method for you, like @overload exp log.

This seems like a very nice idea, I was thinking about something like this but had no idea how to do it nicely.

@aminya
Member

aminya commented Nov 13, 2019

However, I think we are missing the idea that broadcasting is something separate. I agree with having some macro that converts broadcasts into matrix operations, but at the same time, I think we need to have these simple matrix APIs. There is no harm in having functions that accept arrays. Many such functions are found in Base.

@Crown421
Collaborator

I don't think the idea is to make broadcasts into calls to VML. In fact I would be against it: when using the broadcasting syntax, the result should be broadcasting. Otherwise the code gets less readable.

I feel it is "julian" to extend Base functions, as it's explicitly a feature of the language (to my understanding). This is part of why I left it like that, instead of defining everything as VMLsin or so. But having a choice is nice, if it can be done nicely.

@mcabbott
Contributor Author

Yes, I think a macro which goes looking for dots would be too much: not trivial to write, and confusing to use. Here's a first stab at the much simpler @overload idea:

Crown421/VML.jl@master...mcabbott:nopiracy

julia> module VML
       exp(a) = "yes" 
       end;

julia> @overload exp 

julia> exp([1,2,3]) 
"yes"

julia> exp(1)   
2.718281828459045

julia> exp.([1,2,3])
3-element Array{Float64,1}:
  2.718281828459045
  7.38905609893065 
 20.085536923187668

@Crown421
Collaborator

Crown421 commented Nov 13, 2019

Looks great. If I get to it before the pull request gets approved I will try to add it.
Possibly need to adjust the tests as well when using this.

@mcabbott
Contributor Author

we need to have these simple matrix APIs. There is no harm in having functions that accept arrays.

Yes, totally. My only objection is to having them as methods of Base.exp etc. This means that if some package you depend on happens to load VML, then your usage of exp elsewhere may suddenly change. It's not severe yet, in that you would have to be relying on exp(::Array) being an error, or else overloading this yourself. But still I think this is best avoided. And VML.exp is a perfectly fine name.
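As a concrete (assumed) illustration of that hazard:

exp([1.0, 2.0])   # with plain Base this throws a MethodError ...

# ... but once any package in the environment defines
#     Base.exp(a::Array) = ...
# the same line silently returns a vector instead of erroring.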

@mcabbott
Contributor Author

need to adjust the tests as well

Yes, will try to take a look later. Perhaps simplest that this be another PR to merge just after yours, before tagging (or registering?). My branch is based off yours.

@Crown421
Collaborator

Actually, I think the change needed for the tests is not too bad; I had a version to that effect briefly.

@aminya
Member

aminya commented Nov 13, 2019

Oh, I see, so the whole point is to move it to a package and let the user call @overload? 😄 I was misunderstanding this.

@mcabbott
Contributor Author

I thought AppleAccelerate.@replaceBase did something fancier, but in fact it's used just like @overload. Except that they thought about it for more than one minute and realised there are some two-arg functions, etc.
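For example, a hedged sketch of why two-argument functions need their own treatment (assuming a VML-style package exposes two-argument versions; this is not AppleAccelerate's actual code):

# One-argument functions need a single forwarding method:
Base.exp(a::Array{Float64}) = VML.exp(a)

# Two-argument functions like atan(y, x) or hypot need a different shape,
# which a naive one-argument @overload would miss:
Base.atan(y::Array{Float64}, x::Array{Float64}) = VML.atan(y, x)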

@Crown421
Collaborator

I just had a look at your code, it looks great.

@aminya
Member

aminya commented Nov 26, 2019

I think we can close this. Thank you for your involvement!
Closed by #17
