Make eval great again #2825

Merged · 6 commits · Jun 28, 2024
Conversation

ChrisRackauckas (Member)


Okay, this one needs a little bit of explanation. So in the stone ages we used `eval`.

https://github.com/SciML/ModelingToolkit.jl/blame/f9a837c775e15495921947f4995cf7ab97b5a45a/src/systems/diffeqs/abstractodesystem.jl#L122

All was good. But then that caused world-age issues. All was bad.
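
For anyone who hasn't hit it: the world-age problem means a function defined by `eval` at runtime is "too new" to be called from already-compiled code without `Base.invokelatest`. A minimal standalone illustration (not MTK code):

```julia
# A function built with `eval` inside another function cannot be called
# directly from that function's (older) world age.
function build_and_call(ex)
    f = eval(ex)                      # defines a new anonymous function at runtime
    # f(1.0)                          # would throw: "method too new to be called..."
    return Base.invokelatest(f, 1.0)  # workaround: dispatch in the latest world
end

build_and_call(:(x -> 2x))  # returns 2.0
```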

But then GeneralizedGenerated came along, which was later replaced by RuntimeGeneratedFunctions (RGFs), and these replaced our use of `eval`. And all was good.

https://github.com/SciML/ModelingToolkit.jl/blame/8c6b8249aed97f18a6de471cb5544689befd58e4/src/systems/diffeqs/abstractodesystem.jl#L131

This had no issues with world-age, so you can now use everything more smoothly.

However, RGFs didn't play nicely with precompilation. That was fixed by caching things in a dictionary of the module, and by adding `eval_module`, a required argument in this context so that the user can properly specify which module the functions will be evaluated into. All was good again.
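
For context, a minimal sketch of the RuntimeGeneratedFunctions.jl pattern with the module passed explicitly (not MTK's exact call sites):

```julia
using RuntimeGeneratedFunctions
RuntimeGeneratedFunctions.init(@__MODULE__)     # set up this module's RGF cache

eval_module = @__MODULE__                       # where the generated code should live
ex = :((u, p, t) -> p[1] * u)                   # a generated function expression
f = @RuntimeGeneratedFunction(eval_module, ex)  # callable immediately, no world-age issues
f(1.0, (2.0,), 0.0)                             # == 2.0
```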

But then Julia v1.9 came around with package images. It turns out that package images attempt to cache the binaries, yay! But the data and information for an RGF is not in the binary. Oh no.

SciML/DiffEqProblemLibrary.jl@3444ffe

```
Remove precompilation for MTK support
Can revert later when this is fixed.
```

This was noticed in DiffEqProblemLibrary, but I subsequently forgot about it as all of the other v1.9 release chaos happened.

So now we are trying to precompile models and failing. What did we do wrong?

Well... there's a hard fix... and there's an easy fix. And the easy fix is nice and robust and is guaranteed by Julia's compiler team itself to work. Awesome, so let's do that. That fix is... use `eval`.

Early in the package's history, we added the argument `eval_expression` to control whether to take the MTK-generated function expression and `eval` it. `eval` was then replaced with GeneralizedGenerated and then RGFs. Meanwhile, getting the expression itself was superseded by ODEFunctionExpr, ODEProblemExpr, etc. Not only that, but the expressions were no longer directly returned: they were put inside of another function, and that wrapper function exists to add dispatches for handling splits etc.

https://github.com/SciML/ModelingToolkit.jl/blob/v9.20.0/src/systems/diffeqs/abstractodesystem.jl#L333-L337

```julia
    f_oop, f_iip = eval_expression ?
                   (drop_expr(@RuntimeGeneratedFunction(eval_module, ex)) for ex in f_gen) :
                   f_gen
    f(u, p, t) = f_oop(u, p, t)
    f(du, u, p, t) = f_iip(du, u, p, t)
    f(u, p::Tuple{Vararg{Number}}, t) = f_oop(u, p, t)
    f(du, u, p::Tuple{Vararg{Number}}, t) = f_iip(du, u, p, t)
    f(u, p::Tuple, t) = f_oop(u, p..., t)
    f(du, u, p::Tuple, t) = f_iip(du, u, p..., t)
    f(u, p::MTKParameters, t) = f_oop(u, p..., t)
    f(du, u, p::MTKParameters, t) = f_iip(du, u, p..., t)
```

But now this reads like nonsense. `eval_expression=false` returns a Julia expression, which is then put inside of a Julia function, and if you actually try to call it you get an error that `Expr` isn't callable. All the while, `eval_expression=true` doesn't actually `eval` the expression, because, remember, it used to `eval` it but that was replaced with RGFs.
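
Concretely, a toy reproduction of that broken path (not MTK code, just the shape of the bug):

```julia
# With eval_expression = false, f_oop stayed an Expr, and the wrapper blindly
# forwarded to it, so calling the wrapper errored.
f_oop = :((u, p, t) -> p[1] * u)  # the "generated function" is still an Expr
f(u, p, t) = f_oop(u, p, t)       # wrapper closes over the Expr
# f(1.0, (2.0,), 0.0)             # MethodError: objects of type Expr are not callable
```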

So we have an `eval_expression` keyword argument that is undocumented and also pretty much nonsense, and I want to add a feature where, instead of using RGFs, I `eval` the function... :thinking_face:

So I repurposed this keyword argument once more so that it now means what it originally meant: `eval_expression=true` means we `eval` the expression. That means `eval_expression=false` means we do the RGF thing, and that should be the default since it makes world-age work out. But we also already have `eval_module`, so we just `eval` into whatever module the user gives. If the user evals into their current module, then the functions in the generated code are exactly standard Julia functions, and it all works with package images.
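
In code, the intended semantics look roughly like this (a hedged sketch with a hypothetical helper name; the actual MTK internals may differ):

```julia
using RuntimeGeneratedFunctions

# eval_expression = true  -> plain `eval` into `eval_module`; the results are
#                            ordinary functions, so package images can cache them.
# eval_expression = false -> RuntimeGeneratedFunctions (the default), which
#                            avoid world-age issues without `eval`.
function compile_generated(f_gen, eval_expression::Bool, eval_module::Module)
    if eval_expression
        return Tuple(Base.eval(eval_module, ex) for ex in f_gen)
    else
        # assumes RuntimeGeneratedFunctions.init(eval_module) has been called
        return Tuple(@RuntimeGeneratedFunction(eval_module, ex) for ex in f_gen)
    end
end
```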

So... what about the tests? Well, we had tests on this, but that's just as interesting a story. If we flip the default, then the tests only test the RGF stuff, which they are then set up to do... actually correctly, in a sense. The no-eval test was simply checking the parent module:

```julia
@test parentmodule(typeof(ODEPrecompileTest.f_noeval_good.f.f_oop).parameters[2]) ==
      ODEPrecompileTest
```

and indeed the parent module is the right module; it's just that the object living there is an `Expr`, which is not really what we were looking for. So in a bizarre sense the tests actually passed for `Expr`s that couldn't actually do anything. So I just added tests for the eval path to the precompile test.
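
For the eval path, the analogous check is simpler, since the captured object is an ordinary function whose type is owned by the module it was evaluated into. A hedged sketch (the `f_eval_good` name and field path are hypothetical, mirroring the existing test):

```julia
# An eval'd generated function is a plain function defined in ODEPrecompileTest,
# so its type's parent module can be checked directly (no `.parameters[2]`).
@test parentmodule(typeof(ODEPrecompileTest.f_eval_good.f.f_oop)) == ODEPrecompileTest
```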

Now it turns out the "precompile test" is actually just a test that we can put things in a module and it can work. It does not test the package image building, which is why the RGFs did not fail there. I am unsure how to do this on CI.

But, I think the tests are likely good enough for now, and this gives a good solution to folks wanting to precompile. We should make sure it actually gets documented this time around.
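
For the docs, the user-facing story would look something like the sketch below: a downstream package that wants its model in the package image opts in with `eval_expression = true` and `eval_module = @__MODULE__` (module and model names here are hypothetical; the keyword semantics follow this PR):

```julia
module MyPrecompiledModels

using ModelingToolkit
using ModelingToolkit: t_nounits as t, D_nounits as D

@variables x(t)
@named sys = ODESystem([D(x) ~ -x], t)
sys = structural_simplify(sys)

# eval into this module so the generated functions are ordinary functions of
# MyPrecompiledModels and get cached in the package image; the default
# (eval_expression = false) keeps using RGFs.
const prob = ODEProblem(sys, [x => 1.0], (0.0, 1.0);
                        eval_expression = true, eval_module = @__MODULE__)

end
```
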
ChrisRackauckas merged commit 80def4c into master on Jun 28, 2024
22 of 23 checks passed
ChrisRackauckas deleted the eval branch on June 28, 2024 at 01:44