
improve type inference in caching #396


Merged
jishnub merged 3 commits into JuliaApproximation:master from jishnub:cachinginf on Mar 14, 2023

Conversation

@jishnub (Member) commented on Mar 13, 2023

With this, the following (among others) becomes a small union:

julia> C = Conversion(Chebyshev(), Ultraspherical(1.0));

julia> @code_warntype ApproxFunBase.CachedOperator(BandedMatrix, C, padding=false)
MethodInstance for Core.kwcall(::NamedTuple{(:padding,), Tuple{Bool}}, ::Type{ApproxFunBase.CachedOperator}, ::Type{BandedMatrix}, ::ApproxFunBase.ConcreteConversion{Chebyshev{ChebyshevInterval{Float64}, Float64}, Ultraspherical{Float64, ChebyshevInterval{Float64}, Float64}, Float64})
  from kwcall(::Any, ::Type{ApproxFunBase.CachedOperator}, ::Type{BandedMatrix}, op::Operator) @ ApproxFunBase ~/Dropbox/JuliaPackages/ApproxFunBase/src/Caching/banded.jl:1
Arguments
  _::Core.Const(Core.kwcall)
  @_2::NamedTuple{(:padding,), Tuple{Bool}}
  @_3::Type{ApproxFunBase.CachedOperator}
  @_4::Type{BandedMatrix}
  op::ApproxFunBase.ConcreteConversion{Chebyshev{ChebyshevInterval{Float64}, Float64}, Ultraspherical{Float64, ChebyshevInterval{Float64}, Float64}, Float64}
Locals
  padding::Union{}
  @_7::Bool
Body::Union{ApproxFunBase.CachedOperator{Float64, BandedMatrix{Float64, Matrix{Float64}, Base.OneTo{Int64}}, ApproxFunBase.ConcreteConversion{Chebyshev{ChebyshevInterval{Float64}, Float64}, Ultraspherical{Float64, ChebyshevInterval{Float64}, Float64}, Float64}, Chebyshev{ChebyshevInterval{Float64}, Float64}, Ultraspherical{Float64, ChebyshevInterval{Float64}, Float64}, Tuple{Int64, Int64}}, ApproxFunBase.CachedOperator{Float64, BandedMatrix{Float64, Matrix{Float64}, Base.OneTo{Int64}}, ApproxFunBase.ConcreteConversion{Chebyshev{ChebyshevInterval{Float64}, Float64}, Ultraspherical{Float64, ChebyshevInterval{Float64}, Float64}, Float64}, Chebyshev{ChebyshevInterval{Float64}, Float64}, Ultraspherical{Float64, ChebyshevInterval{Float64}, Float64}, Tuple{Int64, Infinities.InfiniteCardinal{0}}}}
1 ──       Core.NewvarNode(:(padding))
│          Core.NewvarNode(:(@_7))
│    %3  = Core.isdefined(@_2, :padding)::Core.Const(true)
└───       goto #6 if not %3
2 ── %5  = Core.getfield(@_2, :padding)::Bool
│    %6  = (%5 isa ApproxFunBase.Bool)::Core.Const(true)
└───       goto #4 if not %6
3 ──       goto #5
4 ──       Core.Const(:(%new(Core.TypeError, Symbol("keyword argument"), :padding, ApproxFunBase.Bool, %5)))
└───       Core.Const(:(Core.throw(%9)))
5 ┄─       (@_7 = %5)
└───       goto #7
6 ──       Core.Const(:(@_7 = false))
7 ┄─ %14 = @_7::Bool
│    %15 = (:padding,)::Core.Const((:padding,))
│    %16 = Core.apply_type(Core.NamedTuple, %15)::Core.Const(NamedTuple{(:padding,)})
│    %17 = Base.structdiff(@_2, %16)::Core.Const(NamedTuple())
│    %18 = Base.pairs(%17)::Core.Const(Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}())
│    %19 = Base.isempty(%18)::Core.Const(true)
└───       goto #9 if not %19
8 ──       goto #10
9 ──       Core.Const(:(Base.kwerr(@_2, @_3, @_4, op)))
10 ─ %23 = ApproxFunBase.:(var"#CachedOperator#586")(%14, @_3, @_4, op)::Union{ApproxFunBase.CachedOperator{Float64, BandedMatrix{Float64, Matrix{Float64}, Base.OneTo{Int64}}, ApproxFunBase.ConcreteConversion{Chebyshev{ChebyshevInterval{Float64}, Float64}, Ultraspherical{Float64, ChebyshevInterval{Float64}, Float64}, Float64}, Chebyshev{ChebyshevInterval{Float64}, Float64}, Ultraspherical{Float64, ChebyshevInterval{Float64}, Float64}, Tuple{Int64, Int64}}, ApproxFunBase.CachedOperator{Float64, BandedMatrix{Float64, Matrix{Float64}, Base.OneTo{Int64}}, ApproxFunBase.ConcreteConversion{Chebyshev{ChebyshevInterval{Float64}, Float64}, Ultraspherical{Float64, ChebyshevInterval{Float64}, Float64}, Float64}, Chebyshev{ChebyshevInterval{Float64}, Float64}, Ultraspherical{Float64, ChebyshevInterval{Float64}, Float64}, Tuple{Int64, Infinities.InfiniteCardinal{0}}}}
└───       return %23
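
For reference, a quick way to verify this kind of improvement without scanning the printed IR is to ask inference for the return type directly. The snippet below is a sketch along those lines, not part of this PR; it assumes Chebyshev and Ultraspherical come from ApproxFunOrthogonalPolynomials (or ApproxFun) and that BandedMatrices is loaded, matching the example above.

# A minimal sketch (not part of the PR): check the inferred return type
# programmatically instead of reading @code_warntype output by eye.
using ApproxFunBase, ApproxFunOrthogonalPolynomials, BandedMatrices

C = Conversion(Chebyshev(), Ultraspherical(1.0))

# Ask inference for the return type of the keyword call shown above.
Ts = Base.return_types(
    op -> ApproxFunBase.CachedOperator(BandedMatrix, op; padding=false),
    (typeof(C),),
)

T = only(Ts)
# With this change, T should be a small Union of two concrete CachedOperator types
# (differing only in the final size-tuple parameter), which Julia union-splits cheaply.
@assert T isa Union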

@codecov (bot) commented on Mar 13, 2023

Codecov Report

Patch coverage: 88.88% and project coverage change: +3.60% 🎉

Comparison is base (bd77b87) 68.07% compared to head (557c998) 71.68%.

Additional details and impacted files
@@            Coverage Diff             @@
##           master     #396      +/-   ##
==========================================
+ Coverage   68.07%   71.68%   +3.60%     
==========================================
  Files          80       80              
  Lines        8417     8454      +37     
==========================================
+ Hits         5730     6060     +330     
+ Misses       2687     2394     -293     
Impacted Files Coverage Δ
src/ApproxFunBase.jl 83.33% <ø> (+12.50%) ⬆️
src/Spaces/Spaces.jl 65.43% <ø> (ø)
src/Operators/SubOperator.jl 83.16% <60.00%> (+4.45%) ⬆️
src/Caching/banded.jl 68.80% <100.00%> (+0.28%) ⬆️
src/Caching/bandedblockbanded.jl 96.77% <100.00%> (ø)
src/Caching/blockbanded.jl 81.95% <100.00%> (+0.27%) ⬆️
src/Domain.jl 69.11% <100.00%> (+5.88%) ⬆️
src/Multivariate/LowRankFun.jl 69.96% <100.00%> (+1.26%) ⬆️

... and 23 files with indirect coverage changes


@jishnub merged commit 2de6b4c into JuliaApproximation:master on Mar 14, 2023
@jishnub deleted the cachinginf branch on March 14, 2023 04:07