Commit 1c46475
Flesh out docs
1 parent af673c5 commit 1c46475

2 files changed: +205 −3 lines

README.md

Lines changed: 1 addition & 0 deletions
@@ -3,6 +3,7 @@ A Julia package for classical orthogonal polynomials and expansions

[![Build Status](https://github.com/JuliaApproximation/ClassicalOrthogonalPolynomials.jl/workflows/CI/badge.svg)](https://github.com/JuliaApproximation/ClassicalOrthogonalPolynomials.jl/actions)
[![codecov](https://codecov.io/gh/JuliaApproximation/ClassicalOrthogonalPolynomials.jl/branch/master/graph/badge.svg)](https://codecov.io/gh/JuliaApproximation/ClassicalOrthogonalPolynomials.jl)
[![](https://img.shields.io/badge/docs-latest-blue.svg)](https://JuliaApproximation.github.io/ClassicalOrthogonalPolynomials.jl/latest)

This package implements classical orthogonal polynomials.

docs/src/index.md

Lines changed: 204 additions & 3 deletions
@@ -1,16 +1,217 @@
# ClassicalOrthogonalPolynomials.jl

A Julia package for classical orthogonal polynomials and expansions

## Definitions

We follow the [Digital Library of Mathematical Functions](https://dlmf.nist.gov/18.3),
which defines the following classical orthogonal polynomials:

1. Legendre: $P_n(x)$
2. Chebyshev (1st kind, 2nd kind): $T_n(x)$, $U_n(x)$
3. Ultraspherical: $C_n^{(λ)}(x)$
4. Jacobi: $P_n^{(a,b)}(x)$
5. Laguerre: $L_n^{(α)}(x)$
6. Hermite: $H_n(x)$

## Evaluation

The simplest usage of this package is to evaluate classical
orthogonal polynomials:
```jldoctest
julia> using ClassicalOrthogonalPolynomials

julia> n, x = 5, 0.1;

julia> legendrep(n, x) # P_n(x)
0.17882875

julia> chebyshevt(n, x) # T_n(x) == cos(n*acos(x))
0.48016

julia> chebyshevu(n, x) # U_n(x) == sin((n+1)*acos(x))/sin(acos(x))
0.56832

julia> λ = 0.3; ultrasphericalc(n, λ, x) # C_n^(λ)(x)
0.08578714248

julia> a,b = 0.1,0.2; jacobip(n, a, b, x) # P_n^(a,b)(x)
0.17459116797117194

julia> laguerrel(n, x) # L_n(x)
0.5483540833333331

julia> α = 0.1; laguerrel(n, α, x) # L_n^(α)(x)
0.732916666666666

julia> hermiteh(n, x) # H_n(x)
11.84032
```
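These doctest values can be cross-checked against an independent implementation. The sketch below (not part of this package) uses NumPy's `numpy.polynomial` module, which follows the same DLMF conventions:

```python
import math

from numpy.polynomial import chebyshev, hermite, laguerre, legendre

x = 0.1
e5 = [0, 0, 0, 0, 0, 1]  # coefficient vector selecting the degree-5 basis polynomial

p5 = legendre.legval(x, e5)    # ≈ 0.17882875, as in the legendrep doctest above
t5 = chebyshev.chebval(x, e5)  # ≈ 0.48016, and equals cos(5*acos(0.1))
l5 = laguerre.lagval(x, e5)    # ≈ 0.5483540833333331
h5 = hermite.hermval(x, e5)    # ≈ 11.84032 (physicists' convention, matching hermiteh)
print(p5, t5, l5, h5)
```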

## Continuum arrays

For expansions, recurrence relationships, and other operations linked with linear equations, it is useful to treat the families of orthogonal
polynomials as _continuum arrays_, as implemented in [ContinuumArrays.jl](https://github.com/JuliaApproximation/ContinuumArrays.jl). The continuum arrays implementation is accessed as follows:
```jldoctest
julia> T = ChebyshevT() # Or just Chebyshev()
ChebyshevT()

julia> axes(T) # [-1,1] by 1:∞
(Inclusion(-1.0..1.0 (Chebyshev)), OneToInf())

julia> T[x, n+1] # T_n(x) == cos(n*acos(x))
0.48016
```
We can thereby access many points and indices efficiently using array-like language:
```jldoctest
julia> x = range(-1, 1; length=1000);

julia> T[x, 1:1000] # [T_j(x[k]) for k=1:1000, j=0:999]
1000×1000 Matrix{Float64}:
 1.0  -1.0       1.0       -1.0       1.0       -1.0       1.0       …  -1.0       1.0       -1.0       1.0       -1.0
 1.0  -0.997998  0.992     -0.98203   0.968128  -0.95035   0.928766     -0.99029   0.979515  -0.964818  0.946258  -0.92391
 1.0  -0.995996  0.984016  -0.964156  0.936575  -0.901494  0.859194     -0.448975  0.367296  -0.282676  0.195792  -0.107341
 1.0  -0.993994  0.976048  -0.946378  0.90534   -0.853427  0.791262      0.660163  -0.738397  0.807761  -0.867423  0.916664
 1.0  -0.991992  0.968096  -0.928695  0.874421  -0.806141  0.72495      -0.942051  0.892136  -0.827934  0.750471  -0.660989
 1.0  -0.98999   0.96016   -0.911108  0.843816  -0.75963   0.660237  …   0.891882  -0.946786  0.982736  -0.999011  0.995286
 1.0  -0.987988  0.952241  -0.893616  0.813524  -0.713888  0.597101      0.905338  -0.828835  0.73242   -0.618409  0.489542
 ⋮                                              ⋱                                  ⋮
 1.0  0.987988  0.952241  0.893616  0.813524  0.713888  0.597101        -0.905338  -0.828835  -0.73242   -0.618409  -0.489542
 1.0  0.98999   0.96016   0.911108  0.843816  0.75963   0.660237        -0.891882  -0.946786  -0.982736  -0.999011  -0.995286
 1.0  0.991992  0.968096  0.928695  0.874421  0.806141  0.72495      …   0.942051  0.892136   0.827934   0.750471   0.660989
 1.0  0.993994  0.976048  0.946378  0.90534   0.853427  0.791262        -0.660163  -0.738397  -0.807761  -0.867423  -0.916664
 1.0  0.995996  0.984016  0.964156  0.936575  0.901494  0.859194         0.448975  0.367296   0.282676   0.195792   0.107341
 1.0  0.997998  0.992     0.98203   0.968128  0.95035   0.928766         0.99029   0.979515   0.964818   0.946258   0.92391
 1.0  1.0       1.0       1.0       1.0       1.0       1.0              1.0       1.0        1.0        1.0        1.0
```

## Expansions

We view a function expansion in say Chebyshev polynomials in terms of continuum arrays as follows:
$$
f(x) = \sum_{k=0}^∞ c_k T_k(x) = \begin{bmatrix}T_0(x) | T_1(x) | … \end{bmatrix}
\begin{bmatrix}c_0\\ c_1 \\ \vdots \end{bmatrix} = T[x,:] * 𝐜
$$
To be more precise, we think of functions as continuum-vectors. Here is a simple example:
```jldoctest
julia> f = T * [1; 2; 3; zeros(∞)]; # T_0(x) + 2T_1(x) + 3T_2(x)

julia> f[0.1]
-1.74
```
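Since the tail coefficients vanish, this `f` is just a degree-2 polynomial, so the value is easy to confirm with any Chebyshev evaluator; for instance with NumPy (an independent check, not this package's API):

```python
from numpy.polynomial import chebyshev

# Evaluate 1*T_0 + 2*T_1 + 3*T_2 at 0.1, mirroring f = T * [1; 2; 3; zeros(∞)]
val = chebyshev.chebval(0.1, [1, 2, 3])
print(val)  # ≈ -1.74, matching f[0.1]
```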
To find the coefficients for a given function we consider this as the problem of finding $𝐜$
such that $T*𝐜 == f$, that is:
```julia
julia> T \ f
vcat(3-element Vector{Float64}, ℵ₀-element FillArrays.Zeros{Float64, 1, Tuple{InfiniteArrays.OneToInf{Int64}}} with indices OneToInf()) with indices OneToInf():
 1.0
 2.0
 3.0
 ⋮
```
For a function given only pointwise we broadcast over `x`, e.g., the following are the coefficients of $\exp(x)$:
```julia
julia> x = axes(T, 1);

julia> c = T \ exp.(x)
vcat(14-element Vector{Float64}, ℵ₀-element FillArrays.Zeros{Float64, 1, Tuple{InfiniteArrays.OneToInf{Int64}}} with indices OneToInf()) with indices OneToInf():
 1.2660658777520084
 1.1303182079849703
 0.27149533953407656
 0.04433684984866379
 0.0054742404420936785
 0.0005429263119139232
 4.497732295427654e-5
 ⋮

julia> f = T*c; f[0.1] # ≈ exp(0.1)
1.1051709180756477
```
With a little cheeky usage of Julia's order-of-operations this can be written succinctly as:
```julia
julia> f = T / T \ exp.(x);

julia> f[0.1]
1.1051709180756477
```
(Or for more clarity just write `T * (T \ exp.(x))`.)

## Jacobi matrices

Orthogonal polynomials satisfy well-known three-term recurrences:
$$
x p_n(x) = c_{n-1} p_{n-1}(x) + a_n p_n(x) + b_n p_{n+1}(x).
$$
In continuum-array language this has the form of a commuting relationship:
$$
x \begin{bmatrix} p_0 | p_1 | \cdots \end{bmatrix} = \begin{bmatrix} p_0 | p_1 | \cdots \end{bmatrix} \begin{bmatrix} a_0 & c_0 \\ b_0 & a_1 & c_1 \\ & b_1 & a_2 & \ddots \\ &&\ddots & \ddots \end{bmatrix}
$$
We can therefore find the Jacobi matrix naturally as follows:
```julia
julia> T \ (x .* T)
ℵ₀×ℵ₀ LazyBandedMatrices.Tridiagonal{Float64, LazyArrays.ApplyArray{Float64, 1, typeof(vcat), Tuple{Float64, FillArrays.Fill{Float64, 1, Tuple{InfiniteArrays.OneToInf{Int64}}}}}, FillArrays.Zeros{Float64, 1, Tuple{InfiniteArrays.OneToInf{Int64}}}, FillArrays.Fill{Float64, 1, Tuple{InfiniteArrays.OneToInf{Int64}}}} with indices OneToInf()×OneToInf():
 0.0  0.5   ⋅    ⋅    ⋅    ⋅    ⋅   …
 1.0  0.0  0.5   ⋅    ⋅    ⋅    ⋅   …
  ⋅   0.5  0.0  0.5   ⋅    ⋅    ⋅   …
  ⋅    ⋅   0.5  0.0  0.5   ⋅    ⋅   …
  ⋅    ⋅    ⋅   0.5  0.0  0.5   ⋅   …
  ⋅    ⋅    ⋅    ⋅   0.5  0.0  0.5  …
  ⋅    ⋅    ⋅    ⋅    ⋅   0.5  0.0  …
 ⋮                        ⋱
```
Alternatively, just call `jacobimatrix(T)` (noting it's the transpose of the more traditional convention).

## Derivatives

The derivatives of classical orthogonal polynomials are also classical OPs, and this can be seen as follows:
```julia
julia> U = ChebyshevU();

julia> D = Derivative(x);

julia> U\D*T
ℵ₀×ℵ₀ BandedMatrix{Float64} with bandwidths (-1, 1) with data 1×ℵ₀ adjoint(::InfiniteArrays.InfStepRange{Float64, Float64}) with eltype Float64 with indices Base.OneTo(1)×OneToInf() with indices OneToInf()×OneToInf():
 ⋅  1.0   ⋅    ⋅    ⋅    ⋅    ⋅    ⋅   …
 ⋅   ⋅   2.0   ⋅    ⋅    ⋅    ⋅    ⋅   …
 ⋅   ⋅    ⋅   3.0   ⋅    ⋅    ⋅    ⋅   …
 ⋅   ⋅    ⋅    ⋅   4.0   ⋅    ⋅    ⋅   …
 ⋅   ⋅    ⋅    ⋅    ⋅   5.0   ⋅    ⋅   …
 ⋅   ⋅    ⋅    ⋅    ⋅    ⋅   6.0   ⋅   …
 ⋅   ⋅    ⋅    ⋅    ⋅    ⋅    ⋅   7.0  …
 ⋮                            ⋱
```
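The superdiagonal 1, 2, 3, … encodes the classical identity $T_n'(x) = n U_{n-1}(x)$. A numerical spot check using NumPy's Chebyshev derivative together with the trigonometric form of $U_n$ (again an independent check, not the package API):

```python
import math

from numpy.polynomial import chebyshev

def U(n, x):
    """Second-kind Chebyshev via U_n(x) = sin((n+1)*acos(x))/sin(acos(x))."""
    t = math.acos(x)
    return math.sin((n + 1) * t) / math.sin(t)

x = 0.1
for n in range(1, 8):
    e_n = [0] * n + [1]  # coefficients of T_n in the Chebyshev basis
    dT = chebyshev.chebval(x, chebyshev.chebder(e_n))  # T_n'(x)
    assert math.isclose(dT, n * U(n - 1, x), abs_tol=1e-12)
print("T_n' == n*U_{n-1} verified")
```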
Similarly, the derivatives of _weighted_ classical OPs are weighted classical OPs:
```julia
julia> Weighted(T)\D*Weighted(U)
ℵ₀×ℵ₀ BandedMatrix{Float64} with bandwidths (1, -1) with data 1×ℵ₀ adjoint(::InfiniteArrays.InfStepRange{Float64, Float64}) with eltype Float64 with indices Base.OneTo(1)×OneToInf() with indices OneToInf()×OneToInf():
   ⋅     ⋅     ⋅     ⋅     ⋅     ⋅   …
 -1.0    ⋅     ⋅     ⋅     ⋅     ⋅   …
   ⋅   -2.0    ⋅     ⋅     ⋅     ⋅   …
   ⋅     ⋅   -3.0    ⋅     ⋅     ⋅   …
   ⋅     ⋅     ⋅   -4.0    ⋅     ⋅   …
   ⋅     ⋅     ⋅     ⋅   -5.0    ⋅   …
   ⋅     ⋅     ⋅     ⋅     ⋅   -6.0  …
  ⋮                            ⋱
```
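Here the subdiagonal −1, −2, … reflects the identity $\frac{d}{dx}\left[\sqrt{1-x^2}\,U_n(x)\right] = -(n+1)\frac{T_{n+1}(x)}{\sqrt{1-x^2}}$. A finite-difference spot check (the step $h = 10^{-6}$ and point $x = 0.2$ are arbitrary choices):

```python
import math

def T(n, x):
    return math.cos(n * math.acos(x))

def wU(n, x):
    # Weighted second-kind polynomial: sqrt(1-x^2)*U_n(x) == sin((n+1)*acos(x))
    return math.sin((n + 1) * math.acos(x))

x, h = 0.2, 1e-6
for n in range(6):
    fd = (wU(n, x + h) - wU(n, x - h)) / (2 * h)           # central difference
    exact = -(n + 1) * T(n + 1, x) / math.sqrt(1 - x * x)  # the -(n+1) band above
    assert math.isclose(fd, exact, rel_tol=1e-7)
print("weighted derivative identity verified")
```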

## Other recurrence relationships

Many other sparse recurrence relationships are implemented. Here's one:
```julia
julia> U\T
ℵ₀×ℵ₀ BandedMatrix{Float64} with bandwidths (0, 2) with data vcat(1×ℵ₀ FillArrays.Fill{Float64, 2, Tuple{Base.OneTo{Int64}, InfiniteArrays.OneToInf{Int64}}} with indices Base.OneTo(1)×OneToInf(), 1×ℵ₀ FillArrays.Zeros{Float64, 2, Tuple{Base.OneTo{Int64}, InfiniteArrays.OneToInf{Int64}}} with indices Base.OneTo(1)×OneToInf(), hcat(1×1 Ones{Float64}, 1×ℵ₀ FillArrays.Fill{Float64, 2, Tuple{Base.OneTo{Int64}, InfiniteArrays.OneToInf{Int64}}} with indices Base.OneTo(1)×OneToInf()) with indices Base.OneTo(1)×OneToInf()) with indices Base.OneTo(3)×OneToInf() with indices OneToInf()×OneToInf():
 1.0  0.0  -0.5    ⋅     ⋅     ⋅     ⋅   …
  ⋅   0.5   0.0  -0.5    ⋅     ⋅     ⋅   …
  ⋅    ⋅    0.5   0.0  -0.5    ⋅     ⋅   …
  ⋅    ⋅     ⋅    0.5   0.0  -0.5    ⋅   …
  ⋅    ⋅     ⋅     ⋅    0.5   0.0  -0.5  …
  ⋅    ⋅     ⋅     ⋅     ⋅    0.5   0.0  …
  ⋅    ⋅     ⋅     ⋅     ⋅     ⋅    0.5  …
 ⋮                            ⋱
```
(Probably best to ignore the type signature 😅)
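Reading off the columns, this connection matrix says $T_0 = U_0$, $T_1 = \tfrac12 U_1$, and $T_n = \tfrac12(U_n - U_{n-2})$ for $n ≥ 2$, which follows from $2\cos nθ \sin θ = \sin((n+1)θ) - \sin((n-1)θ)$ and is easy to confirm numerically (plain Python, with $x = 0.4$ an arbitrary test point):

```python
import math

def T(n, x):
    return math.cos(n * math.acos(x))

def U(n, x):
    t = math.acos(x)
    return math.sin((n + 1) * t) / math.sin(t)

x = 0.4
assert math.isclose(T(0, x), U(0, x))
assert math.isclose(T(1, x), U(1, x) / 2)
for n in range(2, 10):
    assert math.isclose(T(n, x), (U(n, x) - U(n - 2, x)) / 2, abs_tol=1e-12)
print("T-to-U connection verified")
```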

0 commit comments