
Faulty eigenvalues from GenericLinearAlgebra when comparing to LinearAlgebra and other software #63


Closed
johanbluecreek opened this issue May 5, 2020 · 3 comments · Fixed by #64


johanbluecreek commented May 5, 2020

My expectation is that the eigenvalues produced by GenericLinearAlgebra and LinearAlgebra should agree up to some numerical error. I find that this is not the case for some matrices.

Consider the following session:

julia> m = [1 2 1 1 1 1; 0 1 0 1 0 1; 1 1 2 0 1 0; 0 1 0 2 1 1; 1 1 1 0 2 0; 0 1 1 1 0 2]
6×6 Array{Int64,2}:
 1  2  1  1  1  1
 0  1  0  1  0  1
 1  1  2  0  1  0
 0  1  0  2  1  1
 1  1  1  0  2  0
 0  1  1  1  0  2

julia> using LinearAlgebra

julia> eigvals(m)
6-element Array{Float64,1}:
 0.09678807408844635
 0.9999999894632876
 0.9999999999999999
 1.000000010536712
 2.1939365664746306
 4.709275359436919

julia> using GenericLinearAlgebra

julia> evals = eigvals(big.(m))
6-element Array{Complex{BigFloat},1}:
 0.09678807408844671251478377594290576622468594995566734039578341791786382578280872 - 0.0im
 0.526941082983575125659836351236052467530884090152227094914001065732795390546863 + 0.0im
 1.000000000000000000000000000000000000000000000000000000000000000000000000000104 - 0.0im
 1.473058917016424874340163648763947532469115909847772905085998934267204609453197 + 0.0im
 2.193936566474630448256084556933203300255287310678896004216260729027605120603558 + 0.0im
 4.709275359436922839229131667123890933520026739365436655387955853054531053613529 + 0.0im

julia> (evals[2] + evals[4])/2
1.000000000000000000000000000000000000000000000000000000000000000000000000000035 + 0.0im

We can see that two of the eigenvalues do not come out close to 1. Hence GenericLinearAlgebra appears to produce faulty eigenvalues.
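As a further sanity check (a sketch, not part of the original session; it assumes the m and evals bindings from above are still in scope), one can evaluate the characteristic polynomial det(m - λ*I) at each reported value. A genuine eigenvalue should give a determinant close to zero at the working precision, while a spurious value such as the one near 0.527 would give a residual far from zero:

using LinearAlgebra

# Residual of det(m - λ*I) at each value reported by GenericLinearAlgebra.
# Genuine eigenvalues give residuals near zero at the working precision;
# a large residual flags a spurious value.
residuals = [abs(det(big.(m) - λ*I)) for λ in evals]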

This behaviour also depends on the precision used for BigFloat:

julia> setprecision(64)
64

julia> eigvals(big.(m))
6-element Array{Complex{BigFloat},1}:
 0.0967880740884467127188 - 0.0im
  0.999999999999999999458 - 4.03274504367466385843e-10im
  0.999999999999999999458 + 4.03274504367466385843e-10im
   1.00000000000000000011 + 0.0im
   2.19393656647463044943 + 0.0im
    4.7092753594369228397 + 0.0im

julia> setprecision(121)
121

julia> eigvals(big.(m))
6-element Array{Complex{BigFloat},1}:
 0.096788074088446712514783775942905766097 - 0.0im
  0.57659174364529533508673210001597938362 + 0.0im
  0.99999999999999999999999999999999999812 + 0.0im
   1.4234082563547046649132678999840206168 + 0.0im
   2.1939365664746304482560845569332033081 + 0.0im
   4.7092753594369228392291316671238909251 + 0.0im

Different precisions give different results; 55 of the precisions in the range 64:256 appear to fail to produce the correct eigenvalues:

julia> n = 0
0

julia> for i in 64:256
           setprecision(i)
           if !all(0.1 .> abs.(eigvals(big.(m)) .- [0.096, 1, 1, 1, 2.19, 4.71]))
               global n += 1;
           end
       end

julia> n
55

and when there is a mismatch, the incorrect values are not always the same as those for setprecision(256) (the default), as seen in the setprecision(121) example above.
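A small variation of the loop above (a sketch reusing the same rough reference values, assuming m and the packages from the session are already loaded) records which precisions fail rather than only counting them, so the differing failure patterns can be inspected directly:

reference = [0.096, 1, 1, 1, 2.19, 4.71]  # same rough reference values as in the loop above
failing = Int[]
for i in 64:256
    setprecision(i)
    # Record the precision instead of just counting the failures.
    if !all(0.1 .> abs.(eigvals(big.(m)) .- reference))
        push!(failing, i)
    end
end
failing  # expected to contain 55 entries, per the count above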

So there appears to be an issue in the eigvals() method introduced by GenericLinearAlgebra.

We have also checked with Mathematica that this matrix has three eigenvalues equal to 1. The above was tested on Julia 1.4.0.
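For completeness, here is one way the multiplicity can also be confirmed in Julia without Mathematica (a sketch, not from the original report): compute the characteristic polynomial exactly over Rational{BigInt} with the Faddeev-LeVerrier recursion and count how many times (x - 1) divides it. The helper names charpoly_coeffs and multiplicity_at_one are made up for this example.

using LinearAlgebra

# Characteristic polynomial of A via the Faddeev-LeVerrier recursion,
# computed exactly over the rationals.
# Returns coefficients c[1]*x^n + c[2]*x^(n-1) + ... + c[n+1].
function charpoly_coeffs(A)
    n = size(A, 1)
    c = Vector{Rational{BigInt}}(undef, n + 1)
    c[1] = 1
    M = Matrix{Rational{BigInt}}(I, n, n)
    for k in 1:n
        AM = Rational{BigInt}.(A) * M
        c[k + 1] = -tr(AM) // k
        M = AM + c[k + 1] * I
    end
    return c
end

# Multiplicity of the root x = 1, found by repeatedly dividing by (x - 1)
# with exact synthetic division until the remainder is nonzero.
function multiplicity_at_one(c)
    mult = 0
    while length(c) > 1
        q = similar(c, length(c) - 1)
        q[1] = c[1]
        for i in 2:length(q)
            q[i] = c[i] + q[i - 1]
        end
        c[end] + q[end] == 0 || break  # nonzero remainder: stop
        mult += 1
        c = q
    end
    return mult
end

multiplicity_at_one(charpoly_coeffs(m))  # expected to return 3, matching the Mathematica result

Since both steps use exact rational arithmetic, the answer does not depend on any floating-point precision.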

This issue was brought to my attention by Severin Lüst.

@andreasnoack
Member

Thanks for the detailed bug report. I'll look into it.

@johanbluecreek
Author

Wow! Thank you for the very quick fix. I ran the lines in my report again after updating and it is fixed now. Thank you very much.

@andreasnoack
Member

You are welcome. The small reproducer you provided made it fairly simple to identify the bug and, thankfully, the fix was pretty simple.
