JacobAWilkins/Numerical-Methods
Numerical Methods

Run

```
python NumericalMethods.py
```

Solve Linear Systems of Equations

Finds the least-squares solution x of Ax = b using three different methods. A is an m x n matrix, x is an n x 1 column vector, and b is an m x 1 column vector. A and b come from the svd-data.csv dataset.

  1. Singular Value Decomposition (A = U.S.V^t), applicable regardless of A's shape or rank:
x = sum(i=1, r){ (u_i^t.b / s_i).v_i }, where r is the effective rank of A
  2. Normal equations without the inverse of A^t.A, solving
A^t.A.x = A^t.b
  3. Normal equations with the inverse of A^t.A:
x = (A^t.A)^-1.A^t.b

Computes the 2-norm of the residual vector, ||A.x - b||_2, for each method.
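The three methods and their residual norms can be sketched in NumPy as follows. The repository reads A and b from svd-data.csv; here a small random full-rank system stands in for that data, so the shapes and the matrix itself are illustrative assumptions, not the actual dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 7))   # stand-in for the svd-data.csv matrix (m x n)
x_true = np.ones(7)                # the x used to produce b
b = A @ x_true

# 1. SVD: x = sum_{i=1}^{r} (u_i^t.b / s_i).v_i over the effective rank r
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > s[0] * np.finfo(float).eps * max(A.shape)))  # effective rank
x_svd = sum((U[:, i] @ b / s[i]) * Vt[i] for i in range(r))

# 2. Normal equations, solved directly without forming an inverse
x_ne = np.linalg.solve(A.T @ A, A.T @ b)

# 3. Normal equations using the explicit inverse of A^t.A
x_inv = np.linalg.inv(A.T @ A) @ A.T @ b

# 2-norm of the residual vector for each method
for name, x in [("SVD", x_svd), ("solve", x_ne), ("inverse", x_inv)]:
    print(name, np.linalg.norm(A @ x - b))
```

On a well-conditioned stand-in like this, all three recover x to machine precision; the differences only become visible when A^t.A is ill-conditioned, as with the actual dataset.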

Each method results in a different value for x. The value of x used to produce b is

x = [1, 1, 1, 1, 1, 1, 1]^t

Conclusion

The condition number of A^t.A is very high, which means any operation that uses A^t.A is ill-conditioned and therefore inaccurate. This is why the solutions for x vary so much between methods.
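The underlying effect is that, in the 2-norm, cond(A^t.A) = cond(A)^2, so forming the normal equations squares the condition number. A minimal sketch of this, using a mildly ill-conditioned Vandermonde matrix as a stand-in rather than the svd-data.csv data:

```python
import numpy as np

# Stand-in matrix: Vandermonde matrices are known to be ill-conditioned
t = np.linspace(0.0, 1.0, 20)
A = np.vander(t, 7)

kA = np.linalg.cond(A)
kAtA = np.linalg.cond(A.T @ A)
print(kA, kAtA)  # cond(A^t.A) is approximately cond(A) squared
```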
