MTH 351 Spring 2019 Calendar

Note: Numbering below refers to Lectures. All sections below refer to sections from the class textbook [AH].
Homeworks with due Dates are in red.
Worksheets with due Dates are in orange.

  1. Lecture 1. 4/1: Intro to Class. Start Chapter 2: Algorithms and Errors: Computational error, propagated error, truncation error, rounding error. Sec 2.1 Floating point numbers. IEEE SP and DP, normalized numbers.
    (1) For an intro to the subject of numerical analysis read

    (2) Read Chapter 1: We will not cover Chapter 1 in class. However, Taylor polynomials will be used throughout this course, so you will need to review them. Try the following problems from Chapter 1.

    Turn in the starred problems for credit. DUE, MON 4/8 IN CLASS
    HW # 1: Sec 1.1 # 4,5,6,10, Sec 1.2 # 4*, 14*, 21*, Sec 1.3 # 5,6,11

    (3) Start reading Chapter 2. Look at

    (4) Lab # 1: Do the Matlab Tutorial that is posted on Canvas. Turn in a hardcopy of your solutions in class on Friday, April 12 (see Canvas page for details).


  2. Lecture 2. 4/3 Sec 2.1-2.2 Consequences of floating point numbers: overflow and underflow thresholds.
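    Not part of the assigned work: a minimal MATLAB sketch (the printed values are just illustrations) of the overflow/underflow thresholds and the machine epsilon topics from Lectures 2-3.

    ```matlab
    % Largest and smallest normalized double precision numbers (IEEE DP).
    fprintf('realmax = %e\n', realmax);      % approx 1.7977e+308
    fprintf('realmin = %e\n', realmin);      % approx 2.2251e-308

    % Overflow: exceeding realmax gives Inf.
    fprintf('2*realmax = %e\n', 2*realmax);  % Inf

    % Underflow: going far below realmin eventually gives 0 (past the subnormal range).
    fprintf('realmin/2^60 = %e\n', realmin/2^60);   % 0

    % Machine epsilon: gap between 1 and the next larger floating point number.
    fprintf('eps (double) = %e\n', eps);             % approx 2.2204e-16
    fprintf('eps (single) = %e\n', eps('single'));   % approx 1.1921e-07
    fprintf('1 + eps/2 == 1 is %d\n', 1 + eps/2 == 1);  % 1 (true): eps/2 rounds away
    ```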
  3. Lecture 3. 4/5 Review of syllabus. Sec 2.1-2.2 Consequences of floating point numbers: machine epsilon.
    Handed out Worksheet # 1; due on Monday.

    Turn in the starred problems for credit. DUE, MON 4/15 IN CLASS
    HW # 2:
    • Sec 2.1: 1(acd), 3, 4*, 7,
    • Sec 2.2: 1(acd), 5e*, 6b, 11,
    • Sec 2.3: 4,6,11*
    • Sec 2.4: 4*

  4. Lecture 4. 4/8 Sec 2.1-2.2 Consequences of floating point numbers: overflow, underflow, rounding, chopping, Loss of Significance. HW # 2 Due on Monday, April 15 in class.
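    Not assigned: a small MATLAB sketch of loss of significance (catastrophic cancellation); the function $(1-\cos x)/x^2$ and its rewrite are a standard illustration, not necessarily the example used in class.

    ```matlab
    % f(x) = (1 - cos(x)) / x^2 for small x.
    % Direct evaluation subtracts two nearly equal numbers (1 and cos x)
    % and loses most significant digits; the exact limit as x -> 0 is 1/2.
    x = 1e-8;

    direct = (1 - cos(x)) / x^2;            % suffers cancellation
    stable = 2 * (sin(x/2) / x)^2;          % algebraically equivalent, no cancellation

    fprintf('direct = %.16f\n', direct);    % far from 0.5
    fprintf('stable = %.16f\n', stable);    % approx 0.5 to full precision
    ```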
  5. Lecture 5. 4/10 Class Cancelled.
  6. Lecture 6. 4/12 Sec 2.2 Loss of Significance, Sec 2.3 Bounds on propagated error, Sec 2.4 Summation. How do you add n floating point numbers? What is the error in the result, and what is the best order in which to add? (See the MATLAB sketch below.)
    Do Worksheet # 2 for a simple example of FP addition; due on Monday, April 15 in class.
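    Not assigned: a rough MATLAB sketch of the Sec 2.4 summation question from Lectures 6-7, done in single precision so the effect of ordering is visible; the series $\sum 1/k^2$ is my own choice of example.

    ```matlab
    % Sum 1/k^2 for k = 1..n in single precision, in two different orders.
    % Adding smallest-to-largest tends to lose less to rounding than
    % largest-to-smallest, because small terms are not swamped by a large
    % running sum.
    n = 1e6;
    k = single(1:n);
    terms = 1 ./ (k .* k);

    forward  = single(0);                % largest terms first (k = 1, 2, ...)
    backward = single(0);                % smallest terms first (k = n, n-1, ...)
    for i = 1:n
        forward = forward + terms(i);
    end
    for i = n:-1:1
        backward = backward + terms(i);
    end

    exact = pi^2/6 - 1/n;                % good approximation of the partial sum
    fprintf('forward  sum error = %e\n', abs(double(forward)  - exact));
    fprintf('backward sum error = %e\n', abs(double(backward) - exact));
    ```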
  7. Lecture 7. 4/15 Sec 2.4 Computational error: Summation. Adding n floating point numbers using rounding or chopping. What is the error in the result, and what is the best order in which to add? See the slides, and in particular look at the example on page 5. Read Section 3.1 for Wednesday's class. Handed back HW # 1 and WS # 1.
    Lab # 2: Do the Matlab Tutorial that is posted on Canvas. You may turn this lab in online on Canvas as a single pdf, or turn in a hardcopy of your solutions in class on Monday, April 22 (see Canvas page for details).
  8. Lecture 8. 4/17 Sec 3.1 Rootfinding. Bisection, Newton, Secant. Started on Bisection, assumptions, algorithm, implementation.

    HW # 3: Due on Monday, April 29.
  9. Lecture 9. 4/19 Sec 3.1 Bisection, analysis of the number of iterations. Convergence of the sequence of approximations (midpoints) to the root as n approaches infinity. Formula for the number of iterations needed for the algorithm to stop with an approximation that is within tolerance. (A short bisection sketch follows the worksheet note below.)
    Worksheet # 3 due on Wednesday, April 24 in class.
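    Not part of HW # 3: a minimal MATLAB bisection sketch in the spirit of Lectures 8-9; the function, interval, and tolerance are sample choices. The iteration count uses the standard bound that after $n$ halvings the midpoint is within $(b-a)/2^{n+1}$ of the root.

    ```matlab
    % Bisection for f(x) = x^3 - 2 on [1, 2] (sample problem; root is 2^(1/3)).
    % Assumes f is continuous and f(a)*f(b) < 0, as in Sec 3.1.
    f   = @(x) x.^3 - 2;
    a   = 1;  b = 2;
    tol = 1e-8;

    % Number of halvings needed so that the midpoint error (b-a)/2^(n+1) <= tol.
    nmax = ceil(log2((b - a)/tol) - 1);
    fprintf('predicted iterations: %d\n', nmax);

    for n = 1:nmax
        c = (a + b)/2;                 % midpoint = current approximation
        if sign(f(c)) == sign(f(a))    % root is in [c, b]
            a = c;
        else                           % root is in [a, c] (or f(c) == 0)
            b = c;
        end
    end
    c = (a + b)/2;
    fprintf('approx root = %.10f, error = %.2e\n', c, abs(c - 2^(1/3)));
    ```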
  10. Lecture 10. 4/22 Sec 3.1 Bisection, Review. Sec 3.2 Newton's method for rootfinding, Newton iteration formula. Error analysis.
  11. Lecture 11. 4/24 Sec 3.2 Newton's method for rootfinding error analysis, quadratic convergence, stopping criteria.
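    Not assigned: a short MATLAB sketch of Newton's iteration $x_{k+1} = x_k - f(x_k)/f'(x_k)$ from Lectures 10-11, with a stopping test on successive iterates; the test function and tolerance are my own choices.

    ```matlab
    % Newton's method for f(x) = x^3 - 2 (sample problem; root is 2^(1/3)).
    f  = @(x) x.^3 - 2;
    fp = @(x) 3*x.^2;              % f'(x)

    x   = 1.5;                     % initial guess
    tol = 1e-12;

    for k = 1:20
        xnew = x - f(x)/fp(x);     % Newton step
        fprintf('k = %2d   x = %.15f   |f(x)| = %.2e\n', k, xnew, abs(f(xnew)));
        if abs(xnew - x) < tol     % stop when successive iterates agree
            break
        end
        x = xnew;
    end
    ```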
  12. Lecture 12. 4/26 Sec 3.3 Newton's method wrap-up, Secant method, error analysis.
    Worksheet # 4 due on Wednesday, May 1 in class.
  13. Lecture 13. 4/29 Sec 3.1-3.3 Newton's method example. Comparison of Bisection, Newton, Secant.
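    Not assigned: a matching MATLAB sketch of the secant method (Lecture 12) on the same sample problem as the Newton sketch above, for the Lecture 13 comparison.

    ```matlab
    % Secant method for f(x) = x^3 - 2 (sample problem; root is 2^(1/3)).
    % Replaces f'(x_k) in Newton's formula by the difference quotient
    % (f(x_k) - f(x_{k-1})) / (x_k - x_{k-1}), so no derivative is needed.
    f  = @(x) x.^3 - 2;
    x0 = 1;  x1 = 2;               % two initial guesses
    tol = 1e-12;

    for k = 1:30
        x2 = x1 - f(x1)*(x1 - x0)/(f(x1) - f(x0));   % secant step
        fprintf('k = %2d   x = %.15f   |f(x)| = %.2e\n', k, x2, abs(f(x2)));
        if abs(x2 - x1) < tol
            break
        end
        x0 = x1;  x1 = x2;
    end
    ```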
  14. Lecture 14. 5/1 Sec 3.5 Multiple roots and failure of quadratic convergence of Newton. Fixing Newton in this case. Stability of roots. Definition of ill-conditioned problems. Rootfinding as an ill-conditioned problem.
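    Not assigned: a minimal MATLAB illustration of Lecture 14; the sample function $f(x) = (x-1)^2(x+1)$ has a double root at $x=1$, plain Newton converges only linearly there, and the standard modification $x_{k+1} = x_k - m\,f(x_k)/f'(x_k)$ with multiplicity $m=2$ restores fast convergence.

    ```matlab
    % f(x) = (x-1)^2 (x+1) has a double root at x = 1, so f'(1) = 0 as well.
    f  = @(x) (x - 1).^2 .* (x + 1);
    fp = @(x) (x - 1) .* (3*x + 1);       % f'(x)
    m  = 2;                               % multiplicity of the root x = 1

    % Plain Newton: near the double root the error is only roughly halved
    % each step (linear instead of quadratic convergence).
    x = 2;
    fprintf('Plain Newton:\n');
    for k = 1:8
        x = x - f(x)/fp(x);
        fprintf('  k = %d   error = %.3e\n', k, abs(x - 1));
    end

    % Modified Newton x_{k+1} = x_k - m*f(x_k)/f'(x_k): fast convergence returns.
    x = 2;
    fprintf('Modified Newton (m = %d):\n', m);
    for k = 1:8
        if f(x) == 0 || fp(x) == 0, break, end   % stop if we hit the root exactly
        x = x - m*f(x)/fp(x);
        fprintf('  k = %d   error = %.3e\n', k, abs(x - 1));
    end
    ```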
  15. Lecture 15. 5/3 MIDTERM REVIEW
    HW # 4: DUE Wed, May 15. Please turn in solutions to the starred problems.
    • Sec 3.3: 7*, 8*
    • Sec 3.4: 6, 7, 8*, 9*, 13, 14, 16
    • Sec 3.5: 3, 6*, 7, 8*


    Lab # 3: See Canvas for details. DUE, Friday, May 17.
  16. Lecture 16. 5/6 MIDTERM IN CLASS
  17. Lecture 17. 5/8 Sec 3.5 Stability of roots. Start reading Chapter 4.
  18. Lecture 18. 5/10 Sec 3.4 Fixed Point Iteration. Start reading Chapter 4.
  19. Lecture 19. 5/13 Sec 3.4 Fixed Point Iteration. Contraction mapping theorem. Newton's method as a fixed point iteration.
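    Not assigned: a small MATLAB sketch of fixed point iteration $x_{k+1} = g(x_k)$ for $g(x) = \cos x$, a standard contraction on $[0,1]$ (my choice of example, not necessarily the one from class). Newton's method is itself a fixed point iteration with $g(x) = x - f(x)/f'(x)$.

    ```matlab
    % Fixed point iteration x_{k+1} = g(x_k) for g(x) = cos(x).
    % g maps [0, 1] into itself and |g'(x)| = |sin(x)| < 1 there, so by the
    % contraction mapping theorem the iteration converges to the unique
    % fixed point x* = cos(x*) (approx 0.739085).
    g   = @(x) cos(x);
    x   = 1;                              % starting guess in [0, 1]
    tol = 1e-10;

    for k = 1:100
        xnew = g(x);
        fprintf('k = %3d   x = %.12f   |x_{k+1} - x_k| = %.2e\n', ...
                k, xnew, abs(xnew - x));
        if abs(xnew - x) < tol
            break
        end
        x = xnew;
    end
    ```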
  20. Lecture 20. 5/15 Sec 4.1 Interpolation and approximation. Polynomial interpolation, Lagrange's method. Lagrange basis functions.
    HW # 5: Turn in neatly handwritten solutions or typed solutions on Canvas by Friday, May 24.
    • Sec 4.1: 8a*b, 11a*b, 12a*b, 14, 22, 28, 32
    • Sec 4.2: 14*, 18
    • Sec 4.3: 1, 10*, 11


    Lab # 4: See Canvas for details. DUE, Wednesday, May 29.
  21. Lecture 21. 5/17 Sec 4.1 Interpolation and approximation. Polynomial interpolation, Lagrange's method. Lagrange basis functions, examples for $n=1,2$, Kronecker delta property of the basis functions.
    Worksheet # 5 due on Wednesday, May 22 in class or upload as pdf on Canvas. See Canvas for details.
  22. Lecture 22. 5/20 Sec 4.2 Error in Interpolation
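    Not assigned: a short MATLAB sketch that builds the Lagrange-form interpolant from Sec 4.1 and measures the actual error, to be compared with the Sec 4.2 error formula; the nodes and test function are my own choices.

    ```matlab
    % Interpolate f(x) = exp(x) at n+1 = 4 nodes on [0, 1] using the Lagrange form
    %   p(x) = sum_i f(x_i) * L_i(x),  L_i(x) = prod_{j ~= i} (x - x_j)/(x_i - x_j).
    f     = @(x) exp(x);
    nodes = linspace(0, 1, 4);            % x_0, ..., x_3 (equally spaced)
    fvals = f(nodes);

    % Evaluate the interpolant on a fine grid by summing the Lagrange basis terms.
    xx = linspace(0, 1, 201);
    p  = zeros(size(xx));
    for i = 1:numel(nodes)
        Li = ones(size(xx));              % L_i(xx), built factor by factor
        for j = 1:numel(nodes)
            if j ~= i
                Li = Li .* (xx - nodes(j)) / (nodes(i) - nodes(j));
            end
        end
        p = p + fvals(i) * Li;
    end

    % The Sec 4.2 error formula bounds |f - p| by max|f''''| / 4! * max|w(x)|,
    % where w(x) = (x - x_0)(x - x_1)(x - x_2)(x - x_3).
    fprintf('max interpolation error on [0,1] = %.3e\n', max(abs(f(xx) - p)));
    ```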
  23. Lecture 23. 5/22 Sec 4.3 Cubic Splines. Start reading Chapter 5, Sections 5.1-5.4.
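    Not assigned: a quick MATLAB sketch of cubic spline interpolation (Sec 4.3) using the built-in spline and ppval; note that MATLAB's default end condition is not-a-knot, which may differ from the boundary conditions emphasized in [AH].

    ```matlab
    % Cubic spline interpolation of f(x) = 1/(1 + 25 x^2) (Runge's function) on [-1, 1].
    f     = @(x) 1 ./ (1 + 25*x.^2);
    nodes = linspace(-1, 1, 11);          % 11 equally spaced knots
    pp    = spline(nodes, f(nodes));      % piecewise cubic (not-a-knot end conditions)

    xx  = linspace(-1, 1, 401);
    err = max(abs(f(xx) - ppval(pp, xx)));
    fprintf('max spline error with 11 knots = %.3e\n', err);

    % Unlike a single degree-10 interpolating polynomial, the spline does not
    % oscillate wildly near the endpoints for this function.
    ```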
  24. Lecture 24. 5/24 Sec 4.7, 7.1: Least Squares data fitting. Standard basis, Chebyshev and Legendre basis. Next two weeks: Chapter 5, Sections 5.1-5.4.
    HW # 6: Turn in neatly handwritten solutions or typed solutions in class or upload a pdf on Canvas by Friday, June 7.

    WS # 6: See Canvas for the worksheet. Complete and turn it in in class or upload a pdf on Canvas by Friday, June 7.
  25. Lecture 25. 5/27 Memorial Day holiday.
  26. Lecture 26. 5/29 Sec 4.7: Least Squares continuous fitting; examples. (A short least squares sketch follows the notes below.)
    WS # 7: Error Estimation and Extrapolation. See Canvas for the worksheet. Complete and turn it in in class or upload a pdf on Canvas by Wednesday, June 5.

    Lab # 5: See Canvas for details. DUE, Friday, June 7.
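    Not assigned: a minimal MATLAB sketch of least squares data fitting (Sec 4.7/7.1) by the normal equations and by the built-in polyfit; the data are synthetic.

    ```matlab
    % Fit a straight line y = c1 + c2*x to noisy data in the least squares sense.
    rng(0);                                % reproducible synthetic data
    x = linspace(0, 1, 20)';
    y = 2 + 3*x + 0.05*randn(size(x));     % "true" line plus noise

    % Normal equations: minimize ||A*c - y||_2 with A = [1  x].
    A = [ones(size(x)), x];
    c = (A'*A) \ (A'*y);                   % c(1) = intercept, c(2) = slope
    fprintf('normal equations:  intercept = %.4f, slope = %.4f\n', c(1), c(2));

    % Same fit with the built-in polyfit (coefficients in descending powers).
    p = polyfit(x, y, 1);
    fprintf('polyfit:           intercept = %.4f, slope = %.4f\n', p(2), p(1));
    ```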
  27. Lecture 27. 5/31 Sec 5.1, 5.2 Numerical integration (quadrature) Trapezoidal rule, error in trapezoidal.
  28. Lecture 28. 6/3 Sec 5.1, 5.2 Simpson's rule and error in Simpson. Asymptotic error. Order of convergence.
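    Not assigned: a small MATLAB sketch comparing the composite trapezoidal and Simpson rules (Sec 5.1-5.2) on a sample integral; halving $h$ should cut the trapezoidal error by roughly 4 and the Simpson error by roughly 16, consistent with the $O(h^2)$ and $O(h^4)$ error terms.

    ```matlab
    % Approximate I = integral of exp(x) on [0, 1]; exact value is e - 1.
    f = @(x) exp(x);
    a = 0;  b = 1;
    exact = exp(1) - 1;

    for n = [8 16 32]                      % n subintervals (even, for Simpson)
        h = (b - a)/n;
        x = a + (0:n)*h;
        y = f(x);

        % Composite trapezoidal rule: h*( y_0/2 + y_1 + ... + y_{n-1} + y_n/2 ).
        T = h * (sum(y) - (y(1) + y(end))/2);

        % Composite Simpson rule:
        % (h/3)*( y_0 + 4*(y_1 + y_3 + ...) + 2*(y_2 + y_4 + ...) + y_n ).
        S = (h/3) * (y(1) + y(end) + 4*sum(y(2:2:end-1)) + 2*sum(y(3:2:end-2)));

        fprintf('n = %2d   trap error = %.3e   Simpson error = %.3e\n', ...
                n, abs(T - exact), abs(S - exact));
    end
    ```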
  29. Lecture 29. 6/5 Sec 5.1, 5.2 examples. Sec 5.3 Degree of precision of an integration rule, Gaussian integration, N = 1 (the midpoint rule).
  30. Lecture 30. 6/7 Sec 5.3 Gaussian integration, N = 2. Review for Final.
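    Not assigned: a minimal MATLAB check of the two-point Gauss-Legendre rule from Lecture 30: on $[-1,1]$ it uses nodes $\pm 1/\sqrt{3}$ with weights 1 and has degree of precision 3.

    ```matlab
    % Two-point Gauss-Legendre rule on [-1, 1]: nodes +-1/sqrt(3), weights 1.
    gauss2 = @(f) f(-1/sqrt(3)) + f(1/sqrt(3));

    % Degree of precision 3: exact for 1, x, x^2, x^3 ...
    fprintf('x^2: gauss2 = %.15f, exact = %.15f\n', gauss2(@(x) x.^2), 2/3);
    fprintf('x^3: gauss2 = %.15f, exact = %.15f\n', gauss2(@(x) x.^3), 0);
    % ... but not for x^4.
    fprintf('x^4: gauss2 = %.15f, exact = %.15f\n', gauss2(@(x) x.^4), 2/5);

    % A non-polynomial example: integral of exp(x) on [-1, 1] is e - 1/e.
    fprintf('exp: gauss2 = %.15f, exact = %.15f\n', gauss2(@exp), exp(1) - exp(-1));
    ```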