MATH 304-4 Fall 2019 SYLLABUS for Prof. Feingold's Section 4

Contact Information

My office is Room 115 in Whitney Hall. My office phone number is 777-2465. My office hours are MWF 12:00 - 1:00 and by appointment.


Textbook

"Linear Algebra and Its Applications, 5th Edition" by David C. Lay, Steven R. Lay, and Judi J. McDonald, Pearson Publishers, e-book with MyLab Math.

Here is a link to the Pearson webpage for this textbook, including information about MyLab Math homework system: Lay-Lay-McDonald webpage.

The following pdf file has instructions for students to register in MyLab for Feingold's Section 4: MyLab registration instructions for Feingold's Section 4.

The entire book will be covered, time permitting. A list of the major topics which may be covered is given at the end of this page.

Exams, Quizzes and Point Values

Each of the six sections of this course is run separately by its instructor; only the Final Exam will be common to all sections. This page contains details relevant only to my Section 4, but some general advice probably applies to everyone. Quizzes may be administered in class or through the MyLab online system and would count for 50 points. Three 90-minute exams will be administered on announced dates during a normal class time, and there will be one common Final Exam (2 hours long) during the scheduled Finals period. Each 90-minute exam will be worth 100 points, and the Final Exam will be worth 150 points. The material covered by each exam will be determined and announced approximately one week before the exam. The Final Exam will be comprehensive, covering the whole course. ANYONE UNABLE TO TAKE AN EXAM SHOULD CONTACT THE INSTRUCTOR AHEAD OF TIME TO EXPLAIN THE REASON. A MESSAGE CAN BE LEFT AT THE MATH DEPT OFFICE (777-2147) OR ON PROFESSOR FEINGOLD'S VOICEMAIL (777-2465). NO ONE SHOULD MISS THE FINAL!

Information about the scheduling of exams is posted on the main Math 304 page.

After each Exam is graded and returned, solutions will be posted here, along with a letter grade interpretation of the numerical score.

The Section 4 Exam 1 questions and solutions can be downloaded from the following link: Exam 1 and Solutions for Feingold's Section 4. The average on Exam 1 in Section 4 was 58. The letter grade interpretation of the numerical score is: A: 90 - 100, A-: 85 - 89, B+: 80 - 84, B: 74 - 79, B-: 68 - 73, C+: 60 - 67, C: 52 - 59, C-: 46 - 51, D: 38 - 45, F: 0 - 37.

The Section 4 Exam 2 questions and solutions can be downloaded from the following link: Exam 2 and Solutions for Feingold's Section 4. The average on Exam 2 in Section 4 was 47. If the average is computed only on scores above 30, then the average was 55.75. The letter grade interpretation of the numerical score is: A: 90 - 120, A-: 82 - 89, B+: 75 - 81, B: 68 - 74, B-: 62 - 67, C+: 55 - 61, C: 45 - 54, C-: 38 - 44, D: 30 - 37, F: 0 - 29.

I am providing a practice exam 3 for students in my Section 4 through the following link: Practice Exam 3 and Solutions for Feingold's Section 4.

The Section 4 Exam 3 questions and solutions can be downloaded from the following link: Exam 3 and Solutions for Feingold's Section 4. The average on Exam 3 in Section 4 was 51.28. The letter grade interpretation of the numerical score is: A: 90 - 100, A-: 85 - 89, B+: 78 - 84, B: 70 - 77, B-: 64 - 69, C+: 56 - 63, C: 48 - 55, C-: 40 - 47, D: 32 - 39, F: 0 - 31.

A graphic representation of the Final Exam scores from all sections can be seen in the following pdf file: Final Exam scores from all sections. It shows that the average final exam score over all sections was 81.81 and the letter grade interpretation of the numerical score was: A: 130 - 150, A-: 120 - 129, B+: 110 - 119, B: 100 - 109, B-: 90 - 99, C+: 80 - 89, C: 70 - 79, C-: 60 - 69, D: 50 - 59, F: 0 - 49.

A graphic representation of the Total Scores (0-500) from my section 4 can be seen in the following pdf file: Totals Curve for Section 4. It shows that the average total point score for section 4 was 275 and the letter grade interpretation of the numerical score was: A: 450 - 500, A-: 420 - 449, B+: 400 - 419, B: 370 - 399, B-: 340 - 369, C+: 310 - 339, C: 275 - 309, C-: 250 - 274, D: 200 - 249, F: 0 - 199.

Any student with a special problem or a finals conflict must contact the instructor (me) as soon as possible to make arrangements.

There are links to practice problems of various kinds on Prof. Mazur's Math 304 webpage.

Determination of Letter Grades

The numerical score on each exam will be given a letter grade interpretation, so each student receives a letter grade as well as a number grade, and the Total of all points earned will also be given a letter grade interpretation. The letter grades on the exams indicate how a student is doing and will be taken into consideration in making the interpretation for the Totals. The course grade will be determined by the interpretation of Total points earned. Only borderline cases may be subject to further adjustment based on homework, as determined by the instructor. Any cases of cheating will be subject to investigation by the Academic Honesty Committee of Harpur College.

General Comments

Class attendance is required at all scheduled meetings, and sleeping in class does not count as being there. Questions are welcomed at any time during a lecture. At the start of each class be ready to ask questions about homework problems or about the previous lecture. We want to create an atmosphere where you all feel very free to ask questions and make comments. If anyone feels that an instructor has not answered a question clearly, completely, and with respect and consideration for the student who asked it, please let your instructor know about it immediately so he/she can correct the problem. You can do this in class or in office hours, verbally or in writing, on paper or by email, or by whatever means makes you most comfortable, but with enough detail that your instructor can understand what you think was done wrong. It will be too late to help if you only complain at the end of the course. If you are not satisfied by the response of your instructor, please contact the course coordinator, Prof. Alex Feingold.

The material is a combination of theory and calculation, and it is necessary to understand the theory in order to do sensible calculations and interpret them correctly. There is a significant difference between training and education, and we feel strongly that our goal at this university is to educate you, not just to train you to do computations. Theory is not presented to impress you with our knowledge of the subject, but to give you the depth of understanding expected of an adult with a university education in this subject. Some of your instructors have many years of experience teaching mathematics at the university level, but it will require your consistent, concentrated study to master this material. While much learning can take place in the classroom, a significant part of it must be done by you outside of class. Using the book, class notes, and homework exercises, only you can achieve success in this course. Students who do not take this course seriously, or who ignore this advice, are not likely to be rewarded at the end. We are here to help and guide you, and we also grade the exams to judge how much you have learned, but grades are earned by you, not given by us. Exams will be a combination of theory questions and calculations appropriate for a course of this level.


For each section of material covered there will be an assignment of problems from the textbook. Homework will be handled through the online system MyLab Math, and information about how to access it has been provided above and on the main Math 304 webpage. The number of homeworks attempted may be considered as a factor in determining your course grade if you are a borderline case in the letter grade interpretation of the Total score.

Course Contents

  • Systems of Linear Equations
  • Solution by row reduction
  • Matrices and operations with them
  • Reduced Row Echelon Form
  • Sets of Matrices of size mxn
  • Functions (general theory), injective, surjective, bijective, invertible, composition
  • Linear Functions determined by a matrix
  • Abstract Vector spaces
  • Basic theorems, examples, subspaces, linear combinations, span of a set of vectors
  • Linear functions L : V ---> W between vector spaces
  • Kernel and Range of a linear function
  • Connection with injective, surjective, bijective, invertible, composition of linear functions, isomorphism
  • Matrix multiplication from composition, formulas, properties (associativity)
  • Row (and column) operations achieved by left (right) matrix multiplication by elementary matrices
  • Theorems about invertibility of a square matrix (if row reduces to the identity matrix), algorithm to compute inverse
  • Linear independence/dependence, removing redundant vectors from a list keeping span the same
  • Basis (independent spanning set), Theorems about basis
  • Dimension of a vector space (or subspace), rank, nullity, theorems about dimension
  • For L : V ---> W, dim(V) = dim(Ker(L)) + dim(Range(L)) and its applications
  • Coordinates as an isomorphism from V to nx1 matrices
  • Representing a linear function L : V ---> W by a matrix (with respect to choice of basis S of V and basis T of W)
  • Theorems and algorithms, how matrix representing L changes when bases change to S' and T'
  • Equivalence of matrices, Block Identity Form
  • Study special case of L : V ---> V using same basis S on both ends, linear operators
  • Effect of change of basis on matrix representing a linear operator, similarity of matrices
  • Investigate when L might be represented by a diagonal matrix
  • Eigenvectors, Eigenvalues
  • Determinants as a tool for finding eigenvalues, general theorems and properties about determinants
  • det(AB) = det(A) det(B), A invertible iff det(A) not zero
  • Characteristic polynomial of a matrix, det(A - t I), roots are eigenvalues
  • Similar matrices have same characteristic polynomial
  • Geometric and algebraic multiplicities of eigenvalues for L : V ---> V (or for matrix A representing L)
  • Theorems (geometric mult less than or equal to algebraic mult), L diagonalizable iff geom mult = alg mult for all eigenvalues
  • Computational techniques to find a basis of eigenvectors, diagonalization of matrix A, if possible
  • Geometry in Linear Algebra: dot product, angles and lengths, orthogonality, orthonormal sets, orthogonal projections
  • Orthogonal matrices

    Extra topics, if time allows (usually there is not enough time for these in the elementary linear algebra course):

  • Quadratic forms and associated bilinear forms on a vector space V
  • Matrix representing a bilinear form with respect to a choice of basis S
  • Effect of change of basis on matrix representing a bilinear form
  • Classification of quadratic forms

    Topics which may be covered on each exam

    These paragraphs will be updated before each exam is given.

    Topics which may be covered on Exam 1

    Here is a list of topics covered in lectures which may be covered on Exam 1.

    Linear Systems, solving by row reduction of the augmented matrix [A|B] and interpretation in terms of free and dependent variables.

    Consistent vs. inconsistent systems. Homogeneous systems AX=O.

    Elementary row operations and reduction to Reduced Row Echelon Form (RREF).
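The row-reduction procedure described above can be sketched in code. This is a minimal illustration only (the function name `rref` and the example system are mine, not from the textbook or MyLab):

```python
import numpy as np

def rref(A, tol=1e-12):
    """Reduce a matrix to Reduced Row Echelon Form by elementary row
    operations: swap, scale, and add a multiple of one row to another.
    (Illustrative sketch; not the textbook's exact presentation.)"""
    M = np.array(A, dtype=float)
    rows, cols = M.shape
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Choose the row with the largest entry in this column (partial pivoting)
        pivot = max(range(pivot_row, rows), key=lambda r: abs(M[r, col]))
        if abs(M[pivot, col]) < tol:
            continue  # no pivot in this column: a free variable
        M[[pivot_row, pivot]] = M[[pivot, pivot_row]]  # swap rows
        M[pivot_row] /= M[pivot_row, col]              # scale pivot to 1
        for r in range(rows):                          # clear the rest of the column
            if r != pivot_row:
                M[r] -= M[r, col] * M[pivot_row]
        pivot_row += 1
    return M

# Augmented matrix [A|B] for the system x + 2y = 5, 3x + 4y = 11
aug = rref([[1, 2, 5], [3, 4, 11]])
```

Applied to this augmented matrix, the result is [[1, 0, 1], [0, 1, 2]], i.e. x = 1 and y = 2, with no free variables.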

    Matrices, the set of all mxn real matrices, Rmn, addition of matrices, multiplication of a matrix by a real number (scalar).

    Matrix shapes, names of special patterns.

    Rank of a matrix.

    How an mxn matrix A determines a function LA: Rn --> Rm.

    Properties of general functions: one-to-one (injective), onto (surjective), both (bijective), invertible.

    Connection between properties of matrix A and function LA.

    Abstract definition of a real vector space, V. Examples, Rmn is a vector space. For any set S, the set F = {f : S ---> R} of all functions from S to the reals R, is a vector space.

    Definition of a linear transformation L : V ---> W from a vector space to a vector space. Ker(L), Range(L) = Im(L).

    Basic facts about vector spaces and about linear transformations, and examples.

    Definition of matrix multiplication AB by requiring that LA composed with LB equals LAB. Lemma that LA = LB iff A = B.
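A quick numerical illustration of this definition (the matrices here are my own examples): composing LA and LB agrees with applying LAB directly.

```python
import numpy as np

# Arbitrary example matrices and a test vector (my own choices).
A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 1]])
X = np.array([[5], [7]])

lhs = A @ (B @ X)   # L_A composed with L_B, applied to X
rhs = (A @ B) @ X   # L_{AB} applied to X
# The two agree for every X, which is exactly how AB is defined.
```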

    Definition of the standard basis vectors e1, ... , en in Rn and the lemma that Aej = Colj(A), so AX is the sum of the xj Colj(A).

    Definition of positive powers of a square matrix A, and positive powers of L: V ---> V. Definition of transpose of a matrix.

    Definition and some examples of subspaces.

    Topics which may be covered on Exam 2

    Topics that have been covered since Exam 1 and which may appear on Exam 2 are listed below.

    Definition of when a square matrix is invertible, uniqueness of the inverse when it exists, and an algorithm to decide and find it by row reduction of [A | In].
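The algorithm above can be sketched in code (the helper name and example matrix are mine; this is an illustration of the idea, not the graded hand method):

```python
import numpy as np

def inverse_by_row_reduction(A, tol=1e-12):
    """Row reduce [A | I]; if the left block becomes I, the right block
    is the inverse of A. Returns None when A is not invertible."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])          # the augmented matrix [A | I_n]
    for col in range(n):
        # Pick a usable pivot row for this column
        pivot = max(range(col, n), key=lambda r: abs(M[r, col]))
        if abs(M[pivot, col]) < tol:
            return None                     # A does not row reduce to I
        M[[col, pivot]] = M[[pivot, col]]   # swap
        M[col] /= M[col, col]               # scale pivot to 1
        for r in range(n):                  # clear the column
            if r != col:
                M[r] -= M[r, col] * M[col]
    return M[:, n:]                         # right block is A^{-1}

Ainv = inverse_by_row_reduction([[2, 1], [5, 3]])
```

For this example the returned inverse is [[3, -1], [-5, 2]], which can be checked by multiplying back to get the identity.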

    Elementary matrices and how they can be used to achieve elementary row or column operations.

    The rules of matrix algebra.

    The span of a set of vectors S in a vector space V, and why it forms a subspace of V.

    How to check that a subset W in V is a subspace of V.

    Linear independence or dependence of a subset of V: definition and a method for determining it.

    Theorems and examples about spanning and independence, connection with rank of a matrix.

    Definition of a basis for a vector space, and how to decide if a subset is a basis of V.

    Finding a basis for important examples of subspaces, Ker(L), Range(L), where L:V---> W is a linear map.

    Use of a basis S of V to give coordinates with respect to S for each vector v in V. How that coordinate function, [v]S, is a linear map from V to Rn when a basis S for V consists of n vectors.

    Transition matrices which give the relationship between the coordinates of a vector v with respect to different bases. If S and T are two bases of the same vector space, V, then the transition matrix from S to T is the square invertible matrix TPS such that [v]T = TPS [v]S.
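A small numerical illustration of a transition matrix in R2, with bases of my own choosing (the columns of S and T below hold the basis vectors written in standard coordinates):

```python
import numpy as np

S = np.array([[1, 1], [0, 1]], dtype=float)  # basis S = {(1,0), (1,1)}
T = np.array([[2, 0], [0, 3]], dtype=float)  # basis T = {(2,0), (0,3)}

# The transition matrix from S to T satisfies [v]_T = P [v]_S.
# In standard coordinates v = S [v]_S = T [v]_T, so P = T^{-1} S.
P_TS = np.linalg.inv(T) @ S

v_S = np.array([2.0, 5.0])  # coordinates of some v with respect to S
v = S @ v_S                 # the same v in standard coordinates: (7, 5)
v_T = P_TS @ v_S            # coordinates of the same v with respect to T
```

Reassembling v from its T-coordinates (T @ v_T) recovers the same standard-coordinate vector, which is the defining property of the transition matrix.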

    Dimension of V as the number of vectors in any basis for V.

    The standard basis for several examples of vector spaces, including all the Rmn examples.

    Row-space and Column-space of a matrix, and their dimension related to the rank of the matrix.

    Information about the linear transformation LA: Rn--> Rm associated with rank(A).

    The relationship between the dimensions of Ker(L), Range(L) and V for L:V---> W.

    Extending an independent set to a basis, cutting down a spanning set to a basis.

    How to represent a general linear map L:V---> W with respect to a choice of basis S in V and basis T in W by a matrix, that is,

    using coordinates with respect to S, [ . ]S, and coordinates with respect to T, [ . ]T, to find a matrix T[L]S, such that T[L]S [v]S = [L(v)]T.

    The algorithm for finding that matrix by a row reduction of [T | L(S)].

    If S and S' are two bases of V, and T and T' are two bases of W, and L:V---> W then there is a relationship between T[L]S, the matrix representing L from S to T, and T'[L]S', the matrix representing L from S' to T'.

    That relationship is T'[L]S' = T'QT T[L]S SPS' where SPS' is the transition matrix from S' to S, and T'QT is the transition matrix from T to T'.

    The concept of isomorphism (bijective linear map) and its properties.

    Material on determinants, their definition using permutations or by cofactor expansion, their properties, and methods of calculating them (definition by permutations or by cofactor expansions, crosshatching method for matrices of size n = 2 or n = 3 ONLY, using row operations).

    The use of determinant to get the characteristic polynomial, det(A - tIn), whose roots give the eigenvalues of A, and whose expression as a product of powers of distinct linear factors gives the algebraic multiplicities.
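A quick numerical check on a 2x2 example of my own: for a 2x2 matrix, det(A - tI) = t^2 - trace(A) t + det(A), and its roots match the eigenvalues computed directly.

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])  # example matrix (my choice)

# Characteristic polynomial coefficients for the 2x2 case:
# det(A - tI) = t^2 - trace(A) t + det(A)
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]

roots = np.sort(np.roots(coeffs))      # roots of the characteristic polynomial
eigs = np.sort(np.linalg.eigvals(A))   # eigenvalues computed directly
```

Here both computations give the eigenvalues 2 and 5.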

    Eigenspaces, their properties, and how to decide if a matrix can be diagonalized or not. Theorems about eigenspaces and diagonalizability.

    Independence of the union of bases for distinct eigenspaces.

    Geometric multiplicity and its relationship to algebraic multiplicity for each eigenvalue.
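A small example (matrix chosen by me) computing geometric multiplicity as dim Ker(A - lambda I) = n - rank(A - lambda I):

```python
import numpy as np

# A has characteristic polynomial (t - 2)^2, so the eigenvalue 2
# has algebraic multiplicity 2.
A = np.array([[2.0, 1.0], [0.0, 2.0]])
lam = 2.0
n = A.shape[0]

# Geometric multiplicity = dimension of the eigenspace Ker(A - lam I)
geom_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
# Here geom_mult = 1 < 2 = algebraic multiplicity, so A is NOT diagonalizable.
```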

    Topics which may be covered on Exam 3

    Eigenspaces and diagonalization may be tested again.

    Section 6.1: The standard dot (inner) product in Rn, length ||v|| and distance ||u-v||, orthogonality: u is perpendicular to v iff u.v = 0, the orthogonal complements S⊥ and W⊥, and the theorem that S⊥ = (span(S))⊥. The formula cos(θ) = (u.v)/(||u|| ||v||) for the cosine of the angle between two vectors in Rn using dot products.
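The cosine formula from Section 6.1 can be checked numerically; the vectors here are my own illustration.

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# cos(theta) = (u . v) / (||u|| ||v||)
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.degrees(np.arccos(cos_theta))  # 45 degrees for these vectors
```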

    Section 6.2: Orthogonal and orthonormal subsets, projection of a vector onto another vector, Projv(u) = (u.v)/(v.v) v. The geometrical meaning of projection may be helpful but I would not test it on an exam. Skip decomposing forces.

    Section 6.3: The projection ṽ = ProjW(v) of a vector v into a subspace W of Rn is defined to be the vector in W such that v - ṽ is orthogonal to W. If T = {w1, ..., wm} is any basis of W, then ṽ = x1 w1 + ... + xm wm must satisfy (v - ṽ).wj = 0 for j = 1, ..., m, that is, the linear system x1 (w1.wj) + ... + xm (wm.wj) = v.wj for j = 1, ..., m. Define the mxm symmetric matrix A = [wi.wj] and the mx1 column matrix B = [v.wj]. The system is just AX = B, and it can be shown that A is invertible, so it always has the unique solution X = A^(-1) B, which gives the projection. When the basis T is orthogonal, the matrix A is diagonal and the solution is easy, giving the explicit formula ProjW(v) = (v.w1)/(w1.w1) w1 + ... + (v.wm)/(wm.wm) wm, which is even simpler when T is orthonormal. The Best Approximation Theorem says that ṽ = ProjW(v) is the unique vector in W such that ||v - ṽ|| < ||v - w|| for any w in W distinct from ṽ. Skip Theorem 10.
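The Gram-matrix setup for Section 6.3 can be sketched in code (the function name and the example vectors are mine):

```python
import numpy as np

def proj_onto_subspace(v, basis):
    """Project v onto W = span(basis) by solving A x = b, where
    A[i,j] = w_i . w_j (the Gram matrix) and b[j] = v . w_j."""
    W = np.array(basis, dtype=float)  # rows are the basis vectors w_i
    v = np.array(v, dtype=float)
    A = W @ W.T                       # Gram matrix of dot products
    b = W @ v                         # right-hand side [v . w_j]
    x = np.linalg.solve(A, b)         # A is invertible for a basis
    return x @ W                      # the projection: sum of x_i w_i

# Project (1, 2, 3) onto the xy-plane in R^3
v_hat = proj_onto_subspace([1.0, 2.0, 3.0], [[1, 0, 0], [0, 1, 0]])
```

For this example the projection is (1, 2, 0), and the residual v - ṽ = (0, 0, 3) is orthogonal to both basis vectors, as the definition requires.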

    Section 6.4: Gram-Schmidt process but skip QR factorization.
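A minimal sketch of the classical Gram-Schmidt process (my own illustration, assuming the input list is linearly independent; QR factorization is skipped, as above):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a linearly independent list into an orthonormal list by
    subtracting projections onto the earlier vectors, then normalizing."""
    ortho = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in ortho:
            w = w - (w @ u) * u          # remove the component along u
        ortho.append(w / np.linalg.norm(w))  # normalize (independence assumed)
    return ortho

Q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
```

The output vectors are orthonormal: each has length 1 and their dot product is 0.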

    Section 6.5: Skip Least Squares completely.

    Section 6.6: Skip Applications to Linear Models completely.

    Section 6.7: Inner Product Space (IPS) and the Cauchy-Schwarz inequality, including the examples of function spaces C[a,b].

    Section 6.8: Skip completely.

    Topics which may be covered on the comprehensive Final Exam

    Section 7.1: Diagonalization of symmetric matrices. Cover the main ideas and examples, but the full proof is beyond the scope of this course. The proof that eigenspaces of A = AT for distinct eigenvalues are orthogonal is quite simple and accessible. Gram-Schmidt gives a way to get an orthonormal basis for each eigenspace, so that diagonalization can be achieved by an orthogonal transition matrix P whose inverse is PT. The Spectral Theorem is really advanced linear algebra, and its proof is beyond this course; I might mention it as a very nice result which applies only to real symmetric matrices. I would skip the Spectral Decomposition Theorem.
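A numerical illustration (example matrix mine): for a real symmetric A, an orthonormal basis of eigenvectors gives an orthogonal matrix P with P inverse equal to PT, and PT A P diagonal. NumPy's eigh routine, designed for symmetric matrices, returns such a basis:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # symmetric: A = A^T

# eigh returns eigenvalues in ascending order and orthonormal
# eigenvectors as the columns of P.
eigenvalues, P = np.linalg.eigh(A)

D = P.T @ A @ P  # diagonal, because P is orthogonal (P^{-1} = P^T)
```

Here the eigenvalues are 1 and 3, and D is the diagonal matrix with those entries.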

    Skip Section 7.2, because some instructors did not have time to cover this material. For future reference, the topics from Section 7.2 are as follows: Quadratic forms are defined by QA(X) = XT A X for X in Rn, where A = AT. Under a change of basis X = PY for invertible P, QA(X) = (PY)T A (PY) = YT (PT A P) Y = QB(Y), where B = PT A P is another symmetric matrix representing the same quadratic form. This raises the question: when can we find a nicer B? Can we find a diagonal B? The methods just completed on orthogonal diagonalization of a real symmetric matrix tell us the answer is yes.

    Anything covered after the cutoff date for Exam 3 material.

    Anything from any part of the course.

    This page last modified on 12-17-2019.