My office is Room 115 in Whitney Hall. My office phone number is 777-2465 and my email address is alex@math.binghamton.edu. My office hours are MWF 12:00 - 1:00 and by appointment.

"Linear Algebra" by Jim Hefferon, available for free download from the following link: "Linear Algebra" by Jim Hefferon.

Instructions for students to register in Webwork will be posted on the main Math 304 webpage.

The entire book will be covered if time permits. A list of major topics which may be covered is given at the end of this page.

The eight sections of this course are run separately, each by its own instructor. Only the Final Exam will be common for all sections. This page contains details relevant only to my Section 6, but some general advice probably applies to everyone. There will be quizzes administered in class for 20% of your course grade. Homework done and evaluated through Webwork will count for 5% of your course grade. Three 90-minute exams will be administered on announced dates during a normal class time. There will be one common Final Exam (2 hours long) during the scheduled Finals period. Each 90-minute exam will be worth 15%, and the Final Exam will be worth 30%. The material being tested in each exam will be determined and announced approximately one week before the exam. The Final Exam will be comprehensive, covering the whole course. ANYONE UNABLE TO TAKE AN EXAM SHOULD CONTACT THE INSTRUCTOR AHEAD OF TIME TO EXPLAIN THE REASON. A MESSAGE CAN BE LEFT AT THE MATH DEPT OFFICE (777-2147) OR ON PROFESSOR FEINGOLD'S VOICEMAIL (777-2465). NO ONE SHOULD MISS THE FINAL!

Information about the scheduling of exams is posted on the main Math 304 page.

After each Exam is graded and returned, solutions will be posted here, along with a letter grade interpretation of the numerical score.

Any student with a special problem or a finals conflict must contact the instructor (me) as soon as possible to make arrangements.

Practice problems of various kinds will be posted on the main Math 304 webpage.

March 16 Lecture (last day in classroom): March 16 Lecture

March 20 Lecture: March 20 Lecture and 3-20 written lecture notes.

March 25 Lecture: March 25 Lecture and 3-25 written lecture notes.

March 27 Lecture: March 27 Lecture and 3-27 written lecture notes.

March 30 Lecture: March 30 Lecture and 3-30 written lecture notes.

April 1 Lecture: April 1 Lecture and 4-1 written lecture notes.

April 3 Lecture: April 3 Lecture and 4-3 written lecture notes.

April 13 Lecture: April 13 Lecture (Access Password: r3#&6mle) and 4-13 written lecture notes.

April 15 Lecture: April 15 Lecture (Access Password: K1+%j1.4) and 4-15 written lecture notes.

April 17 Lecture: April 17 Lecture and 4-17 written lecture notes.

April 18: Solution to Webwork problem requested at the end of class on April 17: Webwork Problem Solution 4-18.

April 20 Lecture: April 20 Lecture and 4-20 written lecture notes.

April 22 Lecture: April 22 Lecture and 4-22 written lecture notes.

April 27 Lecture: April 27 Lecture and 4-27 written lecture notes.

April 29 Lecture: April 29 Lecture and 4-29 written lecture notes.

May 1 Lecture (review for final exam): May 1 Lecture

May 4 Lecture (review for final exam): May 4 Final Exam Review Session and 5-4 written review notes.

A practice exam 1 and its solutions can be downloaded as a pdf file from this link: Practice Exam 1 and its solutions.

The following file contains a summary of results presented in class and needed for exams. Reading this file is not a replacement for attending class, but could be helpful if you miss some classes because of illness. Here is the link to it: Math304-6 Topics Summary.

A corrected (typographical error in 4(b)) practice exam 2 and its solutions can be downloaded as a pdf file from this link: Practice Exam 2 and its solutions.

The numerical score on each exam will be given a letter grade interpretation, giving each student a letter grade as well as a number grade, and the Total of all points earned will also be given a letter grade interpretation. The letter grades on the exams indicate how a student is doing, and will be taken into consideration in making the interpretation for the Totals. The course grade will be determined by the interpretation of Total points earned. Only borderline cases may be subject to further adjustment based on homework, classroom participation and attendance, as determined by the instructor. Any cases of cheating will be subject to investigation by the Academic Honesty Committee of Harpur College.

Class attendance is required at all scheduled meetings, and sleeping in class does not count as being there. Questions are welcomed at any time during a lecture. At the start of each class be ready to ask questions about homework problems or about the previous lecture. We want to create an atmosphere where you all feel very free to ask questions and make comments. If anyone feels that the instructor has not answered a question clearly, completely, and with respect and consideration for the student who asked it, please let your instructor know about it immediately so he/she can correct the problem. You can do this in class or in office hours, verbally or in writing, on paper or by email, or by whatever means makes you most comfortable, but with enough detail that your instructor can understand what you think was done wrong. It will be too late to help if you only complain at the end of the course. If you are not satisfied by the response of your instructor, please contact the course coordinator, Prof. Alexander Borisov.

The material is a combination of theory and calculation, and it is necessary to understand the theory in order to do sensible calculations and interpret them correctly. There is a significant difference between training and education, and we feel strongly that our goal at this university is to educate you, not just to train you to do computations. Theory is not presented to impress you with our knowledge of the subject, but to give you the depth of understanding expected of an adult with a university education in this subject. Some of your instructors have many years of experience teaching mathematics at the university level, but it will require your consistent concentrated study to master this material. While much learning can take place in the classroom, a significant part of it must be done by you outside of class. Using the book, class notes, and homework exercises, only you can achieve success in this course. Students who do not take this course seriously, or who do not take this advice, are not likely to be rewarded at the end. We are here to help and guide you, and we also grade the exams to judge how much you have learned, but grades are earned by you, not given by us. Exams will be a combination of theory questions and calculations appropriate for a course of this level.

For each section of material covered there will be an assignment of problems from the textbook. Homework will be handled through the online system Webwork, and information about how to access it has been provided on the main Math 304 webpage. The homework counts as 5% of your course grade.

Extra topics if time allows (usually not enough time for these in the elementary linear course):

These paragraphs will be updated before each exam is given.

Here is a list of topics covered in lectures which may be covered on Exam 1, Feb. 19, 2020.

Linear Systems, solving by row reduction of the augmented matrix [A|B] and interpretation in terms of free and dependent variables.

Consistent vs. inconsistent systems. Homogeneous systems AX=O.

Elementary row operations and reduction to Reduced Row Echelon Form (RREF).
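The row reduction process above can be sketched in a few lines of Python. This is only a teaching illustration, not part of the course materials; the function name rref and the sample system are my own, and exact Fraction arithmetic is used to avoid round-off.

```python
from fractions import Fraction

def rref(M):
    """Return the Reduced Row Echelon Form of M (a list of rows).

    A teaching sketch using exact Fraction arithmetic.
    """
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    pivot_row = 0
    for col in range(cols):
        # Find a row at or below pivot_row with a nonzero entry in this column.
        pr = next((r for r in range(pivot_row, rows) if A[r][col] != 0), None)
        if pr is None:
            continue  # no pivot here: this column gives a free variable
        A[pivot_row], A[pr] = A[pr], A[pivot_row]       # swap rows
        piv = A[pivot_row][col]
        A[pivot_row] = [x / piv for x in A[pivot_row]]  # scale pivot to 1
        for r in range(rows):
            if r != pivot_row and A[r][col] != 0:
                f = A[r][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[pivot_row])]
        pivot_row += 1
        if pivot_row == rows:
            break
    return A

# Augmented matrix [A|B] for the system x + 2y = 5, 3x + 4y = 6.
aug = [[1, 2, 5], [3, 4, 6]]
print(rref(aug))  # solution read off the RREF: x = -4, y = 9/2
```

Reading the RREF of [A|B] directly exhibits the dependent variables (pivot columns) and free variables (non-pivot columns).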

Matrices, the set of all mxn real matrices, R^{m}_{n}, addition of
matrices, multiplication of a matrix by a real number (scalar).

The span of a set of vectors in R^{m}_{n} as the set of all linear combinations
from that set.

Matrix shapes, names of special patterns.

Rank of a matrix.

How an mxn matrix A determines a function L_{A}: R^{n}
--> R^{m} by L_{A}(X) = AX.

Linearity properties of the function L_{A}, that is,
L_{A}(X+Y) = L_{A}(X) + L_{A}(Y) for any X, Y in R^{n},
and L_{A}(rX) = r L_{A}(X) for any X in R^{n} and any r in R.
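The two linearity properties can be checked numerically. This is a sketch with an illustrative 3x2 matrix of my own choosing; mat_vec simply computes L_{A}(X) = AX.

```python
def mat_vec(A, X):
    """Compute L_A(X) = AX for a matrix A (list of rows) and vector X."""
    return [sum(a * x for a, x in zip(row, X)) for row in A]

A = [[1, 2], [3, 4], [5, 6]]   # a 3x2 matrix, so L_A : R^2 -> R^3
X, Y, r = [1, 0], [2, -1], 3

# L_A(X + Y) = L_A(X) + L_A(Y)
lhs_add = mat_vec(A, [x + y for x, y in zip(X, Y)])
rhs_add = [u + v for u, v in zip(mat_vec(A, X), mat_vec(A, Y))]
assert lhs_add == rhs_add

# L_A(rX) = r L_A(X)
lhs_scale = mat_vec(A, [r * x for x in X])
rhs_scale = [r * u for u in mat_vec(A, X)]
assert lhs_scale == rhs_scale
```

Both assertions pass because each entry of AX is a linear expression in the entries of X.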

Definition of Ker(L_{A}) and of Range(L_{A}) = Im(L_{A}) and
how to find them by row reduction methods.

Properties of general functions: one-to-one (injective), onto (surjective), both (bijective), invertible.

Connection between properties of matrix A and function L_{A}.

Definition of matrix multiplication AB through the requirement that L_{A} composed with L_{B} equals L_{AB}. Lemma that L_{A} = L_{B} iff A=B.

Formula for the matrix product of an mxn matrix A with an nxp matrix B giving an mxp matrix
C = AB whose columns are A(Col_{k}(B)) for k = 1, ..., p.
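The column description of the product can be turned into code directly. A sketch only; the helper names mat_vec, col, and mat_mul and the sample matrices are mine.

```python
def mat_vec(A, X):
    return [sum(a * x for a, x in zip(row, X)) for row in A]

def col(B, k):
    """Column k of B as a vector."""
    return [row[k] for row in B]

def mat_mul(A, B):
    """Product AB built column by column: Col_k(AB) = A(Col_k(B))."""
    p = len(B[0])
    cols = [mat_vec(A, col(B, k)) for k in range(p)]
    # Re-assemble the columns into the rows of the product.
    return [[c[i] for c in cols] for i in range(len(A))]

A = [[1, 2], [3, 4]]          # 2x2
B = [[5, 6, 7], [8, 9, 10]]   # 2x3
print(mat_mul(A, B))          # → [[21, 24, 27], [47, 54, 61]]
```

Note the shape bookkeeping: a 2x2 times a 2x3 gives a 2x3, matching the mxn times nxp gives mxp rule.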

Definition of standard basis vectors e_{1}, ..., e_{n} in R^{n} and lemma that Ae_{j} = Col_{j}(A), so AX is the sum of x_{j} Col_{j}(A).

Definition and properties of the standard dot product in R^{n}: bilinear, symmetric, positive definite.

Definition of length of a vector, ||v||, for v in R^{n}.

Definition of distance between two vectors, ||u-v||.

Definition of the angle a between two vectors u and v in R^{n} given by the formula cos(a) = (u.v)/(||u|| ||v||).

Definition of two vectors in R^{n} being orthogonal (perpendicular) when the angle
between them is a right angle (90 degrees = pi/2 radians), so u.v = 0.

The Cauchy-Schwarz inequality |u.v| <= ||u|| ||v|| and the triangle inequality ||u+v|| <= ||u|| + ||v||.
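The dot product, lengths, angle formula, and both inequalities can all be verified on a small example. A sketch with vectors of my own choosing; tolerances guard against floating-point round-off.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    return math.sqrt(dot(v, v))

u, v = [1.0, 2.0, 2.0], [3.0, 0.0, 4.0]   # ||u|| = 3, ||v|| = 5

# Cauchy-Schwarz: |u.v| <= ||u|| ||v||
assert abs(dot(u, v)) <= norm(u) * norm(v) + 1e-12

# Triangle inequality: ||u+v|| <= ||u|| + ||v||
w = [a + b for a, b in zip(u, v)]
assert norm(w) <= norm(u) + norm(v) + 1e-12

# Angle between u and v from cos(a) = (u.v) / (||u|| ||v||).
cos_a = dot(u, v) / (norm(u) * norm(v))   # = 11/15
angle = math.acos(cos_a)
print(math.degrees(angle))
```

Cauchy-Schwarz is exactly what guarantees that cos_a lies in [-1, 1], so math.acos is always defined.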

Topics that have been covered since Exam 1 and which may appear on Exam 2, March 18, 2020, are listed below.

Abstract definition of a real vector space, V. Examples, R^{m}_{n}
is a vector space. For any set S, the set F = {f : S ---> R} of
all functions from S to the reals R, is a vector space.

Definition of a linear transformation L : V ---> W from a vector space to a vector space. Ker(L), Range(L) = Im(L).

Basic facts about vector spaces and about linear transformations (maps), and examples.

Definition and some examples of subspaces.

Definition of when a square matrix is
invertible, uniqueness of the inverse when it exists, and an
algorithm to decide and find it by row reduction of [A | I_{n}].
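The [A | I_{n}] algorithm can be sketched as follows. This is an illustration, not the course's official code: exact Fraction arithmetic, and the function returns None when the left block cannot be reduced to I_{n} (i.e. A is singular).

```python
from fractions import Fraction

def inverse(A):
    """Invert A by row-reducing the augmented matrix [A | I_n].

    Returns None when A is not invertible.
    """
    n = len(A)
    # Build [A | I].
    M = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        pr = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pr is None:
            return None  # no pivot in this column: A is singular
        M[col], M[pr] = M[pr], M[col]
        piv = M[col][col]
        M[col] = [x / piv for x in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [row[n:] for row in M]  # the right block is now A^{-1}

A = [[2, 1], [5, 3]]
print(inverse(A))  # det A = 1, so the inverse is [[3, -1], [-5, 2]]
```

When A is invertible, the same row operations that turn A into I_{n} turn I_{n} into A^{-1}, which is exactly why the algorithm works.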

Definition of transpose of a matrix, of symmetric and anti-symmetric matrices.

Elementary matrices and how they can be used to achieve elementary row or column operations.

The rules of matrix algebra.

The span of a set of vectors S in a vector space V, and why it forms a subspace of V.

How to check that a subset W in V is a subspace of V.

Linear independence or dependence of a subset of V, definition and a method for determining it.

Theorems and examples about spanning and independence, connection with rank of a matrix.

Definition of a basis for a vector space, and how to decide if a subset is a basis of V.

Finding a basis for important examples of subspaces, Ker(L), Range(L), where L:V---> W is a linear map.

Dimension of V as the number of vectors in any basis for V.

The standard basis for several examples of vector spaces, including
all the R^{m}_{n} examples and the vector space of polynomials with
degree at most k.

Row-space and Column-space of a matrix, and their dimension related to the rank of the matrix.

Information about the linear transformation
L_{A}: R^{n}--> R^{m} associated with rank(A).

The relationship between the dimensions of Ker(L), Range(L) and V for L:V---> W.

Extending an independent set to a basis, cutting down a spanning set to a basis.

How to represent a general linear map L:V---> W with respect to a choice of basis S in V and basis T in W by a matrix, that is, using coordinates with respect to S, [ . ]_{S}, and coordinates with respect to T, [ . ]_{T}, to find a matrix _{T}[L]_{S} such that _{T}[L]_{S} [v]_{S} = [L(v)]_{T}.

The algorithm for finding that matrix by a row reduction of [T | L(S)].

If S and S' are two bases of V, and T and T' are two bases of W, and L:V---> W then there is a
relationship between _{T}[L]_{S}, the matrix representing L from S to T, and
_{T'}[L]_{S'}, the matrix representing L from S' to T'.

That relationship is _{T'}[L]_{S'} =
_{T'}Q_{T} _{T}[L]_{S} _{S}P_{S'}
where _{S}P_{S'} is the transition matrix from S' to S, and
_{T'}Q_{T} is the transition matrix from T to T'.

The concept of isomorphism (bijective linear map) and its properties.

Material on determinants, their definition using permutations or by cofactor expansion, their properties, and methods of calculating them (definition by permutations or by cofactor expansions, crosshatching method for matrices of size n = 2 or n = 3 ONLY, using row operations).
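Cofactor expansion along the first row translates directly into a recursive function. A sketch for hand-sized matrices only; for large n, row reduction is far faster than this O(n!) recursion.

```python
def det(A):
    """Determinant by cofactor expansion along the first row (recursive)."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j, with alternating sign (-1)^j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                    # → -2
print(det([[2, 0, 1], [1, 3, -1], [0, 5, 2]]))  # 3x3 example
```

The 2x2 case reproduces the familiar ad - bc, and the zero in the 3x3 example shows why expanding along a row with zeros saves work.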

The use of determinant to get the characteristic polynomial, det(tI_{n} - A), whose roots
give the eigenvalues of A, and whose expression as a product of powers of distinct linear factors
gives the algebraic multiplicities.

Eigenspaces, their properties, and how to decide if a matrix can be diagonalized or not. Theorems about eigenspaces and diagonalizability.

Independence of the union of bases for distinct eigenspaces.

Geometric multiplicity and its relationship to algebraic multiplicity for each eigenvalue.
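For a 2x2 matrix the characteristic polynomial det(tI - A) = t^2 - tr(A) t + det(A) can be solved by hand with the quadratic formula. The sketch below does exactly that (my own simplification for the 2x2 real-root case; it is not a general eigenvalue algorithm).

```python
import math

def eigen_2x2(A):
    """Real eigenvalues of a 2x2 matrix from t^2 - tr(A) t + det(A) = 0.

    Returns the distinct real roots in increasing order ([] if complex).
    """
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc < 0:
        return []  # complex eigenvalues: no real roots
    s = math.sqrt(disc)
    return sorted({(tr - s) / 2, (tr + s) / 2})

A = [[4, 1], [2, 3]]
print(eigen_2x2(A))  # roots of t^2 - 7t + 10, i.e. [2.0, 5.0]
```

For each eigenvalue t, the eigenspace is Ker(tI - A), found by the row reduction methods above; the matrix is diagonalizable exactly when the eigenspace dimensions (geometric multiplicities) add up to n.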

Use of a basis S of V to give
coordinates with respect to S for each vector v in V. How that coordinate function, [v]_{S},
is a linear map from V to
R^{n} when a basis S for V consists of n vectors.

Transition matrices which give the relationship between the coordinates of a vector
v with respect to different bases. If S and T are two bases of the same vector space, V,
then the transition matrix from S to T is the square invertible matrix _{T}P_{S} such that
[v]_{T} = _{T}P_{S} [v]_{S}.
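A small worked example of a transition matrix in R^2, as a sketch (the bases and the Cramer's-rule solver are my own illustration): column j of _{T}P_{S} is [s_{j}]_{T}, obtained by solving for the T-coordinates of each S basis vector.

```python
from fractions import Fraction

def solve2(M, b):
    """Solve the 2x2 system M x = b exactly by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    x = Fraction(b[0] * M[1][1] - M[0][1] * b[1], det)
    y = Fraction(M[0][0] * b[1] - b[0] * M[1][0], det)
    return [x, y]

# Bases of R^2: S is the standard basis, T = {(1,1), (1,-1)}.
S = [[1, 0], [0, 1]]
T_mat = [[1, 1], [1, -1]]  # T's vectors as the columns of a matrix

# Column j of the transition matrix _{T}P_{S} is [s_j]_T.
cols = [solve2(T_mat, s) for s in S]
P = [[cols[j][i] for j in range(2)] for i in range(2)]

# Check [v]_T = _{T}P_{S} [v]_S on v = (3,1): indeed 2(1,1) + 1(1,-1) = (3,1).
v_S = [3, 1]
v_T = [sum(P[i][j] * v_S[j] for j in range(2)) for i in range(2)]
print(v_T)  # the T-coordinates (2, 1)
```

The matrix _{T}P_{S} is invertible, and its inverse is the transition matrix _{S}P_{T} in the opposite direction.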

Definition of positive powers of a square matrix A, and positive powers of L: V ---> V.

Orthogonal and orthonormal subsets, projection of a vector onto another vector,
Proj_{v}(u) = (u.v)/(v.v) v.
The geometrical meaning of projection may be helpful but I would not test it on an exam.
Skip decomposing forces.
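The projection formula above is one line of code. A sketch with vectors of my own choosing; the final check confirms the defining property, that u - Proj_{v}(u) is orthogonal to v.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj(u, v):
    """Projection of u onto v: Proj_v(u) = ((u.v)/(v.v)) v."""
    c = dot(u, v) / dot(v, v)
    return [c * x for x in v]

u, v = [3.0, 4.0], [1.0, 0.0]
p = proj(u, v)
print(p)  # → [3.0, 0.0]

# u - Proj_v(u) is orthogonal to v.
assert abs(dot([a - b for a, b in zip(u, p)], v)) < 1e-12
```

With v along the x-axis, the projection simply picks off the first coordinate of u, which matches the geometric picture.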

The projection ṽ = Proj_{W}(v) of a vector v into a subspace W of R^{n} is defined to be the vector in W such that v - ṽ is orthogonal to W. If T = {w_{1}, ..., w_{m}} is any basis of W, then ṽ = ∑_{i=1}^{m} x_{i} w_{i} must satisfy (v - ṽ).w_{j} = 0 for j = 1, ..., m, that is, the linear system
∑_{i=1}^{m} x_{i} (w_{i}.w_{j}) = v.w_{j} for j = 1, ..., m.
Define the mxm symmetric matrix A = [w_{i}.w_{j}] and the mx1 column matrix B = [v.w_{j}]. This system is just AX = B, and it can be shown that A is invertible, so there is always a unique solution X = A^{-1} B, which gives the projection.
When the basis T is orthogonal, the matrix A is diagonal and the solution is immediate, giving the explicit formula
Proj_{W}(v) = ∑_{i=1}^{m} (v.w_{i})/(w_{i}.w_{i}) w_{i},
which is even simpler when T is orthonormal.
The Best Approximation Theorem says that ṽ = Proj_{W}(v) is the unique vector in W such that ||v - ṽ|| < ||v - w|| for any w in W distinct from ṽ.
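The Gram-system recipe above can be carried out concretely. A sketch restricted to a 2-dimensional subspace W of R^3 (so AX = B is 2x2 and Cramer's rule suffices); the basis and vector are my own example, and exact Fractions keep the arithmetic clean.

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj_subspace(v, basis):
    """Project v onto W = span(basis) by solving the Gram system AX = B.

    A = [w_i . w_j], B = [v . w_j]; only the 2x2 case is handled here.
    """
    w1, w2 = basis
    A = [[dot(w1, w1), dot(w1, w2)], [dot(w2, w1), dot(w2, w2)]]
    B = [dot(v, w1), dot(v, w2)]
    d = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    x1 = Fraction(B[0] * A[1][1] - A[0][1] * B[1], d)
    x2 = Fraction(A[0][0] * B[1] - B[0] * A[1][0], d)
    return [x1 * a + x2 * b for a, b in zip(w1, w2)]

v = [1, 2, 3]
W = [[1, 1, 0], [0, 1, 1]]        # a (non-orthogonal) basis of W
vt = proj_subspace(v, W)
print(vt)                          # ṽ = (1/3, 8/3, 7/3)

# Defining property: v - ṽ is orthogonal to every basis vector of W.
diff = [a - b for a, b in zip(v, vt)]
assert dot(diff, W[0]) == 0 and dot(diff, W[1]) == 0
```

By the Best Approximation Theorem, this ṽ is the point of W closest to v.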

Anything covered after the cutoff date for Exam 3 material.

Anything from any part of the course.

This page last modified on 8-23-2021.