Meeting times: MWF 10:50 - 11:50 AM online using Zoom. The Zoom link is: Math 507 Fall 2020 Zoom Meeting Link

Prof. Feingold's Office: WH-115. Send me an email to get my cell phone number. Office Hours: Online by appointment using Zoom. To schedule an appointment, please send me an email at least 2 hours ahead of time.

Schaum's Outline of Linear Algebra, 6th Edition, by Seymour Lipschutz and Marc Lipson, ISBN-13: 978-1260011449, Publisher: McGraw-Hill Education; 6th edition (October 25, 2017)

We will cover as much of the textbook as time allows.

There will be 10 quizzes, 2 hourly exams, and 1 Final Exam, all administered online. The quizzes will be worth 10 points each, the hourly exams will be worth 100 points each, and the Final Exam will be worth 200 points. The contents of each exam will be determined one week before the exam. The Final Exam will be comprehensive, covering the whole course. Since this class is taught online, exams will be given online and may be divided into smaller parts with shorter times for completion. Online exams will be submitted, graded, and returned as electronic files. Notes and the textbook will be available to all students during exams, but no outside help will be allowed. Any student found to have used outside sources of help on a graded exam will be subject to the strictest rules of the honesty policy of Binghamton University. ANYONE UNABLE TO TAKE AN EXAM SHOULD CONTACT THE PROFESSOR AHEAD OF TIME TO EXPLAIN THE REASON. PLEASE DON'T MISS THE FINAL!

A schedule of the hourly exams is posted below. The date of the Final Exam is determined by the registrar.

Exam 1: October 9, 2020 (due Oct. 12 by 5:00 PM) Covers all topics taught up until Oct. 5. See list of topics below.

Exam 2: November 30, 2020 (due Dec. 2 by 5:00 PM) Covers all topics taught since Exam 1.

Final Exam: December 7, 2020 (due Dec. 11 by 4:00 PM) Covers the entire course.

Since classes, exams and quizzes are being given online, all work should be submitted as an electronic file (pdf preferred). I highly recommend using CamScanner for this purpose. It is available from this website: CamScanner Website. You can submit documents either as attachments emailed to me or as uploads to MyCourses, where the assignments will be posted.

I will post links here to pdf files of my lecture notes and to Panopto recordings of my lectures.

Lecture recording and notes from August 26: Lecture Notes pages 1-13 and Panopto recording of Lecture 1 on Aug. 26, 2020.

Lecture recording and notes from August 28: Lecture Notes pages 14-30 and Panopto recording of Lecture 2 on Aug. 28, 2020.

Lecture recording from August 31 (the notes for this lecture, pages 15-30, are in the files already posted above): Panopto recording of Lecture 3 on Aug. 31, 2020.

Lecture recording and notes from September 2: Lecture Notes pages 31-49 and Panopto recording of Lecture 4 on Sept. 2, 2020.

Lecture recording and notes from September 4: Lecture Notes pages 50-57 and Panopto recording of Lecture 5 on Sept. 4, 2020.

Lecture recording and notes from September 7: Lecture Notes pages 58-70 and Panopto recording of Lecture 6 on Sept. 7, 2020.

Lecture recording and notes from September 9: Lecture Notes pages 71-76 and Panopto recording of Lecture 7 on Sept. 9, 2020.

Lecture recording and notes from September 11: Lecture Notes pages 77-86 and Panopto recording of Lecture 8 on Sept. 11, 2020.

Lecture recording and notes from September 14: Lecture Notes pages 87-97 and Panopto recording of Lecture 9 on Sept. 14, 2020.

Lecture recording and notes from September 16: Lecture Notes pages 98-109 and Panopto recording of Lecture 10 on Sept. 16, 2020.

Lecture recording and notes from September 18: Lecture Notes pages 110-122 and Panopto recording of Lecture 11 on Sept. 18, 2020.

Lecture recording and notes from September 21: Lecture Notes pages 123-137 and Panopto recording of Lecture 12 on Sept. 21, 2020.

Lecture recording and notes from September 23: Lecture Notes pages 138-150 and Panopto recording of Lecture 13 on Sept. 23, 2020.

Lecture recording and notes from September 25: Lecture Notes pages 151-165 and Panopto recording of Lecture 14 on Sept. 25, 2020.

Lecture recording and notes from September 28: Lecture Notes pages 166-184 and Panopto recording of Lecture 15 on Sept. 28, 2020.

Lecture recording and notes from September 30: Lecture Notes pages 185-199 and Panopto recording of Lecture 16 on Sept. 30, 2020.

Lecture recording and notes from October 2: Lecture Notes pages 200-214 and Panopto recording of Lecture 17 on October 2, 2020.

Lecture recording and notes from October 5: Lecture Notes pages 215-225 and Panopto recording of Lecture 18 on October 5, 2020.

Lecture recording and notes from October 7: Lecture Notes pages 226-232 and Panopto recording of Lecture 19 on October 7, 2020.

Lecture recording and notes from October 9: Lecture Notes pages 233-236 and Panopto recording of Lecture 20 on October 9, 2020.

Lecture recording and notes from October 12: Lecture Notes pages 237-248 and Panopto recording of Lecture 21 on October 12, 2020.

Lecture recording and notes from October 14: Lecture Notes pages 249-255 and Panopto recording of Lecture 22 on October 14, 2020.

Lecture recording and notes from October 16: Lecture Notes pages 256-262 and Panopto recording of Lecture 23 on October 16, 2020.

Lecture recording and notes from October 19: Lecture Notes pages 263-274 and Panopto recording of Lecture 24 on October 19, 2020.

Lecture recording and notes from October 21: Lecture Notes pages 275-283 and Panopto recording of Lecture 25 on October 21, 2020.

Lecture recording and notes from October 23: Lecture Notes pages 284-296 and Panopto recording of Lecture 26 on October 23, 2020.

Lecture recording and notes from October 26: Lecture Notes pages 297-317 and Panopto recording of Lecture 27 on October 26, 2020.

Lecture recording and notes from October 28: Lecture Notes pages 318-327 and Panopto recording of Lecture 28 on October 28, 2020.

Lecture recording and notes from October 30: Lecture Notes pages 328-340 and Panopto recording of Lecture 29 on October 30, 2020.

Lecture notes from November 2: Lecture Notes pages 341-349. Sorry, there is no Panopto recording of Lecture 30 on November 2, 2020.

Lecture recording and notes from November 4: Lecture Notes pages 350-358 and Panopto recording of Lecture 31 on November 4, 2020.

Lecture recording and notes from November 6: Lecture Notes pages 359-368 and Panopto recording of Lecture 32 on November 6, 2020.

Lecture recording and notes from November 9: Lecture Notes pages 369-378 and Panopto recording of Lecture 33 on November 9, 2020.

Lecture recording and notes from November 11: Lie algebras Lecture Notes pages 1-10 and Panopto recording of Lecture 34 on November 11, 2020.

Lecture recording and notes from November 13: Lecture Notes Lemann_Tensor_Product and Panopto recording of Lecture 35 on November 13, 2020.

Lecture recording and notes from November 16: Lecture Notes Williams Tensors and Trace and Panopto recording of Lecture 36 on November 16, 2020.

Lecture recording and notes from November 18: Lecture Notes Mosbo Linear Regression, Lecture Notes pages 379-381, and Panopto recording of Lecture 37 on November 18, 2020.

Lecture recording and notes from November 20: Lecture Notes pages 382-390 and Panopto recording of Lecture 38 on November 20, 2020.

Lecture recording and notes from November 30: Lecture Notes pages 391-393, Lie algebras Lecture Notes pages 11-26 and Panopto recording of Lecture 39 on November 30, 2020.

Lecture recording and notes from December 2: Lie algebras Lecture Notes pages 27-36 and Panopto recording of Lecture 40 on December 2, 2020.

Lecture recording and notes from December 4: Lie algebras Lecture Notes pages 37-50 and Panopto recording of Lecture 41 on December 4, 2020.

Lecture recording and notes from December 7: Lie algebras Lecture Notes pages 51-59 and Panopto recording of Lecture 42 on December 7, 2020.

To help you prepare for Exams 1 and 2, I may show you my old linear algebra exams, which cover material very close to what we have done. To use these as practice exams, look only at the questions first. After you have tried a test, you can look at the solutions presented at the end.

Here is a list of topics covered in lectures which may be covered on Exam 1, Oct. 9, 2020.

Linear Systems, solving by row reduction of the augmented matrix [A|B] and interpretation in terms of free and dependent variables.

Consistent vs. inconsistent systems. Homogeneous systems AX=O.

Elementary row operations and reduction to Reduced Row Echelon Form (RREF).
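To make the row-reduction procedure concrete, here is a minimal Python sketch (my own illustration, not part of the course materials): it applies the three elementary row operations with exact Fraction arithmetic to bring a matrix to RREF, then row-reduces the augmented matrix of a small hypothetical system.

```python
from fractions import Fraction

def rref(M):
    """Bring a matrix (a list of rows) to Reduced Row Echelon Form
    using the three elementary row operations, in exact arithmetic."""
    A = [[Fraction(x) for x in row] for row in M]
    m, n = len(A), len(A[0])
    pivot_row = 0
    for col in range(n):
        # find a row at or below pivot_row with a nonzero entry in this column
        pr = next((r for r in range(pivot_row, m) if A[r][col] != 0), None)
        if pr is None:
            continue                                   # free column: no pivot
        A[pivot_row], A[pr] = A[pr], A[pivot_row]      # swap rows
        p = A[pivot_row][col]
        A[pivot_row] = [x / p for x in A[pivot_row]]   # scale pivot to 1
        for r in range(m):
            if r != pivot_row and A[r][col] != 0:      # clear rest of column
                f = A[r][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[pivot_row])]
        pivot_row += 1
        if pivot_row == m:
            break
    return A

# Row reduce the augmented matrix [A|B] of the system x + 2y = 5, 3x + 4y = 6;
# both variables are dependent, giving the unique solution x = -4, y = 9/2.
aug = rref([[1, 2, 5], [3, 4, 6]])
```

Each pivot column corresponds to a dependent variable; columns without pivots correspond to free variables.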

Matrices, the set of all mxn matrices with entries in field F, F^{m}_{n}, addition of
matrices, multiplication of a matrix by a scalar in F.

The span of a set of vectors in F^{m}_{n} as the set of all linear combinations
from that set.

Matrix shapes, names of special patterns.

Rank of a matrix.

How an mxn matrix A determines a function L_{A}: F^{n}
--> F^{m} by L_{A}(X) = AX.

Linearity properties of the function L_{A}, that is,
L_{A}(X+Y) = L_{A}(X) + L_{A}(Y) for any X, Y in F^{n},
and L_{A}(rX) = r L_{A}(X) for any X in F^{n} and any r in F.

Definition of Ker(L_{A}) and of Range(L_{A}) = Im(L_{A}) and
how to find them by row reduction methods.

Properties of general functions: one-to-one (injective), onto (surjective), both (bijective), invertible. Composition of functions, associativity of composition.

Connection between properties of matrix A and function L_{A}.

Definition of matrix multiplication AB through the definition: L_{A} composed with L_{B} equals L_{AB}. Lemma that L_{A} = L_{B} iff A = B.

Formula for the matrix product of an mxn matrix A with an nxp matrix B giving an mxp matrix
C = AB whose columns are A(Col_{k}(B)) for k = 1, ..., p.

Definition of standard basis vectors e_{1}, ..., e_{n} in F^{n} and lemma that Ae_{j} = Col_{j}(A), so AX is the sum of x_{j} Col_{j}(A).
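The lemma Ae_{j} = Col_{j}(A) is easy to check numerically. The sketch below is an illustration of mine (the function name mat_vec and the example matrix are arbitrary choices); it computes AX directly as the linear combination x_{1} Col_{1}(A) + ... + x_{n} Col_{n}(A).

```python
def mat_vec(A, X):
    """Compute AX as the linear combination x_1 Col_1(A) + ... + x_n Col_n(A)."""
    m, n = len(A), len(A[0])
    result = [0] * m
    for j in range(n):              # add x_j times column j of A
        for i in range(m):
            result[i] += X[j] * A[i][j]
    return result

A = [[1, 2], [3, 4], [5, 6]]        # a 3x2 matrix, so L_A : F^2 --> F^3
e1, e2 = [1, 0], [0, 1]             # standard basis vectors of F^2
col1 = mat_vec(A, e1)               # Ae_1 picks out Col_1(A)
col2 = mat_vec(A, e2)               # Ae_2 picks out Col_2(A)
```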

Abstract definition of a vector space, V, over a field F. Examples: F^{m}_{n} is a vector space. For any set S, the set Fun = {f : S ---> F} of all functions from S to the field F is a vector space.

Definition of a linear transformation L : V ---> W from a vector space to a vector space. Ker(L), Range(L) = Im(L).

Basic facts about vector spaces and about linear transformations (maps), and examples.

Definition and some examples of subspaces.

Definition of when a square matrix is
invertible, uniqueness of the inverse when it exists, and an
algorithm to decide and find it by row reduction of [A | I_{n}].
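That algorithm can be sketched in code. The following is an illustrative Python sketch of mine (the helper name inverse and the 2x2 example are arbitrary): row-reduce [A | I_{n}]; if the left block reaches I_{n}, the right block is A^{-1}; otherwise A is not invertible.

```python
from fractions import Fraction

def inverse(A):
    """Invert a square matrix, if possible, by row-reducing [A | I_n]:
    when the left block reaches I_n, the right block is A^{-1}."""
    n = len(A)
    # build the augmented matrix [A | I_n] with exact arithmetic
    M = [[Fraction(A[i][j]) for j in range(n)] +
         [Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for col in range(n):
        pr = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pr is None:
            return None                      # rank < n, so A is not invertible
        M[col], M[pr] = M[pr], M[col]        # swap a nonzero pivot into place
        p = M[col][col]
        M[col] = [x / p for x in M[col]]     # scale pivot to 1
        for r in range(n):
            if r != col and M[r][col] != 0:  # clear the rest of the column
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [row[n:] for row in M]            # right block is A^{-1}

Ainv = inverse([[2, 1], [5, 3]])             # det = 1, so the inverse is integral
```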

Definition of transpose of a matrix, of symmetric and anti-symmetric matrices.

Elementary matrices and how they can be used to achieve elementary row or column operations.

The rules of matrix algebra.

The span of a set of vectors S in a vector space V, and why it forms a subspace of V.

How to check that a subset W in V is a subspace of V.

Linear independence or dependence of a subset of V: definition and methods of determining it.

Theorems and examples about spanning and independence, connection with rank of a matrix.

Definition of a basis for a vector space, and how to decide if a subset is a basis of V.

Finding a basis for important examples of subspaces, Ker(L), Range(L), where L:V---> W is a linear map.

Dimension of V as the number of vectors in any basis for V.

The standard basis for several examples of vector spaces, including
all the F^{m}_{n} examples and the vector space of polynomials with
degree at most k.

Row-space and Column-space of a matrix, and their dimension related to the rank of the matrix.

Information about the linear transformation
L_{A}: F^{n}--> F^{m} associated with rank(A).

The relationship between the dimensions of Ker(L), Range(L) and V for L:V---> W.

Extending an independent set to a basis, cutting down a spanning set to a basis.

Use of a basis S of V to give
coordinates with respect to S for each vector v in V. How that coordinate function, [v]_{S},
is a linear map from V to
F^{n} when a basis S for V consists of n vectors.

Transition matrices which give the relationship between the coordinates of a vector
v with respect to different bases. If S and T are two bases of the same vector space, V,
then the transition matrix from S to T is the square invertible matrix _{T}P_{S} such that
[v]_{T} = _{T}P_{S} [v]_{S}.
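A small worked example of a transition matrix (my own, with a hypothetical second basis of R^2):

```python
from fractions import Fraction

# Hypothetical example: V = R^2, S the standard basis, T = {(1,1), (1,-1)}.
# The columns of SPT are the T-basis vectors written in S-coordinates,
# so SPT plays the role of _{S}P_{T}, the transition matrix from T to S.
SPT = [[1, 1], [1, -1]]
# Its inverse plays the role of _{T}P_{S}; for a 2x2 matrix [[a,b],[c,d]]
# the inverse is [[d,-b],[-c,a]] divided by the determinant ad - bc.
d = Fraction(SPT[0][0] * SPT[1][1] - SPT[0][1] * SPT[1][0])   # det = -2
TPS = [[SPT[1][1] / d, -SPT[0][1] / d],
       [-SPT[1][0] / d, SPT[0][0] / d]]
# Check on v = (3,1): [v]_T = _{T}P_{S} [v]_S should give (2,1),
# since (3,1) = 2*(1,1) + 1*(1,-1).
v = [3, 1]
vT = [TPS[0][0] * v[0] + TPS[0][1] * v[1],
      TPS[1][0] * v[0] + TPS[1][1] * v[1]]
```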

How to represent a general linear map L:V---> W with respect to a choice of basis S in V and basis T in W by a matrix, that is, using coordinates with respect to S, [ . ]_{S}, and coordinates with respect to T, [ . ]_{T}, to find a matrix _{T}[L]_{S} such that _{T}[L]_{S} [v]_{S} = [L(v)]_{T}.

The algorithm for finding that matrix by a row reduction of [T | L(S)].

If S and S' are two bases of V, T and T' are two bases of W, and L:V---> W is linear, then there is a relationship between _{T}[L]_{S}, the matrix representing L from S to T, and _{T'}[L]_{S'}, the matrix representing L from S' to T'.

That relationship is _{T'}[L]_{S'} = _{T'}Q_{T} _{T}[L]_{S} _{S}P_{S'}, where _{S}P_{S'} is the transition matrix from S' to S, and _{T'}Q_{T} is the transition matrix from T to T'.

Row/Column equivalence of two mxn matrices, B = QAP, for appropriate size invertible matrices Q and P.

Block Identity Form (BIF) as the best matrix Row/Column equivalent to a given matrix A, and the best matrix representing L:V-->W given a choice of bases S' and T'.

The concept of isomorphism (bijective linear map) and its properties.

Lin(V,W) = {L:V-->W | L is linear} is a vector space under addition of functions and scalar multiplication of a function.

Isomorphism between F^{m}_{n} and Lin(F^{n},F^{m}) by taking matrix A
to linear map L_{A}.

End(V) = Lin(V,V) as a ring under + and composition, as well as a vector space, making it an algebra.

Polynomial ring F[t] another example of an algebra. Recursive definition of non-negative powers of a square
matrix, A^{n}, and of an L in End(V), L^{n}.

Evaluation of any polynomial f(t) in F[t] at a square matrix A or at an L in End(V), f(A) and f(L).

Concepts and facts about F[t], for example, degree of a nonzero polynomial, Euclidean Algorithm in F[t], root f(a) = 0 for a in F iff there is a linear factor (t-a) in f(t), irreducible polynomials in F[t].

Discussion of when a polynomial f(t) in F[t] is satisfied by a square matrix A, f(A) = 0 matrix, or satisfied by an L in End(V), f(L) = 0 map on V.

Material on determinants: their definition using permutations or by cofactor expansion, their properties, and methods of calculating them (the permutation definition, cofactor expansions, the crosshatching method for matrices of size n = 2 or n = 3 ONLY, and row operations).
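The permutation definition can be written out directly in code. This is a Python sketch of my own, practical only for small n since it sums n! terms:

```python
from itertools import permutations

def sign(p):
    """Sign of a permutation of (0, ..., n-1): +1 or -1 by counting inversions."""
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def det(A):
    """Determinant via the permutation expansion:
    det(A) = sum over permutations p of sign(p) * prod_i A[i][p(i)]."""
    n = len(A)
    total = 0
    for p in permutations(range(n)):
        term = sign(p)
        for i in range(n):
            term *= A[i][p[i]]
        total += term
    return total
```

For n = 2 this reproduces the familiar ad - bc, and det(I_{n}) = 1 as required.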

The use of determinant to get the characteristic polynomial, det(tI_{n} - A), whose roots
give the eigenvalues of A, and whose expression as a product of powers of distinct linear factors
gives the algebraic multiplicities.

Eigenspaces, their properties, and how to decide if a matrix can be diagonalized or not. Theorems about eigenspaces and diagonalizability.

Independence of the union of bases for distinct eigenspaces.

Geometric multiplicity and its relationship to algebraic multiplicity for each eigenvalue.

L invariant subspace W of V for L in End(V), restriction of L to W, properties of the restricted L.

Classical adjoint of square matrix A, adj(A), and property that A adj(A) = det(A)I_{n} = adj(A) A.

Cayley-Hamilton theorem, characteristic polynomial of A is satisfied by A.
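For a 2x2 matrix the theorem can be verified in a few lines (an illustrative sketch of mine; the matrix is arbitrary). Here the characteristic polynomial is t^2 - tr(A) t + det(A), and substituting A for t should give the zero matrix.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
tr = 1 + 4                  # trace of A
dt = 1 * 4 - 2 * 3          # determinant of A
A2 = matmul(A, A)
# Cayley-Hamilton: A^2 - tr(A) A + det(A) I_2 should be the zero matrix
CH = [[A2[i][j] - tr * A[i][j] + dt * (i == j) for j in range(2)]
      for i in range(2)]
```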

Definition of minimal polynomial of A, m_{A}(t), and its properties and relationship to characteristic
poly of A.

Sums and direct sums of subspaces of V. dim(W_{1} + W_{2}) = dim(W_{1}) + dim(W_{2}) - dim(W_{1} ∩ W_{2}).

Here is a list of topics covered in lectures which may be covered on Exam 2, Nov. 30, 2020.

Generalized eigenspaces for L in End(V) assuming all eigenvalues of L are in the field F.

Primary decomposition theorem, that V is a direct sum of the generalized eigenspaces for L.

L is diagonalizable iff its minimal polynomial is a product of distinct linear factors.

Quotient spaces V/W for W any subspace of V. Definitions and basic theorems, for example: For L:V-->W and U a subspace of V, there exists an induced linear map L^{-}:V/U-->W such that L equals L^{-} composed with the projection map from V onto V/U iff U is contained in Ker(L).

Definition of when vector spaces are isomorphic. First and Second Isomorphism Theorems.

For L:V-->V and W an L-invariant subspace of V, existence of the induced linear map L^{-}:V/W-->V/W, and theorem about the block upper triangular form of a matrix representing such an L obtained by extending a basis of W to a basis of V. Applications to the characteristic and minimal polynomials of L.

Jordan blocks and Jordan canonical form theorem for L:V-->V with all eigenvalues in the field F.

Correspondence between a basic Jordan block and a special basis of an L-invariant subspace of V coming from a chain of generalized eigenspaces for L.

The combinatorics of Jordan blocks for a fixed eigenvalue, connection with the partition function p(n).

The meaning of the number of basic Jordan blocks for a fixed eigenvalue, and of the size of the largest block.

Problems about counting all possible Jordan forms when given characteristic and minimal polynomials of L.
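As a hypothetical instance of this kind of counting (my own example): if an eigenvalue has algebraic multiplicity 4 and appears with exponent 2 in the minimal polynomial, then its Jordan block sizes form a partition of 4 whose largest part is exactly 2. The sketch below enumerates the possibilities.

```python
def partitions(n, max_part):
    """Generate all partitions of n with parts at most max_part,
    each listed in weakly decreasing order."""
    if n == 0:
        yield []
        return
    for p in range(min(n, max_part), 0, -1):
        for rest in partitions(n - p, p):
            yield [p] + rest

# Algebraic multiplicity 4, minimal-polynomial exponent 2: the largest
# Jordan block has size exactly 2, so keep only partitions led by a 2.
shapes = [p for p in partitions(4, 2) if p[0] == 2]
# two possible Jordan structures: blocks of sizes 2+2 or 2+1+1
```

Dropping the filter recovers the connection with the partition function p(n): with no minimal-polynomial constraint, an eigenvalue of algebraic multiplicity n admits p(n) possible block structures.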

Problems about finding a Jordan form basis for L:V-->V when dim(V) is small.

Assuming the characteristic polynomial of L:V-->V is factored into powers of irreducible polynomials, results about the Primary Decomposition Theorem and the Rational Canonical Form (RCF) matrix representing L.

Definition of cyclic subspace Z(v,L) generated by v and L, and of the Companion matrix C(f(t)) of a monic polynomial f(t).

Theorem that the characteristic and minimal polynomials of a Companion matrix C(f(t)) are both equal to f(t).

RCF as a block diagonal form matrix representing L made from Companion matrix blocks, each coming from a cyclic subspace.

The combinatorics of Companion matrix blocks related to the characteristic and minimal polynomials of L.

The standard dot product on R^{n} and its properties: bilinear, symmetric, positive definite.

Definition of length of a vector, ||v||, the Cauchy-Schwarz inequality, definition of angle between vectors using the standard dot product.

Definition of orthogonal (perpendicular) sets of vectors in R^{n}, and of orthonormal sets of vectors.

Theorem: Any orthogonal set of non-zero vectors is independent.

Theorem: With respect to an orthogonal basis S of R^{n}, the coordinates of any vector v with respect to S can be computed using the dot product.

Definition of an orthogonal nxn real matrix, A transpose equals A inverse, A^{T} = A^{-1}.

Theorem: A is an orthogonal matrix iff the set of its columns forms an orthonormal set in R^{n}.

Applications of the standard dot product to geometry in R^{n}, projection maps.

Projection of one vector onto another. Projection of any vector v in R^{n} into a given subspace W.

General solution for Proj_{W}(v) by solving a linear system.

A better solution using projection maps if you have an orthogonal basis of W.

Gram-Schmidt orthogonalization process in R^{n}, a method to convert a basis of subspace W into an orthogonal basis of W.

Normalization to unit vectors then gives an orthonormal basis of W.
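A minimal sketch of the process (my own illustration, in exact arithmetic; the starting basis is arbitrary): subtract from each vector its projections onto the previously produced orthogonal vectors.

```python
from fractions import Fraction

def dot(u, v):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(basis):
    """Convert a basis of a subspace W of R^n into an orthogonal basis of W
    by subtracting, from each vector, its projections onto the earlier ones."""
    ortho = []
    for v in basis:
        w = [Fraction(x) for x in v]
        for u in ortho:
            c = dot(w, u) / dot(u, u)        # projection coefficient (w,u)/(u,u)
            w = [wi - c * ui for wi, ui in zip(w, u)]
        ortho.append(w)
    return ortho

B = gram_schmidt([[1, 1, 0], [1, 0, 1]])     # orthogonal basis of a plane in R^3
```

Dividing each output vector by its length would then give the orthonormal basis.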

Theorems about orthogonal matrices in relation to the standard dot product of R^{n}.

Theorem: For any real symmetric matrix, A, and any two distinct eigenvalues of A, their eigenspaces are orthogonal.

Theorem: For any real symmetric matrix, A, all its eigenvalues are real.

Theorem: For any real symmetric matrix, A, there is an orthogonal matrix, P, such that P^{T}AP = D is diagonal.
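Here is a numerical check of this theorem on a small example of my own: A = [[2, 1], [1, 2]] has characteristic polynomial t^2 - 4t + 3, hence real eigenvalues 1 and 3, with orthogonal eigenvectors (1, -1) and (1, 1).

```python
import math

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[2, 1], [1, 2]]                 # real symmetric
s = 1 / math.sqrt(2)                 # normalize the eigenvectors to unit length
P = [[s, s], [-s, s]]                # columns are the unit eigenvectors, so P is orthogonal
Pt = [[P[j][i] for j in range(2)] for i in range(2)]   # P transpose = P inverse
D = matmul(Pt, matmul(A, P))         # should be diag(1, 3) up to rounding
```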

Pythagorean Theorem and the Triangle inequality in R^{n}.

Standard dot product in complex n-space C^{n}. Properties: sesquilinear, conjugate symmetric, positive definite.

Hermitian conjugate A^{*} = A^{H}, of a complex matrix, A.

Definitions: A is called Hermitian when A^{*} = A, skew-Hermitian when A^{*} = -A, and unitary when A^{*} = A^{-1}.

Theorem: A is unitary iff the set of its columns forms an orthonormal basis of C^{n}.

Theorem: All eigenvalues of a Hermitian matrix are real.

Orthogonal complement S^{perp} of a subset, S, in R^{n} or C^{n}, equals span(S)^{perp}.

Definition of an orthogonal direct sum of subspaces. For any subspace, W, in F^{n}, W + W^{perp} = F^{n}.

For real symmetric matrix, A, the sum of its distinct eigenspaces is an orthogonal direct sum.

Definition: Complex nxn matrix A is called normal when A A^{*} = A^{*} A, that is, A commutes with its conjugate transpose.

Theorems in Section 13.9 about when a complex nxn matrix is diagonalizable.

General real Inner Product Spaces (IPS), definition of any bilinear function ( , ):VxV-->R which is symmetric and positive definite.

Cauchy-Schwarz inequality for a general real IPS gives the angle between vectors, cos(theta) = (v,w)/(||v||.||w||), where the length of v is ||v||. The distance between vectors v and w is ||v-w||. v and w are orthogonal when (v,w) = 0.

For V with basis S = {v_{1},...,v_{n}}, a bilinear form on V is determined by its matrix w.r.t. S, M_{S} = [(v_{i},v_{j})].

Then (v,w) = [v]_{S}^{T} M_{S} [w]_{S} gives the form for any vectors v and w in terms of their coordinates w.r.t S and M.

The bilinear form is symmetric iff M_{S} = M_{S}^{T} is symmetric.

An nxn real matrix M is positive definite iff X^{T} M X > 0 for any non-zero column vector X in R^{n}.

Examples from spaces of continuous functions {f:[a,b]-->R | f is continuous} where (f,g) is the definite integral from a to b of f(t)g(t).

Theorem: Any subspace of an IPS with inner product ( , ) is also an IPS with the same inner product.

Applications: Get many examples of IPS from subspaces of continuous functions on an interval.

Gram-Schmidt process can be done in any IPS.

If V is a finite dimensional IPS with two bases, S and T, we found the relationship between the matrices M_{S} and M_{T} representing ( , ) w.r.t. the two bases, using the transition matrix P = _{S}P_{T}.

That relationship, M_{T} = P^{Transpose} M_{S} P, was defined to be "congruence".

For a real nxn symmetric positive definite matrix, M, the nicest matrix congruent to M is diagonal, with the eigenvalues of M on the diagonal, all of which must be positive.

We discussed the case of a general bilinear form f(v,w) on a real finite dimensional vector space, and we defined the rank of the form as the rank of any matrix representing it.

We defined when form f is non-degenerate, rank(f) = dim(V), and when it is degenerate, rank(f) < dim(V).

We defined when a bilinear form f on V is alternating, f(v,v) = 0 for all v in V, and when it is skew-symmetric, f(v,w) = -f(w,v).

We discussed the theorem giving a canonical block diagonal form for a matrix M representing an alternating f.

We discussed other theorems from the textbook about symmetric bilinear forms, and the space of all bilinear forms on V.

We defined and discussed a quadratic form q:V-->F and Sylvester's Law of Inertia for real quadratic forms, signature and rank of the form.

We generalized previous results to Hermitian forms f:VxV-->C for V a complex vector space.

Dual space V^{*} = Lin(V,F) was defined and studied. For any basis S of V we defined the dual basis S^{*} of V^{*}.

For V = F^{n} we discussed how to get an isomorphism between V^{*} and F_{n} so that matrix multiplication gave the evaluation of a linear functional on a vector.

Theorem: Let V have basis S = {v_{1},...,v_{n}} with dual basis S^{*} = {f_{1},...,f_{n}} of V^{*}. Then the coordinate vector [v]_{S} is the column vector [f_{i}(v)]. There is a similar statement for [f]_{S*} for any f in V^{*}.

For two bases S and T of V, with dual bases S^{*} and T^{*} in V^{*}, we found that the relationship between the transition matrices P = _{S}P_{T} and Q = _{S*}P_{T*} is Q = (P^{-1})^{transpose}.

We discussed the double dual space V^{**} and its relationship to V.

We defined and discussed the annihilator of a subset S in V, Ann(S) = S^{0} = {f in V^{*} | f(s) = 0 for all s in S}.

We defined and discussed the transpose of a linear map L:V-->U to be the map L':U^{*}-->V^{*} where L'(f) equals f composed with L for any f in U^{*}.

We proved that transpose map L' is linear, and if A = _{T}[L]_{S} represents L from S to T, and S^{*} and T^{*} are dual bases, and B = _{S*}[L']_{T*} represents L' from T^{*} to S^{*}, then B = A^{transpose}.

We discussed linear operators on an IPS with inner product <u,v> when the field is either R or C as separate cases.

For L:V-->V we defined and discussed the adjoint operator L^{*}:V-->V defined by condition <L(u),v> = <u,L^{*}(v)> for all u, v in V.

We discussed what this means for the matrices representing L and L^{*} w.r.t. a standard basis, and for a general basis of V.

We proved there is always a unique adjoint operator L^{*} determined by L, and discussed its properties in problem 13.5.

We discussed linear functionals in the dual space V^{*} when V is an IPS, and defined a map ^:V-->V^{*} by u^(v) = <v,u> for any u and v in V.

We defined L:V-->V as self-adjoint when L = L^{*}, and similarly for matrices.

For any L:V-->V we found that H = L^{*} composed with L is self-adjoint, and similarly for matrices.

Theorem: If A is an invertible nxn complex matrix, then A^{*} A is positive definite Hermitian.

We defined L:V-->V as skew-adjoint when L^{*} = -L, and similarly for matrices.

On page 381 of our textbook is a table and a theorem relating many of these concepts, and giving useful analogies.

Theorem: If L:V-->V is self-adjoint with distinct eigenvalues, then the corresponding eigenspaces are orthogonal.

For an IPS V, an operator L:V-->V with L^{*} = L^{-1} is called orthogonal when F = R and unitary when F = C.

A theorem giving equivalent conditions when dim(V) is finite is proved in Problem 13.10. A counterexample is given when dim(V) is infinite.

I defined the subsets U(n) = {nxn complex unitary matrices}, and O(n) = {nxn real orthogonal matrices}, and left as an exercise to prove they are both groups under matrix multiplication.

We defined and discussed the relationship of unitary equivalence between two nxn complex matrices, a generalization of the orthogonal equivalence relationship between two nxn real matrices.

For V an IPS we defined and discussed when an operator L:V-->V is called positive definite or positive semidefinite.

We discussed diagonalization and canonical forms results for operator L on an IPS V in the cases when F = R or F = C separately.

The Spectral Theorem was given briefly in the textbook at the end of that section.

I presented material about direct products and direct sums indexed by an arbitrary set I.

Theorem: For any field F, F^{I} = {f:I-->F} is a vector space under componentwise + and scalar multiplication.

Definition: The direct sum of F indexed by I is the subspace {f in F^{I} | support(f) is finite}.

I presented a general definition of a direct product of a family of vector spaces {V_{i} | i in I} indexed by any set I.

That direct product forms a vector space under componentwise + and scalar multiplication.

I defined the direct sum of a family of vector spaces {V_{i} | i in I} indexed by any set I as the subspace of functions in the direct product with finite support.

In the case of I = {1,...,m} finite, the direct sum equals the direct product, and is isomorphic to the Cartesian product
V_{1} x ... x V_{m}.

In the case when all V_{i} = V for a fixed vector space, V, we get V^{m} as m-tuples of vectors from V.

We defined map L:V^{m}-->U to be multilinear (m-linear) when it is linear in each input component separately.

We defined a multilinear map as above to be alternating when it is zero whenever two distinct input components are equal vectors.

We discussed the bijection between mxn matrices over the field F and m-tuples of row vectors from F_{n}.

We applied these concepts to understand det: F^{n}_{n} --> F as the unique multilinear alternating map from (F_{n})^{n} to F such that det(I_{n}) = 1.

Students presented talks about material on tensor products from Appendix A, and I presented additional material relating multilinear maps on Cartesian products to linear maps on tensor products, as well as some applications.

When I give an exam, I make a graph of the numerical grades, and based on the average and the distribution, I decide what range of scores corresponds to each letter grade. This allows me to give each student a letter grade as well as a number grade, and the Total of all points earned will also be given a letter grade. The letter grades on the exams indicate how a student is doing, and will be taken into consideration in making the curve for the Totals. The course grade will be determined by the curve of Total points earned as well as by the quality of presentations given and of homeworks completed.

For each section of material covered there will be an assignment of problems from the textbook. They will be due one week from the day they are assigned (or the next scheduled class meeting after that if there is a holiday). Late assignments will be accepted at the discretion of the Professor. Assignments will be examined by the professor, and returned with comments. QUESTIONS ABOUT PROBLEMS SHOULD BE ASKED OF THE PROFESSOR AT THE BEGINNING OF CLASS OR IN OFFICE HOURS. Although homeworks will not be precisely graded, the number of homeworks attempted and the quality of the attempts will be considered as a factor in determining your course grade. Collaboration among students on homeworks is reasonable and encouraged, but the solutions turned in should be written in your own words. As in professional collaborations, if the key ideas of a proof were worked out by more than one person, then the paper turned in should state clearly that the results were obtained in collaboration, and those involved should be named to give credit.

CLASS ATTENDANCE IS ABSOLUTELY ESSENTIAL. I hope that I can stimulate your interest and participation in the classroom, so that I am not the only one talking. If you are prepared to talk about some of the material, you may take the floor and do the lecturing. There is no better way of learning material than to teach it yourself to others. This can be done individually or in teams, but it takes some planning to be ready ahead of time. I cannot force you to do this, but if you have any serious interest in an academic career, I strongly recommend this preparation. The theoretical material is rather abstract, and it is necessary to understand the theory in order to do sensible calculations and interpret them correctly. Exams will be a combination of theory questions (proofs) and calculations appropriate for a course of this level. Lectures can be interrupted at any time for questions or comments. At the start of each class be ready to ask questions about homework problems or about the previous lecture.

If classes are presented online, attendance is still absolutely essential, but Panopto recordings of my lectures will be made and links posted on this webpage, along with pdf files of my written lecture notes. Classroom participation is still possible using Zoom, but student presentations would require a method of showing your writing and hearing your voice. Exams may also need to be supplemented by an oral interview component, in order to verify that each student did and understood their problem solutions.

The shift to remote and hybrid teaching due to the COVID-19 pandemic has required that both instructors and students make changes to their normal working protocols for courses. Students are asked to practice extra care and attention in regard to academic honesty, with the understanding that all cases of plagiarism, cheating, multiple submission, and unauthorized collaboration are subject to penalty. Students must properly cite and attribute all sources used for papers and assignments. Students may not collaborate on exams or assignments, directly or through virtual consultation, unless the instructor gives specific permission to do so. Posting an exam, assignment, or answers to them on an online forum (before, during, or after the due date), in addition to consulting posted materials, constitutes a violation of the university's Honesty policy. Likewise, unauthorized use of live assistance websites, including seeking "expert" help for specific questions during an exam, can be construed as a violation of the honesty policy. All students should be familiar with the University's Student Academic Honesty Code.

File last modified on 12-7-2020.