If M is square, as it usually is, then transposing M leaves the main diagonal unchanged. This means the trace is unchanged. In fact the determinant is unchanged, and with it the norm.
Subtract s from the main diagonal and take the determinant again. The resulting characteristic polynomial is the same for M and M^T, hence the eigenvalues are the same, including their multiplicities.
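If you like, verify these invariants numerically; here is a minimal sketch, assuming Python with numpy and an arbitrary 3×3 matrix.

    import numpy as np

    # an arbitrary square matrix
    M = np.array([[2.0, 1.0, 0.0],
                  [3.0, 5.0, 4.0],
                  [6.0, 7.0, 9.0]])

    print(np.trace(M) == np.trace(M.T))                       # True: same trace
    print(np.isclose(np.linalg.det(M), np.linalg.det(M.T)))   # True: same determinant
    print(np.allclose(np.sort(np.linalg.eigvals(M)),
                      np.sort(np.linalg.eigvals(M.T))))       # True: same eigenvalues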
The conjugate of a complex matrix is obtained by conjugating each of its entries. The tranjugate is the transpose of the conjugate, written M*. Note that M* = M^T when M is real.
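In numpy (assumed, as above), the tranjugate is conjugation followed by transposition.

    import numpy as np

    M = np.array([[1 + 2j, 3 - 1j],
                  [4j,     5]])
    Mstar = M.conj().T            # the tranjugate M*

    R = np.array([[1.0, 2.0],
                  [3.0, 4.0]])    # a real matrix
    print(np.array_equal(R.conj().T, R.T))   # True: M* = M^T when M is real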
A symmetric or hermitian matrix has M = M*. (We usually use the word symmetric when M is real, hermitian when M is complex.)
A skew symmetric or skew hermitian matrix has M = -M*.
The diagonal of a hermitian matrix is real, whereas the diagonal of a skew hermitian matrix is pure imaginary. This is because each diagonal entry equals its own conjugate in the first case, and the negative of its own conjugate in the second.
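Here is a small numerical check of both facts, with two hand-picked matrices (numpy assumed).

    import numpy as np

    H = np.array([[3, 2 - 1j],
                  [2 + 1j, 5]])       # hermitian: H = H*
    K = np.array([[1j, 2],
                  [-2, 3j]])          # skew hermitian: K = -K*

    print(np.allclose(H, H.conj().T), np.diag(H))    # True, diagonal [3, 5] is real
    print(np.allclose(K, -K.conj().T), np.diag(K))   # True, diagonal [1j, 3j] is imaginary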
As an exercise, show that every matrix is a unique sum of a hermitian matrix and a skew hermitian matrix.
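If you want to check your answer numerically, the sketch below (numpy assumed) tests the natural candidates for the two pieces; consider it a spoiler.

    import numpy as np

    rng = np.random.default_rng(1)
    M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

    H = (M + M.conj().T) / 2      # the hermitian part
    K = (M - M.conj().T) / 2      # the skew hermitian part

    print(np.allclose(H, H.conj().T))    # True: H is hermitian
    print(np.allclose(K, -K.conj().T))   # True: K is skew hermitian
    print(np.allclose(M, H + K))         # True: they sum to M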
Show that (AB)^T = B^T×A^T, and (AB)* = B*×A*. If A and B are inverses, write AB = 1 and take the tranjugate of everything. This shows B* and A* are inverses. In other words, the inverse of the tranjugate is the tranjugate of the inverse.
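All three identities are easy to spot-check numerically (numpy assumed; the matrices are random, hence invertible with probability 1).

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

    print(np.allclose((A @ B).T, B.T @ A.T))                        # (AB)^T = B^T A^T
    print(np.allclose((A @ B).conj().T, B.conj().T @ A.conj().T))   # (AB)* = B* A*
    # the inverse of the tranjugate is the tranjugate of the inverse
    print(np.allclose(np.linalg.inv(A.conj().T), np.linalg.inv(A).conj().T))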
Let the vectors x and y be drawn from an arbitrary real or complex vector space S, with a valid dot product. The function f, from S into S, is hermitian if f(x).y = x.f(y) for all vectors x and y.
Let's prove that every hermitian operator is linear. You will need to know that the space perpendicular to the space perpendicular to a subspace is that same subspace. This is pretty obvious, but you can review the proof here.
Let x, y, and z be vectors, and let c be a scalar. Watch what happens to f(cx).
f(cx).y =
cx.f(y) =
c×(x.f(y)) =
c×(f(x).y) =
cf(x).y
If y is perpendicular to f(x), with a zero dot product, then by the chain above y is also perpendicular to f(cx). This holds for all vectors y in the subspace P perpendicular to f(x). Therefore f(cx) is perpendicular to P, and by the double perpendicular fact cited earlier, f(cx) lies in the span of f(x); it is a multiple of f(x).
For some scalar b, f(cx) = b×f(x). Dot both sides with f(x): f(cx).f(x) = b×(f(x).f(x)). At the same time, f(cx).f(x) = cx.f(f(x)) = c×(x.f(f(x))) = c×(f(x).f(x)), using the hermitian property twice. If f(x) is 0 then b×f(x) = 0, so we may as well write f(cx) = c×f(x). When f(x) is nonzero, f(x).f(x) is also nonzero. Hence b = c, and f commutes with scaling by a constant.
Now show f commutes with addition.
f(x+y).z =
(x+y).f(z) =
x.f(z) + y.f(z) =
f(x).z + f(y).z
If d = f(x+y)-(f(x)+f(y)), then d.z = 0 for every vector z. The only vector that is perpendicular to everything is 0, hence f(x+y) = f(x)+f(y), and f is linear.
If f and g are hermitian, show that f+g is hermitian, and that c×f is hermitian when c is a real scalar. (A complex c fails, because the dot product conjugates a scalar pulled out of its second operand.) Both follow directly from the dot product formula. Therefore the set of hermitian operators forms a vector space over the reals.
Let M be a hermitian matrix and consider x×M.y, where x is a row vector and y is a column vector. Here x×M runs x through the matrix M, thus implementing f(x). This result is dotted with y, which is the same as multiplying by the column vector z, where z is the conjugate of y. So f(x).y = x×M×z.
Next evaluate x.f(y). Remember that f(y) is y×M, where y has become a row vector on the left. The result, y×M, becomes the second operand of the dot product: x.(y×M). In matrix notation, this is x times the conjugate of y×M, turned into a column vector. That column is M*×z, where z is the conjugate of y. Since M is hermitian, M* = M, and we obtain the same formula as before: x×M×z. A hermitian matrix implements a hermitian operator.
For the converse, let the matrix M implement a hermitian operator, and let x and y be basis vectors, with the ith component of x and the jth component of y equal to 1 and all other components equal to 0. Now x×M.y pulls out M_{i,j}, while x.(y×M) pulls out the conjugate of M_{j,i}. These are equal, hence M_{i,j} is always the conjugate of M_{j,i}, and M is a hermitian matrix.
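The correspondence can be tested numerically. This sketch (numpy assumed) follows the conventions above: a vector runs through M on the left, and the conjugate in the dot product falls on the second operand.

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    M = A + A.conj().T                  # hermitian by construction

    def dot(u, v):
        # the dot product used in the text: sum of u_i times conj(v_i)
        return np.sum(u * np.conj(v))

    def f(x):
        return x @ M                    # run the row vector x through M

    x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

    print(np.isclose(dot(f(x), y), dot(x, f(y))))   # True: f(x).y = x.f(y)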
Review the above proofs; they carry over with a sign change. A skew hermitian operator is linear, and the set of skew hermitian operators on S forms a vector space over the reals. When S is finite dimensional, f is skew hermitian iff it is implemented by a skew hermitian matrix.
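The skew version of the matrix check, in the same style (numpy assumed), verifies f(x).y = -x.f(y).

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    K = A - A.conj().T                  # skew hermitian by construction

    def dot(u, v):
        return np.sum(u * np.conj(v))   # conjugate on the second operand

    x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

    print(np.isclose(dot(x @ K, y), -dot(x, y @ K)))   # True: f(x).y = -x.f(y)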
Next let S be the space of differentiable real functions on [a,b] that agree at the endpoints; that is, f(a) = f(b). Note that S is still a vector space, and the dot product f.g = ∫f×g, taken over [a,b], is still well defined.
Let d be the operator that takes the derivative: d(f) = f′. This is a linear operator, but is it symmetric?
Remember that the dot product is the integral of f×g, so d(f).g = ∫f′g. Integrate by parts: this is fg, evaluated at the endpoints of the interval, minus ∫fg′. The function f×g takes the same value at a and b, so the boundary term drops out. We are left with -∫fg′, which is the same as -f.d(g). Therefore d(f).g = -f.d(g), and differentiation is a skew symmetric operator.
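To see this numerically, here is a sketch assuming Python with numpy, with [a,b] = [0,1] and two hand-picked functions that agree at the endpoints; both integrals come out near π.

    import numpy as np

    # sample [a,b] = [0,1] finely; both functions agree at the endpoints
    x = np.linspace(0.0, 1.0, 100001)
    dx = x[1] - x[0]

    f = np.sin(2 * np.pi * x)                 # f(0) = f(1) = 0
    g = np.cos(2 * np.pi * x)                 # g(0) = g(1) = 1
    fp = 2 * np.pi * np.cos(2 * np.pi * x)    # f'
    gp = -2 * np.pi * np.sin(2 * np.pi * x)   # g'

    def integrate(h):
        # trapezoid rule over the sampled interval
        return np.sum((h[:-1] + h[1:]) / 2) * dx

    print(integrate(fp * g), -integrate(f * gp))   # both near pi: d(f).g = -f.d(g)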