In introductory Linear Algebra classes, one often encounters the following problem: let $A$ be a real matrix, say an orthogonal one; then its eigenvalues are complex numbers of absolute value $1$. The only two such values inside $\mathbb{R}$ are $\pm 1$; hence, most eigenvalues of orthogonal matrices are not elements of $\mathbb{R}$. Now, let $V$ be a finite-dimensional Euclidean space and $f : V \to V$ an orthogonal map. If one fixes an orthonormal basis of $V$, one obtains an orthogonal matrix $A$ which represents $f$. One can talk about complex eigenvalues of $A$, but what about complex eigenvalues of $f$? What should these be? The equation $f(v) = \lambda v$ does not make sense for a complex number $\lambda \notin \mathbb{R}$, if $V$ is a vector space over $\mathbb{R}$.
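A standard example illustrating this (the angle $\theta$ is just an illustrative parameter): the rotation of the plane by $\theta$,
$$R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \qquad \chi_{R_\theta}(\lambda) = \lambda^2 - 2\cos\theta\,\lambda + 1, \qquad \lambda_{1,2} = \cos\theta \pm i\sin\theta = e^{\pm i\theta},$$
has both eigenvalues of absolute value $1$, and they are real only for $\theta \in \{0, \pi\}$.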
The usual solution to this is to complexify $V$: define $V_{\mathbb{C}} := V \times V$, and define an action of $\mathbb{C}$ on $V_{\mathbb{C}}$ by
$$(a + bi) \cdot (v, w) := (av - bw,\; aw + bv), \qquad a, b \in \mathbb{R},\; v, w \in V;$$
this turns $V_{\mathbb{C}}$ into a $\mathbb{C}$-vector space. If one identifies $V$ with its image under $V \to V_{\mathbb{C}}$, $v \mapsto (v, 0)$, then for all $(v, w) \in V_{\mathbb{C}}$, $(v, w) = v + iw$. Now we are left to extend $f$ to $V_{\mathbb{C}}$. It turns out that there is exactly one choice to extend $f$ to a $\mathbb{C}$-linear map $f_{\mathbb{C}} : V_{\mathbb{C}} \to V_{\mathbb{C}}$, i.e. such that $f_{\mathbb{C}}|_V = f$. Namely, one has to define $f_{\mathbb{C}}(v + iw) := f(v) + i f(w)$; this is obviously $\mathbb{R}$-linear, whence it suffices to show that $f_{\mathbb{C}}(i \cdot x) = i \cdot f_{\mathbb{C}}(x)$ for all $x = v + iw \in V_{\mathbb{C}}$:
$$f_{\mathbb{C}}(i(v + iw)) = f_{\mathbb{C}}(-w + iv) = f(-w) + i f(v) = i \bigl(f(v) + i f(w)\bigr) = i f_{\mathbb{C}}(v + iw).$$
Now if $(v_1, \dots, v_n)$ is an $\mathbb{R}$-basis of $V$, it is as well a $\mathbb{C}$-basis of $V_{\mathbb{C}}$; moreover, the matrix representing $f_{\mathbb{C}}$ with respect to this basis is the same matrix $A$ that represents $f$. If now $\lambda \in \mathbb{C}$ is a complex eigenvalue of $A$, then there exists some $x \in V_{\mathbb{C}} \setminus \{0\}$ such that $f_{\mathbb{C}}(x) = \lambda x$. So $\lambda$ is indeed an eigenvalue of $f_{\mathbb{C}}$. Abusing notation, we say that $\lambda$ is an eigenvalue of $f$; this will always mean that we are talking of $f_{\mathbb{C}}$. This process is called the complexification of $V$ and $f$.
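As a quick numerical illustration of this, here is a minimal NumPy sketch (the specific angle is an arbitrary choice): the real matrix of an orthogonal map, read as a complex matrix, has the predicted eigenvalues of absolute value $1$, with eigenvectors living in $V_{\mathbb{C}} \cong \mathbb{C}^2$ rather than in $\mathbb{R}^2$.

```python
import numpy as np

# Rotation by pi/3: an orthogonal map f of the Euclidean plane R^2.
theta = np.pi / 3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The complexification f_C is represented by the same matrix A, now viewed
# as a complex matrix; its eigenvalues are exactly the complex eigenvalues of A.
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                             # approx. 0.5 + 0.866j and 0.5 - 0.866j, i.e. e^{+-i*pi/3}
print(np.allclose(np.abs(eigvals), 1.0))   # True: absolute value 1

# Verify f_C(x) = lambda * x for the first eigenpair; x lives in C^2, not R^2.
lam, x = eigvals[0], eigvecs[:, 0]
print(np.allclose(A @ x, lam * x))         # True
```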
But does this generalize? What if $\mathbb{Q}$ is the base field and one has an eigenvalue such as $\sqrt[3]{2}$ of the matrix? Can we do the same thing here? And what if the base field is an arbitrary field $K$ and we have an eigenvalue in the algebraic closure $\overline{K}$? The answer is yes. The idea is as follows. A basis of $\mathbb{C}$ over $\mathbb{R}$ is given by $1$, $i$. Hence, we defined $V_{\mathbb{C}} = V \times V$, where the first $V$ corresponds to $1$ and the second to $i$: i.e. $(v, w)$ should mean $1 \cdot v + i \cdot w$. Now $\mathbb{Q}(\sqrt[3]{2})$ has a $\mathbb{Q}$-basis with three elements, so one could define $V' := V \times V \times V$ accordingly. And for $\overline{K}$, if $[\overline{K} : K] = \infty$, we need an infinite basis and an infinite direct sum.
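For a concrete instance of an eigenvalue outside the base field $\mathbb{Q}$ (the matrix here is just an illustrative choice): the companion matrix of $x^3 - 2$,
$$A = \begin{pmatrix} 0 & 0 & 2 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix} \in \mathbb{Q}^{3 \times 3},$$
has characteristic polynomial $x^3 - 2$, so its eigenvalues $\sqrt[3]{2}$, $\zeta_3 \sqrt[3]{2}$, $\zeta_3^2 \sqrt[3]{2}$ (with $\zeta_3$ a primitive third root of unity) all lie outside $\mathbb{Q}$; the first one lies in the cubic extension $\mathbb{Q}(\sqrt[3]{2})$.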
It would be nice if we could avoid working with bases, both of the vector space $V$ and of the field extension. This can indeed be done, using the tensor product. We begin with a very abstract definition.
Let $R$ be a ring and $M, N$ be $R$-modules. A pair $(T, \tau)$, where $T$ is an $R$-module and $\tau : M \times N \to T$ is $R$-bilinear, is said to be a tensor product of $M$ and $N$ over $R$ if the following universal property holds:
If $P$ is any $R$-module and $\beta : M \times N \to P$ is $R$-bilinear, there exists exactly one homomorphism $\varphi : T \to P$ such that $\varphi \circ \tau = \beta$.
Tensor products exist and are unique up to unique isomorphism. More precisely, if $(T, \tau)$ and $(T', \tau')$ are tensor products of $M$ and $N$ over $R$, there exists exactly one $R$-isomorphism $\varphi : T \to T'$ with $\varphi \circ \tau = \tau'$.
From now on, we write $M \otimes_R N$ for $T$ and $m \otimes n$ for $\tau(m, n)$, $m \in M$, $n \in N$. In case the base ring is clear, we will drop the subscript.
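To see the universal property in action, consider a small standard example: the scalar product $\beta : K^n \times K^n \to K$, $\beta(v, w) = \sum_i v_i w_i$, is $K$-bilinear but not linear on the product $K^n \times K^n$ (e.g. $\beta(2v, 2w) = 4\,\beta(v, w)$). By the universal property it corresponds to exactly one linear map on the tensor product,
$$\varphi : K^n \otimes_K K^n \to K, \qquad \varphi(v \otimes w) = \sum_i v_i w_i.$$
In this sense, the tensor product turns bilinear maps into linear ones.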
As we are interested in tensor products of vector spaces over a field, we can be more concrete.
Let $V$ and $W$ be $K$-vector spaces. Let $(v_i)_{i \in I}$ be a basis of $V$ and $(w_j)_{j \in J}$ be a basis of $W$. Then $(v_i \otimes w_j)_{(i,j) \in I \times J}$ is a basis of $V \otimes_K W$. In particular, $\dim_K(V \otimes_K W) = \dim_K V \cdot \dim_K W$.
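For instance, take $V = K^2$ with standard basis $(e_1, e_2)$ and $W = K^3$ with standard basis $(e'_1, e'_2, e'_3)$. Then the six elements $e_i \otimes e'_j$ form a basis of $K^2 \otimes_K K^3$, so $\dim_K(K^2 \otimes_K K^3) = 2 \cdot 3 = 6$, and a general element is
$$\sum_{i=1}^{2} \sum_{j=1}^{3} a_{ij}\, e_i \otimes e'_j, \qquad a_{ij} \in K,$$
which one may think of as the $2 \times 3$ matrix $(a_{ij})$; in other words, $K^2 \otimes_K K^3 \cong K^{2 \times 3}$.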
A different interpretation is that $V \otimes_K W$ is the set of linear combinations of elements of $W$, where the coefficients are elements of $V$. Hence, we extend the range of the coefficients of elements of $W$ from $K$ to $V$. Every element of $V \otimes_K W$ can be written in the form $\sum_{k=1}^{n} x_k \otimes y_k$ with $n \in \mathbb{N}$, $x_k \in V$, $y_k \in W$.
Now let $L$ be a field extension of $K$. Then $L$ is a $K$-vector space, whence we can consider the tensor product $V_L := L \otimes_K V$. As expected, this turns out to be an $L$-vector space with scalar multiplication $\mu \cdot (\lambda \otimes v) := (\mu\lambda) \otimes v$ for $\mu, \lambda \in L$, $v \in V$. In case $L = K$, this definition coincides with the natural $K$-vector space structure of $V \cong K \otimes_K V$.
Let us consider the special case $K = \mathbb{R}$, $L = \mathbb{C}$. Then $(1, i)$ is an $\mathbb{R}$-basis of $\mathbb{C}$; if $(v_j)_{j \in J}$ is an $\mathbb{R}$-basis of $V$, then $(1 \otimes v_j,\; i \otimes v_j)_{j \in J}$ is an $\mathbb{R}$-basis of $\mathbb{C} \otimes_{\mathbb{R}} V$: every element of $\mathbb{C} \otimes_{\mathbb{R}} V$ can be written in the form $1 \otimes v + i \otimes w$ with $v, w \in V$. Moreover, $(1 \otimes v_j)_{j \in J}$ is a $\mathbb{C}$-basis of $\mathbb{C} \otimes_{\mathbb{R}} V$. Compare this with the ad-hoc definition of $V_{\mathbb{C}} = V \times V$ at the beginning of this post.
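Concretely, for $V = \mathbb{R}^n$ this recovers the familiar picture: the $\mathbb{C}$-linear map
$$\mathbb{C} \otimes_{\mathbb{R}} \mathbb{R}^n \longrightarrow \mathbb{C}^n, \qquad \lambda \otimes v \longmapsto \lambda v,$$
is an isomorphism, under which $1 \otimes v + i \otimes w$ corresponds to $v + iw$, i.e. to the pair $(v, w)$ of the ad-hoc construction $V_{\mathbb{C}} = V \times V$.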
Now, let us consider what to do with $K$-linear maps $f : V \to W$, where $V$ and $W$ are $K$-vector spaces. We begin with a general result on tensor products.
Let $M, M', N, N'$ be $R$-modules, and let $f : M \to M'$ and $g : N \to N'$ be $R$-module homomorphisms. Then there exists exactly one $R$-homomorphism $f \otimes g : M \otimes_R N \to M' \otimes_R N'$ with $(f \otimes g)(m \otimes n) = f(m) \otimes g(n)$ for all $m \in M$, $n \in N$.
Set $P := M' \otimes_R N'$ and define
$$\beta : M \times N \to P, \qquad \beta(m, n) := f(m) \otimes g(n).$$
One quickly checks that $\beta$ is $R$-bilinear. Hence, by the definition of the tensor product $M \otimes_R N$, there exists exactly one $R$-homomorphism $f \otimes g : M \otimes_R N \to P$ with
$$(f \otimes g)(m \otimes n) = \beta(m, n) = f(m) \otimes g(n).$$
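For finite-dimensional spaces over $\mathbb{R}$, the defining property $(f \otimes g)(m \otimes n) = f(m) \otimes g(n)$ can be made tangible with a short NumPy sketch (the matrices below are arbitrary illustrative choices): identifying $\mathbb{R}^2 \otimes \mathbb{R}^3$ with $\mathbb{R}^6$ via the basis $(e_i \otimes e'_j)$, simple tensors become Kronecker products of coordinate vectors, and $f \otimes g$ is represented by the Kronecker product of the matrices.

```python
import numpy as np

# Illustrative matrices of f : R^2 -> R^2 and g : R^3 -> R^3.
A = np.array([[1., 2.],
              [0., 3.]])
B = np.array([[0., 1., 0.],
              [1., 0., 2.],
              [0., 0., 1.]])

x = np.array([1., -1.])        # m in M = R^2
y = np.array([2., 0., 5.])     # n in N = R^3

# Under the identification R^2 (x) R^3 = R^6, the simple tensor m (x) n is
# np.kron(x, y), and f (x) g is represented by np.kron(A, B).
lhs = np.kron(A, B) @ np.kron(x, y)   # (f (x) g)(m (x) n)
rhs = np.kron(A @ x, B @ y)           # f(m) (x) g(n)
print(np.allclose(lhs, rhs))          # True
```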
Now let us consider $K$-vector spaces $V$, $W$, a $K$-linear map $f : V \to W$ and the identity map $\mathrm{id}_L : L \to L$. By the theorem, there exists exactly one $K$-linear map
$$f_L := \mathrm{id}_L \otimes f : L \otimes_K V \to L \otimes_K W$$
with $f_L(\lambda \otimes v) = \lambda \otimes f(v)$ for $\lambda \in L$, $v \in V$. But since
$$f_L(\mu \cdot (\lambda \otimes v)) = f_L((\mu\lambda) \otimes v) = (\mu\lambda) \otimes f(v) = \mu \cdot (\lambda \otimes f(v)) = \mu \cdot f_L(\lambda \otimes v)$$
for all $\mu \in L$, using the $L$-vector space structure of $L \otimes_K V$ and $L \otimes_K W$, we obtain that $f_L$ is even $L$-linear.
Finally, let $(v_i)_{i \in I}$ be a $K$-basis of $V$ and $(w_j)_{j \in J}$ be a $K$-basis of $W$. Then $(1 \otimes v_i)_{i \in I}$ is as well an $L$-basis of $V_L = L \otimes_K V$ and $(1 \otimes w_j)_{j \in J}$ is as well an $L$-basis of $W_L = L \otimes_K W$, whence we can consider the matrix $A$ of $f$ with respect to $(v_i)$ and $(w_j)$ as well as the matrix of $f_L$ with respect to $(1 \otimes v_i)$ and $(1 \otimes w_j)$. Write $A = (a_{ji})$; then $f(v_i) = \sum_j a_{ji} w_j$. Now
$$f_L(1 \otimes v_i) = 1 \otimes f(v_i) = 1 \otimes \sum_j a_{ji} w_j = \sum_j a_{ji}\,(1 \otimes w_j).$$
Therefore, the matrix of $f_L$ with respect to $(1 \otimes v_i)$ and $(1 \otimes w_j)$ is $A$ as well.
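In particular, for $K = \mathbb{R}$, $L = \mathbb{C}$, $V = W = \mathbb{R}^2$ and $f$ a rotation by an angle $\theta$ (an orthogonal map as in the opening problem), $f_{\mathbb{C}}$ is represented with respect to $(1 \otimes e_1, 1 \otimes e_2)$ by the same matrix
$$\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix},$$
now read as a complex matrix, and its eigenvalues $e^{\pm i\theta}$ are honest eigenvalues of the $\mathbb{C}$-linear map $f_{\mathbb{C}}$.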
Hence, the tensor product allows us to describe $V_L = L \otimes_K V$ and $f_L = \mathrm{id}_L \otimes f$, as a generalization of the complexification of real vector spaces, in a very clean and abstract manner.
Finally, recall that every field $K$ has an algebraic closure $\overline{K}$, which is unique up to $K$-isomorphism. For $K$-vector spaces $V$, $W$ and $K$-linear maps $f : V \to W$ we get $\overline{K}$-vector spaces $V_{\overline{K}}$, $W_{\overline{K}}$ and a $\overline{K}$-linear map $f_{\overline{K}} : V_{\overline{K}} \to W_{\overline{K}}$. We have seen that every $K$-basis of $V$ resp. $W$ is also a $\overline{K}$-basis of $V_{\overline{K}}$ resp. $W_{\overline{K}}$, and that the matrix representation of $f_{\overline{K}}$ with respect to these bases equals the one of $f$. Hence, we can not just talk of arbitrary elements of $\overline{K}$ being eigenvalues of matrices over $K$, but also of endomorphisms $f : V \to V$ defined over $K$, by referring to $V_{\overline{K}}$ resp. $f_{\overline{K}}$ instead.
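As a final illustration (the matrix is just a simple example): the endomorphism $f$ of $\mathbb{Q}^2$ given by
$$A = \begin{pmatrix} 0 & 2 \\ 1 & 0 \end{pmatrix}$$
has characteristic polynomial $x^2 - 2$ and hence no eigenvalues, and no eigenvectors, in $\mathbb{Q}$ resp. $\mathbb{Q}^2$. But $f_{\overline{\mathbb{Q}}}$ has the eigenvalues $\pm\sqrt{2} \in \overline{\mathbb{Q}}$; for instance,
$$f_{\overline{\mathbb{Q}}} \begin{pmatrix} \sqrt{2} \\ 1 \end{pmatrix} = \begin{pmatrix} 0 & 2 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} \sqrt{2} \\ 1 \end{pmatrix} = \begin{pmatrix} 2 \\ \sqrt{2} \end{pmatrix} = \sqrt{2} \begin{pmatrix} \sqrt{2} \\ 1 \end{pmatrix},$$
so in the sense above, $\sqrt{2}$ is an eigenvalue of $f$.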