Suppose V is a real vector space of finite dimension n. Let M(V) denote the set of linear transformations from V into itself, and let T ∈ M(V). Consider the two subspaces

U = {X ∈ M(V) : TX = XT} and W = {TX − XT : X ∈ M(V)}.

Which of the following must be TRUE?

I. If V has a basis consisting only of eigenvectors of T, then U = M(V).
II. dim(U) + dim(W) = n^2
III. dim(U) < n
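One way to build intuition for statement II is to note that U is the kernel and W is the image of the linear map ad_T : X ↦ TX − XT on M(V), so rank–nullity gives dim(U) + dim(W) = n². Below is a minimal numerical sketch of this check using NumPy; the names `ad_T` and `rng`, and the choice n = 4 with a random T, are illustrative assumptions, not part of the question.

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
T = rng.standard_normal((n, n))  # a generic operator on V ≅ R^n

# ad_T acts on M(V) ≅ R^(n^2) by X ↦ TX − XT. With column-stacking
# vectorization, vec(TX) = (I ⊗ T) vec(X) and vec(XT) = (Tᵀ ⊗ I) vec(X),
# so the matrix of ad_T is I ⊗ T − Tᵀ ⊗ I.
I = np.eye(n)
ad_T = np.kron(I, T) - np.kron(T.T, I)

rank = np.linalg.matrix_rank(ad_T)  # dim(W), the image of ad_T
nullity = n**2 - rank               # dim(U), the kernel (commutant of T)

# Statement II is just rank–nullity for ad_T:
assert nullity + rank == n**2

# Relevant to statement III: the powers I, T, T², …, T^(n-1) all commute
# with T, so for generic T the commutant already has dimension ≥ n here.
assert nullity >= n
```

For a generic T with distinct eigenvalues the commutant has dimension exactly n, which is one way to see why statement III (dim(U) < n) cannot hold.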