Which of the following are subspaces?


Determine whether the given set $S$ is a subspace of the vector space $V$.

A. $V=\mathbb{R}^{n\times n}$, and $S$ is the subset of all $n\times n$ matrices with $\det(A)=0$.

B. $V$ is the space of three-times differentiable functions $\mathbb{R}\to\mathbb{R}$, and $S$ is the subset of $V$ consisting of those functions satisfying the differential equation $y'''+2y=x^2$.

C. $V=P^3$, and $S$ is the subset of $P^3$ consisting of all polynomials of the form $p(x)=ax^3+bx$.

D. $V=\mathbb{R}^n$, and $S$ is the set of solutions to the homogeneous linear system $Ax=0$, where $A$ is a fixed $m\times n$ matrix.

E. $V$ is the vector space of all real-valued functions defined on the interval $[a,b]$, and $S$ is the subset of $V$ consisting of those functions satisfying $f(a)=3$.

F. $V=\mathbb{R}^{n\times n}$, and $S$ is the subset of all upper triangular matrices.

G. $V$ is the vector space of all real-valued functions defined on the interval $(-\infty,\infty)$, and $S$ is the subset of $V$ consisting of those functions satisfying $f(0)=0$.

I have attempted to solve this by checking that the zero vector (or zero polynomial/function) is in the set, so that the set is non-empty, and that the set is closed under addition and under scalar multiplication. I found that A, C, D, F, and G satisfy these properties, but I am not sure whether I am correct, and I do not understand how to work with B. I know that E is not a subspace because the zero element is not in the set. Can someone help me? Thank you.


You're right about C, D, F, and G, and about E failing. But look at A again: the set of matrices with zero determinant is not closed under addition (the other answer gives a counterexample). As for B, note that the function $y=0$ does not satisfy the differential equation, so the zero element is not in $S$, and $S$ cannot be a subspace.
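In fact, closure fails as well: if $y_1$ and $y_2$ both satisfy the equation, then by linearity of differentiation
$$(y_1+y_2)'''+2(y_1+y_2)=(y_1'''+2y_1)+(y_2'''+2y_2)=x^2+x^2=2x^2\neq x^2,$$
so $y_1+y_2\notin S$. The solution set of a nonhomogeneous linear equation like this one is a translate of the solution space of the associated homogeneous equation $y'''+2y=0$, not a subspace itself.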


I'm afraid you got one wrong.

A. False (not closed under addition)

B. False (the zero element is not in the set); here the zero element is the zero constant function.

C. True

D. True

E. False (the zero element is not in the set)

F. True

G. True
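For the cases marked True, both closure checks are one-line computations. For D: if $Ax=0$ and $Ay=0$, then
$$A(x+y)=Ax+Ay=0 \qquad\text{and}\qquad A(cx)=c\,Ax=0,$$
and $A0=0$, so the solution set (the null space of $A$) is a subspace of $\mathbb{R}^n$. The same pattern handles C, F, and G: $(ax^3+bx)+(cx^3+dx)=(a+c)x^3+(b+d)x$ again has the required form, sums and scalar multiples of upper triangular matrices are upper triangular, and if $f(0)=g(0)=0$ then $(f+g)(0)=0$ and $(cf)(0)=c\,f(0)=0$.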

Why is A false? In the case of $n=2$, consider $$ \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}= \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}+ \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} $$ and generalize for all $n>1$. On the other hand, A is true for $n=1$, because the only $1\times1$ matrix having zero determinant is the null matrix.
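If you want a quick numerical sanity check of the $2\times 2$ counterexample, here is a sketch using NumPy (the names `M1` and `M2` are just illustrative):

```python
import numpy as np

# Two singular matrices (both in S) whose sum is the identity (not in S).
M1 = np.array([[1.0, 0.0],
               [0.0, 0.0]])
M2 = np.array([[0.0, 0.0],
               [0.0, 1.0]])

print(np.linalg.det(M1))       # 0.0 -> M1 is in S
print(np.linalg.det(M2))       # 0.0 -> M2 is in S
print(np.linalg.det(M1 + M2))  # 1.0 -> the sum is not in S, so S is not closed under addition
```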