## Inclusion and exclusion in linear spaces

This can actually be generalized to the $aA + bB$ case. I will not provide a proof here, as it is quite straightforward, but I would like to state an easy and interesting corollary:

Thm1. If $A \in U$ and $B \not\in U$, then $aA + bB \not\in U$ for any $b \neq 0$.

This can be proved by contradiction: if $aA + bB \in U$ with $b \neq 0$, then $B = \frac{1}{b}\big((aA + bB) - aA\big)$ would also lie in $U$, contradicting $B \not\in U$.
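
Thm1 can be checked numerically on a small example. The subspace `U`, the vectors `A` and `B`, and the membership test below are all made up for this sketch; membership in the span is tested via least squares.

```python
import numpy as np

# Example subspace U = span{(1, 0, 0), (0, 1, 0)}, i.e. vectors in R^3
# whose third coordinate is zero. (Chosen only for illustration.)
U = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

def in_U(v, tol=1e-9):
    """Test membership in the row span of U by least-squares projection."""
    coeffs, *_ = np.linalg.lstsq(U.T, v, rcond=None)
    return np.allclose(U.T @ coeffs, v, atol=tol)

A = np.array([2.0, -1.0, 0.0])   # A is in U
B = np.array([1.0, 1.0, 1.0])    # B is not in U (nonzero third coordinate)

assert in_U(A) and not in_U(B)

# For any a and any b != 0, the combination aA + bB stays outside U.
for a, b in [(0.0, 1.0), (3.0, -2.0), (-1.0, 0.5)]:
    assert not in_U(a * A + b * B)
print("Thm1 holds on this example")
```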

With this under our belt, we can prove the following theorem:

Thm2. Every linear space $N$ of $n$-vectors has a basis.

Proof outline:

• $E_n = (e_1, e_2, \cdots, e_n)$ spans $R^n$, so every vector of $N$ is a linear combination of $E_n$
• The $N = R^n$ case is trivial ($E_n$ itself is a basis); henceforth we assume $N \subset R^n$ is a proper subspace
• Then not every $e_i$ is in $N$; otherwise we would have $N = R^n$
• Let's divide $E_n$ into two disjoint parts (reindexing if necessary): $E_k = (e_1, \cdots, e_k)$ containing all $e_i$ in $N$, and $E_k' = (e_{k+1}, \cdots, e_n)$ containing the rest.
• $N \subset R^n = \{ s \mid s = a_1e_1 + \cdots + a_ke_k + a_{k+1}e_{k+1} + \cdots + a_ne_n,\; a_i \in R \}$
• Let $T = \{ s \mid \text{at least one of } a_{k+1}, \cdots, a_n \text{ is not zero} \}$. Then according to Thm1, $N \cap T = \emptyset$, i.e. $N - T = N$. Subtracting $T$ from both sides of the equation in the previous step, we have \begin{align*} N &\subset R^n - T \\ &= \{ s \mid s = a_1e_1 + \cdots + a_ke_k + 0 + \cdots + 0,\; a_i \in R \} \\ &= \{ a_1e_1 + \cdots + a_ke_k \mid a_i \in R \}, \end{align*} which is to say $(e_1, \cdots, e_k)$ spans $N$.
• But the set $(e_1, \cdots, e_k)$ is linearly independent (its elements are distinct standard vectors), hence we have found a basis for $N$.
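
The construction in the outline can be sketched on a concrete subspace. Here $N$, its membership test `in_N`, and the sample vector are all invented for the example: $N = \{(a_1, a_2, 0)\} \subset R^3$, and the code collects the standard vectors lying in $N$ and checks that they form a basis.

```python
import numpy as np

n = 3
E = np.eye(n)  # rows are the standard vectors e1, ..., en

def in_N(v, tol=1e-9):
    """Membership test for the example N: third coordinate must vanish."""
    return abs(v[2]) < tol

# Split E_n as in the outline: E_k keeps the e_i that lie in N.
E_k = [E[i] for i in range(n) if in_N(E[i])]

# E_k is linearly independent (distinct standard vectors) ...
M = np.stack(E_k)
assert np.linalg.matrix_rank(M) == len(E_k)

# ... and it spans N: a sample vector of N is a combination of E_k.
v = np.array([4.0, -7.0, 0.0])
coeffs, *_ = np.linalg.lstsq(M.T, v, rcond=None)
assert np.allclose(M.T @ coeffs, v)
print(f"Found a basis of size {len(E_k)} for N")
```

Here `E_k` comes out as $(e_1, e_2)$, matching the $k = 2$ case of the proof.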