### Intro to Linear Algebra

(Wee Kean here.) I'd like to give a brief intro to linear algebra and wrap up with some personal experience with learning math.

# Definitions

A vector space $V$ is a set of elements with two operations $\cdot$ and $+$, representing scalar multiplication and addition respectively. They satisfy the following axioms:

$\forall x,y,z \in V, \forall a,b \in \mathbb{R}$

- $x + (y + z) = (x + y) + z$
- $x + y = y + x$
- $\exists \bar{0} \in V, \bar{0} + x = x$
- $\exists -x \in V, (-x) + x = \bar{0}$
- $a(bx) = (ab)x$
- $1 \cdot x = x$
- $a(x+y) = ax + ay$
- $(a+b)x = ax + bx$

In summary, addition is commutative and associative, and scalar multiplication distributes over both vector addition and scalar addition, just like you expect it to. In general, you can replace $\mathbb{R}$ with any field, but $\mathbb{R}$ encapsulates most of the intuition.

A familiar example of a vector space is the set of tuples $(x,y)$ of real numbers - 2D vectors. Addition works component-wise: $(a,b) + (c,d) = (a+c, b+d)$. Scalar multiplication works as expected too: $c \cdot (a,b) = (ca, cb)$.
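As a quick sketch, the two operations on tuples can be written out and a few of the axioms checked on concrete vectors (plain Python tuples, no libraries assumed):

```python
def add(u, v):
    """Component-wise addition: (a,b) + (c,d) = (a+c, b+d)."""
    return (u[0] + v[0], u[1] + v[1])

def scale(c, v):
    """Scalar multiplication: c * (a,b) = (ca, cb)."""
    return (c * v[0], c * v[1])

x, y = (1, 2), (3, 4)
assert add(x, y) == (4, 6)
assert add(x, y) == add(y, x)                   # commutativity
assert add(x, (0, 0)) == x                      # (0,0) is the zero vector
assert add(x, scale(-1, x)) == (0, 0)           # additive inverse
assert scale(2, add(x, y)) == add(scale(2, x), scale(2, y))  # distributivity
```

Of course, a handful of checks is not a proof; the axioms hold for all tuples because they hold for real numbers component by component.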

**Linear Independence**

We say that a set of vectors $\{v_1, v_2, ..., v_k\}$ is linearly independent if and only if the only solution to the equation

$$ a_1 \cdot v_1 + a_2 \cdot v_2 + ... + a_k \cdot v_k = 0 $$

is $a_1 = a_2 = ... = a_k = 0$.

**Span**

Similarly, the span of a set of vectors $\{v_1, v_2, ..., v_k\}$ is the set of vectors attainable from a linear combination of these vectors. In other words, the set of vectors $v$ such that

$$ a_1 \cdot v_1 + a_2 \cdot v_2 + ... + a_k \cdot v_k = v $$

for some $a_i \in \mathbb{R}$.
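Checking whether a given $v$ lies in the span amounts to asking whether the linear system $a_1 v_1 + ... + a_k v_k = v$ has a solution. A hedged sketch via least squares (the helper name `in_span` is my own):

```python
import numpy as np

def in_span(vectors, v, tol=1e-9):
    """True iff v is a linear combination of the given vectors."""
    A = np.column_stack(vectors)
    # Least squares finds the best coefficients; v is in the span
    # exactly when the residual is (numerically) zero.
    coeffs, *_ = np.linalg.lstsq(A, v, rcond=None)
    return np.allclose(A @ coeffs, v, atol=tol)

v1, v2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
print(in_span([v1, v2], np.array([2.0, 3.0, 0.0])))  # True: 2*v1 + 3*v2
print(in_span([v1, v2], np.array([0.0, 0.0, 1.0])))  # False: third coordinate unreachable
```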

**Basis**

A basis for a vector space $V$ is a set of linearly independent vectors such that the span of these vectors is $V$ itself. For example, $\{(0,1),(1,0)\}$ is a basis for $\mathbb{R}^2$.

**Dimension**

The dimension of a vector space $V$ is then the cardinality of a basis of $V$. We can show this is well-defined for vector spaces with a finite basis. The idea is to show the stronger statement: if $U$, $W$ are finite subsets of $V$ such that $U$ spans $V$ and $W$ is linearly independent, then $|U| \geq |W|$. Suppose $U = \{u_1, u_2, ..., u_n\}$ and $W = \{w_1, w_2, ..., w_m\}$. Since $U$ spans $V$, we can write $w_1 = c_1 u_1 + c_2 u_2 + ... + c_n u_n$. WLOG $c_n$ is non-zero (some coefficient must be, since $w_1 \neq \bar{0}$ by linear independence). Then $u_n = \frac{1}{c_n} w_1 - \frac{c_1}{c_n} u_1 - ... - \frac{c_{n-1}}{c_n} u_{n-1}$, hence $\{u_1, u_2, ..., u_{n-1}, w_1\}$ is still spanning. We repeat with $w_2, ..., w_m$, swapping one $w_j$ in for one $u_i$ at each step; some $u_i$ coefficient must be non-zero at each step since $W$ is linearly independent, so the $u_i$ cannot run out before the $w_j$ do, giving $n \geq m$.

# An application

(Iran TST 2012 Q3) Let $n$ be a positive integer. Let $S$ be a subset of points on the plane with these conditions:

i) There do not exist $n$ lines in the plane such that every element of $S$ is on at least one of them.

ii) For all $X \in S$, there exist $n$ lines in the plane such that every element of $S - X$ is on at least one of them.

Find the maximum of $|S|$.

Trying some small cases, a good guess is $\frac{(n+1)(n+2)}{2}$, achieved when the points are arranged in a triangular array.

Each line is of the form $a_jx + b_jy + c_j = 0$. Drawing on the construction used in Lagrange interpolation, if we take the product of the $n$ lines covering $S - X_i$, we get a two-variable polynomial which vanishes at every point of $S$ except $X_i$. In other words, for each point $X_i$ in the set $S$, there exists some two-variable polynomial $$P_i(x,y)=\prod\limits_{j=1}^{n} (a_jx+b_jy+c_j) $$ which evaluates to $0$ at all points of $S$ except $X_i$, where it is non-zero. These polynomials must be linearly independent: evaluating any vanishing linear combination $\sum_i \lambda_i P_i$ at the point $X_i$ kills every term except $\lambda_i P_i(X_i)$, forcing $\lambda_i = 0$. But each $P_i$ has degree at most $n$, and the dimension of the vector space of two-variable polynomials of degree at most $n$ is $\frac{(n+1)(n+2)}{2}$. Hence the number of such polynomials, and therefore the number of points, is at most $\frac{(n+1)(n+2)}{2}$, as needed.
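The dimension count at the end can be sanity-checked by listing monomials: a basis for two-variable polynomials of degree at most $n$ is $\{x^i y^j : i + j \leq n\}$, and counting them recovers $\frac{(n+1)(n+2)}{2}$. A small illustrative script:

```python
def dim_poly2(n):
    """Count monomials x^i y^j with i + j <= n, i.e. the dimension of the
    space of two-variable polynomials of degree at most n."""
    return sum(1 for i in range(n + 1) for j in range(n + 1) if i + j <= n)

for n in range(1, 6):
    assert dim_poly2(n) == (n + 1) * (n + 2) // 2

print([dim_poly2(n) for n in range(1, 6)])  # [3, 6, 10, 15, 21]
```

The triangular-array construction achieving the bound mirrors this count exactly: one point per monomial.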

# Rant Warning

I had a brief intro to linear algebra in secondary 2, taught in RI's Euler Programme. I was told that the determinant (whatever that meant) of a 2 by 2 matrix is $ad-bc$, and that a unique solution exists iff that quantity is non-zero. I was taught how to perform Gaussian elimination on a general $n \times n$ matrix, what the row echelon form of a matrix is and how to derive it, and how to use Cramer's rule to solve a system of simultaneous equations.

The lasting impression I got was that linear algebra was all about staring at large grids of numbers and following some strict rules for manipulating them to obtain solutions to some simultaneous equations. Needless to say, I was extremely unimpressed. Later on, I realised that a lot of things were vector spaces, not just lists of numbers - solutions to linear differential equations formed vector spaces, solutions to linear recursions formed vector spaces - and I realised, "ah, there was a lot more to this."

Anyways, if you're in a similar situation, I hope this post resparks your interest!
