## The solvability condition of Ax = b when A is not a square matrix

In an elementary linear algebra course, we learned the solvability condition of the linear system $latex Ax = b$,

where $latex A$ is an $latex n \times n$ matrix and $latex x, b$ are vectors.

To mention a few, the following are equivalent conditions under which the linear system above has a solution:

- the determinant of $latex A$ is not zero
- the rank of $latex A$ is full
- all eigenvalues of $latex A$ are nonzero
- the columns (also the rows) of $latex A$ are linearly independent
- $latex A$ has an inverse
- et cetera
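Two of these conditions can be checked by hand on a small concrete matrix. Below is a minimal sketch using a $latex 2 \times 2$ example of my own (the matrix is purely illustrative, not from the original problem): the determinant is nonzero, so the explicit $latex 2 \times 2$ inverse formula applies and we can verify $latex A A^{-1} = I$.

```python
# Hypothetical 2x2 example illustrating two of the equivalent conditions.
A = [[2.0, 1.0],
     [1.0, 3.0]]

# Condition: determinant of A is not zero (2x2 formula ad - bc).
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # 2*3 - 1*1 = 5
assert det != 0

# Condition: A has an inverse (explicit 2x2 inverse, valid since det != 0).
inv = [[ A[1][1] / det, -A[0][1] / det],
       [-A[1][0] / det,  A[0][0] / det]]

# Sanity check: A * inv equals the identity matrix.
for i in range(2):
    for j in range(2):
        s = sum(A[i][k] * inv[k][j] for k in range(2))
        assert abs(s - (1.0 if i == j else 0.0)) < 1e-12

print(det)  # 5.0
```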

However, I would like to discuss the case when either (#) $latex A$ is not a square matrix or

(##) the rank of $latex A$ is not full ($latex < n$).

There is actually a condition such that the linear system has a solution whenever we have either (#) or (##). The solvability condition is

$latex b$ is in the column space of $latex A$.
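As a quick illustration of this condition, here is a sketch with a hypothetical non-square system (the $latex 3 \times 2$ matrix and the vectors are my own example, not the one from the original problem): $latex b$ is exactly $latex 1 \cdot (\text{col } 1) + 2 \cdot (\text{col } 2)$, so it lies in the column space and $latex Ax = b$ has the solution $latex x = (1, 2)$.

```python
# Hypothetical 3x2 system: more equations than unknowns.
A = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]
x = [1.0, 2.0]
b = [1.0, 2.0, 3.0]   # equals A x, hence b is in the column space of A

# Multiply A x entry by entry and compare with b.
Ax = [sum(A[i][k] * x[k] for k in range(2)) for i in range(3)]
assert Ax == b
print(Ax)  # [1.0, 2.0, 3.0]
```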

The trouble is that this condition is not directly applicable to most problems I have encountered. For instance, I ran into the following problem.

Let $latex A$ be a given matrix and $latex b$ a given right-hand side. I want to find a condition on $latex A$ and $latex b$ such that there is a solution of $latex Ax = b$.

That statement did not help me at the time, so I looked for an equivalent formulation that would be applicable to my problem.

**Fundamental Theorem of Orthogonality** (see [1])

Let $latex A$ be any $latex m \times n$ matrix. The row space of $latex A$ is orthogonal to the null space of $latex A$.

Proof: Suppose $latex x$ is a vector in the null space. Then $latex Ax = 0$. This equation can be read as the rows of $latex A$ multiplying $latex x$:

$latex Ax = \begin{bmatrix} \text{row}_1 \\ \text{row}_2 \\ \vdots \\ \text{row}_m \end{bmatrix} x = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}$.

We can clearly see that the dot product of row 1 with $latex x$ is zero, the dot product of row 2 with $latex x$ is zero, and so on. Thus the dot product of every row of $latex A$ with $latex x$ is zero, which means every row of $latex A$ is orthogonal to $latex x$. This completes the proof.
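The row-by-row argument above can be checked numerically. Here is a minimal sketch with a hypothetical rank-1 matrix (my own example): $latex x = (2, -1)$ is in the null space, and each row of $latex A$ has zero dot product with it.

```python
# Hypothetical rank-1 matrix; x spans its null space.
A = [[1.0, 2.0],
     [2.0, 4.0]]
x = [2.0, -1.0]           # A x = 0, so x is in the null space of A

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# Each entry of A x is the dot product of one row of A with x;
# all of them are zero, which is exactly the orthogonality statement.
for row in A:
    assert dot(row, x) == 0.0
    print(dot(row, x))    # 0.0 for every row
```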

**Corollary 1**

**The column space of $latex A$ is orthogonal to the left null space of $latex A$.**

Proof: The proof is easy; we just apply the theorem to $latex A^T$. The row space of $latex A^T$ is the column space of $latex A$, and the null space of $latex A^T$ is the left null space of $latex A$.

Going back to our original problem: when the vector $latex b$ is in the column space of $latex A$, the vector $latex b$ is orthogonal to every vector in the left null space of $latex A$.

Thus, we have the following solvability condition:

**Solvability condition of $latex Ax = b$ for any matrix $latex A$**
The linear system $latex Ax = b$ has a solution if and only if the dot product of $latex b$ with $latex y$ is zero for every basis vector $latex y$ of the left null space of $latex A$ (that is, for every $latex y$ satisfying $latex A^T y = 0$).
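This condition is easy to turn into a concrete test. Below is a minimal sketch on a hypothetical singular matrix (my own illustration): the left null space of $latex A$ is spanned by $latex y = (2, -1)$, so $latex Ax = b$ is solvable exactly when $latex y \cdot b = 0$.

```python
# Hypothetical singular matrix; y spans its left null space (A^T y = 0).
A = [[1.0, 2.0],
     [2.0, 4.0]]
y = [2.0, -1.0]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# Check that y really is in the left null space: y . (each column of A) = 0.
cols = list(zip(*A))
assert all(dot(y, c) == 0.0 for c in cols)

def solvable(b):
    """A x = b has a solution iff b is orthogonal to y."""
    return dot(y, b) == 0.0

print(solvable([1.0, 2.0]))  # True: b is the first column of A
print(solvable([1.0, 1.0]))  # False: y . b = 1, so no solution exists
```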

Remark: the above solvability condition is closely related to the linear version of the Fredholm solvability condition (the Fredholm alternative).

References

[1] Strang, G., *Linear Algebra and Its Applications*, Thomson Brooks/Cole.

We can view the equation $latex Ax = b$ as saying that $latex b$ is a combination of the columns of $latex A$. In coding theory we say that $latex b$ is a codeword and $latex A$ is a generator matrix of a code $latex C$. Then a basis of the left null space of $latex A$ gives the parity check matrix of the code $latex C$.
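This coding-theory reading can be sketched over GF(2). The example below is my own, using a hypothetical [3,2] single-parity-check code: the columns of $latex A$ span the code, and the left null space vector $latex h$ plays the role of the parity check matrix, so a vector is a codeword exactly when its dot product with $latex h$ vanishes mod 2.

```python
# Hypothetical [3,2] single-parity-check code over GF(2).
A = [[1, 0],
     [0, 1],
     [1, 1]]          # generator: codewords are the column space of A (mod 2)
h = [1, 1, 1]         # parity-check row: h . (each column of A) = 0 (mod 2)

def dot2(u, v):
    return sum(ui * vi for ui, vi in zip(u, v)) % 2

# h is in the left null space of A over GF(2).
cols = list(zip(*A))
assert all(dot2(h, c) == 0 for c in cols)

def is_codeword(c):
    """c is in the code iff it passes the parity check."""
    return dot2(h, c) == 0

print(is_codeword([1, 1, 0]))  # True: sum (mod 2) of the two columns of A
print(is_codeword([1, 0, 0]))  # False: fails the parity check
```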

