## 1. Overview

Matrix multiplication is an important operation in mathematics. It is a basic linear algebra tool and has a wide range of applications in several domains like physics, engineering, and economics.

**In this tutorial, we’ll discuss two popular matrix multiplication algorithms: the naive matrix multiplication and the Strassen algorithm.** We’ll also present the time complexity analysis of each algorithm.

## 2. Introduction to Matrix Multiplication

We’re taking two matrices $A$ and $B$ of order $m \times n$ and $n \times p$, respectively. Let’s take a look at the matrices:

$$A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}, \quad B = \begin{bmatrix} b_{11} & b_{12} & \cdots & b_{1p} \\ b_{21} & b_{22} & \cdots & b_{2p} \\ \vdots & \vdots & \ddots & \vdots \\ b_{n1} & b_{n2} & \cdots & b_{np} \end{bmatrix}$$

Now when we multiply the matrix $A$ by the matrix $B$, we get another matrix – let’s name it $C$. The order of the matrix $C$ would be $m \times p$.

Let’s now look into the elements of the matrix $C$:

$$C = \begin{bmatrix} c_{11} & c_{12} & \cdots & c_{1p} \\ c_{21} & c_{22} & \cdots & c_{2p} \\ \vdots & \vdots & \ddots & \vdots \\ c_{m1} & c_{m2} & \cdots & c_{mp} \end{bmatrix}$$

Each entry $c_{ij}$ in the matrix $C$ can be calculated from the entries of the matrices $A$ and $B$ by summing pairwise products:

$$c_{ij} = \sum_{k=1}^{n} a_{ik} \, b_{kj}$$
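To make the formula concrete, here’s a minimal Python sketch (the $2 \times 2$ values are hypothetical) that computes one entry of $C$ as a sum of pairwise products:

```python
# Hypothetical 2x2 input matrices
A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

n = len(A)
# c_00 is the dot product of row 0 of A and column 0 of B
c_00 = sum(A[0][k] * B[k][0] for k in range(n))  # 1*5 + 2*7
print(c_00)  # 19
```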

## 3. Matrix Multiplication Properties

Let $A$, $B$, and $C$ be three matrices of the same dimensions.

According to the associative property of multiplication, we can write $(AB)C = A(BC)$. **This property states that we can change the grouping of a matrix multiplication, and it’ll not affect the output of the matrix multiplication.**

**We can distribute matrices in a similar way we distribute real numbers.** Using the distributive property of multiplication, we can write: $A(B + C) = AB + AC$.

There is a special matrix called the identity matrix (or unit matrix), which has $1$s on the main diagonal and $0$s elsewhere. Some examples of identity matrices are:

$$I_1 = \begin{bmatrix} 1 \end{bmatrix}, \quad I_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, \quad I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

**There is a very interesting property in matrix multiplication. When an $m \times n$ matrix $A$ is multiplied on the right by an $n \times n$ identity matrix, the output matrix is the same as $A$. This property is called the multiplicative identity.** For example:

$$A \, I_n = A$$

It is important to note that matrix multiplication is not commutative. If we multiply two matrices $A$ and $B$ of the same order, then in general $AB \neq BA$. However, if $A$ and $B$ are both diagonal matrices of the same dimensions, they do commute.
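A quick Python sketch (with hypothetical $2 \times 2$ values) illustrating that matrix multiplication is not commutative:

```python
def matmul(X, Y):
    """Multiply two square matrices of the same order."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Hypothetical example matrices
A = [[1, 2],
     [3, 4]]
B = [[0, 1],
     [1, 0]]

AB = matmul(A, B)  # [[2, 1], [4, 3]]
BA = matmul(B, A)  # [[3, 4], [1, 2]]
print(AB != BA)  # True
```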

## 4. The Naive Matrix Multiplication Algorithm

### 4.1. Algorithm

Let’s see the pseudocode of the naive matrix multiplication algorithm first, then we’ll discuss the steps of the algorithm:

```
algorithm NaiveMatrixMultiplication(S, P, A, B, G, H):
    // INPUT
    //   S = matrix of size A x B
    //   P = matrix of size G x H
    // OUTPUT
    //   Q = the matrix product of S and P,
    //       or an error message if the dimensions are incompatible

    if B = G:
        Q <- make a new matrix of size A x H
        for m <- 0 to A - 1:
            for r <- 0 to H - 1:
                Q[m, r] <- 0
                for k <- 0 to G - 1:
                    Q[m, r] <- Q[m, r] + S[m, k] * P[k, r]
        return Q
    else:
        return "Incompatible dimensions"
```

The algorithm loops through all entries of $S$ and $P$, and the outermost loop fills the rows of the resultant matrix $Q$.

To find an implementation of it, we can visit our article on Matrix Multiplication in Java.
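As a quick sketch, here’s the pseudocode translated to Python (names mirror the pseudocode; this is our illustration, not the linked Java implementation):

```python
def naive_matrix_multiplication(S, P):
    """Multiply matrix S (A x B) by matrix P (G x H) with three nested loops."""
    A, B = len(S), len(S[0])  # dimensions of S
    G, H = len(P), len(P[0])  # dimensions of P
    if B != G:
        raise ValueError("Incompatible dimensions")
    Q = [[0] * H for _ in range(A)]  # A x H result matrix
    for m in range(A):
        for r in range(H):
            for k in range(G):
                Q[m][r] += S[m][k] * P[k][r]
    return Q

print(naive_matrix_multiplication([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
# [[19, 22], [43, 50]]
```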

### 4.2. Time Complexity Analysis

The naive matrix multiplication algorithm contains three nested loops. For each iteration of the outer loop, the number of runs of the inner loops is equal to the length of the matrix. Here, each integer operation takes $O(1)$ time. **In general, if the length of the matrices is $n$, the total time complexity would be $O(n^3)$.**

Now the question is, can we improve the time complexity of the matrix multiplication? We’ll discuss an improved matrix multiplication algorithm in the next section.

## 5. The Strassen Algorithm

### 5.1. Algorithm

In 1969, Volker Strassen made remarkable progress, proving that the $O(n^3)$ complexity was not optimal by releasing a new algorithm, later named after him.

Where the naive method takes an exhaustive approach, the Strassen algorithm uses a divide-and-conquer strategy along with a clever algebraic trick to solve the matrix multiplication problem with less computation.

Let’s take two input matrices $A$ and $B$ of order $n \times n$. In general, the dimensions of the input matrices would be $2^k \times 2^k$ for some integer $k$:

$$A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix}, \quad B = \begin{bmatrix} b_{11} & b_{12} & \cdots & b_{1n} \\ \vdots & \vdots & \ddots & \vdots \\ b_{n1} & b_{n2} & \cdots & b_{nn} \end{bmatrix}$$

**The first step is to divide each input matrix into four submatrices of order $\frac{n}{2} \times \frac{n}{2}$:**

$$A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}, \quad B = \begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{bmatrix}$$

**The next step is to perform $10$ addition/subtraction operations:**

$$\begin{aligned} T_1 &= B_{12} - B_{22} \qquad & T_2 &= A_{11} + A_{12} \\ T_3 &= A_{21} + A_{22} & T_4 &= B_{21} - B_{11} \\ T_5 &= A_{11} + A_{22} & T_6 &= B_{11} + B_{22} \\ T_7 &= A_{12} - A_{22} & T_8 &= B_{21} + B_{22} \\ T_9 &= A_{11} - A_{21} & T_{10} &= B_{11} + B_{12} \end{aligned}$$

**The third step of the algorithm is to calculate $7$ multiplication operations recursively using the previous results.** Here each $M_i$ is of size $\frac{n}{2} \times \frac{n}{2}$:

$$\begin{aligned} M_1 &= A_{11} \cdot T_1 \\ M_2 &= T_2 \cdot B_{22} \\ M_3 &= T_3 \cdot B_{11} \\ M_4 &= A_{22} \cdot T_4 \\ M_5 &= T_5 \cdot T_6 \\ M_6 &= T_7 \cdot T_8 \\ M_7 &= T_9 \cdot T_{10} \end{aligned}$$

**Finally, the desired submatrices of the resultant matrix $C$ can be calculated by adding and subtracting various combinations of the $M_i$ submatrices:**

$$\begin{aligned} C_{11} &= M_5 + M_4 - M_2 + M_6 \\ C_{12} &= M_1 + M_2 \\ C_{21} &= M_3 + M_4 \\ C_{22} &= M_5 + M_1 - M_3 - M_7 \end{aligned}$$

Now let’s put everything together in matrix form:

$$C = \begin{bmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{bmatrix}$$

So as we can see, this algorithm needs to perform only $7$ multiplication operations at each step of the recursion, unlike the naive divide-and-conquer approach, which needs $8$ multiplication operations.

**It is important to note that this algorithm works only on square matrices of the same dimensions.**
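The steps above can be sketched in Python. This is a minimal illustration assuming the inputs are square matrices whose size is a power of two; the helper names are ours:

```python
def add(X, Y):
    """Entrywise sum of two equal-size matrices."""
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def sub(X, Y):
    """Entrywise difference of two equal-size matrices."""
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def strassen(A, B):
    """Multiply two n x n matrices (n a power of two) with Strassen's method."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    # Step 1: split each matrix into four h x h submatrices
    A11 = [row[:h] for row in A[:h]]; A12 = [row[h:] for row in A[:h]]
    A21 = [row[:h] for row in A[h:]]; A22 = [row[h:] for row in A[h:]]
    B11 = [row[:h] for row in B[:h]]; B12 = [row[h:] for row in B[:h]]
    B21 = [row[:h] for row in B[h:]]; B22 = [row[h:] for row in B[h:]]
    # Steps 2-3: seven recursive multiplications instead of eight
    M1 = strassen(A11, sub(B12, B22))
    M2 = strassen(add(A11, A12), B22)
    M3 = strassen(add(A21, A22), B11)
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A22), add(B11, B22))
    M6 = strassen(sub(A12, A22), add(B21, B22))
    M7 = strassen(sub(A11, A21), add(B11, B12))
    # Final step: combine into the four quadrants of the result
    C11 = add(sub(add(M5, M4), M2), M6)
    C12 = add(M1, M2)
    C21 = add(M3, M4)
    C22 = sub(sub(add(M5, M1), M3), M7)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bottom = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bottom

print(strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
# [[19, 22], [43, 50]]
```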

### 5.2. Time Complexity Analysis

This solution is based on recursion. In the first step, we divide the input matrices into $4$ submatrices of size $\frac{n}{2} \times \frac{n}{2}$. This step can be performed in $O(1)$ time.

In step $2$, we calculate $10$ addition/subtraction operations, each of which takes $O(n^2)$ time.

In step $3$, we make $7$ recursive calls to calculate $M_1$ to $M_7$. The output of each call is a matrix of order $\frac{n}{2} \times \frac{n}{2}$. This step takes $7T(\frac{n}{2})$ time.

Finally, by adding and subtracting the submatrices of $C$, we get our resultant matrix $C$. The time complexity of this step is $O(n^2)$.

**Therefore, the total time complexity of this algorithm would be:**

$$T(n) = 7T\left(\frac{n}{2}\right) + O(n^2)$$

Solving this recurrence with the master theorem gives $T(n) = O(n^{\log_2 7}) \approx O(n^{2.8074})$.

## 6. Comparison Between Two Algorithms

Let’s summarize the two matrix multiplication algorithms in this section and put the key points in a table:

| Aspect | Naive algorithm | Strassen algorithm |
| --- | --- | --- |
| Strategy | Exhaustive iteration | Divide and conquer |
| Multiplications per recursion step | $8$ | $7$ |
| Time complexity | $O(n^3)$ | $O(n^{\log_2 7}) \approx O(n^{2.8074})$ |
| Input restrictions | Any compatible dimensions | Square matrices of the same (power-of-two) dimensions |

## 7. Conclusion

In this tutorial, we’ve discussed two algorithms for matrix multiplication in detail: the naive method and the Strassen algorithm.

We also presented a comparison including the key points of these two algorithms.