1. Overview
In this tutorial, we’ll talk about Recurrent Neural Networks (RNNs), one of the most successful architectures for processing sequential data. First, we’ll describe their basic architecture and how it differs from that of traditional feedforward neural networks. Then, we’ll present some types of RNNs, and finally, we’ll move on to their applications.
2. Basic Architecture
First, let’s discuss the basic architecture of an RNN to get a deeper understanding of why these networks perform so well on sequential data.
Every RNN consists of a series of repeating modules, called cells, that process the input data sequentially. That means that the first cell takes as input the first sample of the sequence, the second cell takes the second sample, and so on. Each cell takes the input vector x_t and, after some processing, generates a vector known as the hidden state h_t that is passed to the next cell. So, at each time step, the hidden state h_t captures all the information given so far, enabling the network to have some memory.
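To make this recurrence concrete, here’s a minimal NumPy sketch of a single RNN cell. The tanh activation is a common choice, and the weight matrices W_x and W_h, the bias b, and all the dimensions are arbitrary values chosen purely for illustration:

```python
import numpy as np

def rnn_cell(x_t, h_prev, W_x, W_h, b):
    """One step of a vanilla RNN: mix the current input with the
    previous hidden state to produce the new hidden state."""
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

# Toy dimensions: 4-dimensional inputs, 3-dimensional hidden state.
rng = np.random.default_rng(0)
W_x, W_h, b = rng.normal(size=(3, 4)), rng.normal(size=(3, 3)), np.zeros(3)

h = np.zeros(3)                        # initial hidden state
for x_t in [rng.normal(size=4) for _ in range(5)]:
    h = rnn_cell(x_t, h, W_x, W_h, b)  # h carries the 'memory' forward
```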
In the image below, we can see a diagram that illustrates the basic architecture of an RNN:

3. Differences from Traditional Networks
To better understand the RNN architecture, we should investigate its major differences from traditional feedforward neural networks.
The key difference between these two architectures is that an RNN contains a feedback loop: the hidden state produced at one time step flows back into the network at the next one, so the same layers are reused as the input sequence is processed step by step. This characteristic makes RNNs very effective at processing sequential data, where keeping track of the ‘past’ of the sequence with some form of memory is essential. That memory is represented by the hidden states, which are used in the processing of the next inputs.
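We can see this contrast in a short sketch (again with arbitrary, illustrative weights): the feedforward network maps one fixed-size input to an output in a single pass, while the recurrent network reuses the same weights at every time step and feeds its hidden state back into itself:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(3, 4))                    # feedforward weights
W_x, W_h = rng.normal(size=(3, 4)), rng.normal(size=(3, 3))

def feedforward(x):
    # The output depends only on the current input: no memory.
    return np.tanh(W @ x)

def recurrent(sequence):
    # The same weights are reused at every step, and h feeds back
    # into the computation: this feedback is the 'loop'.
    h = np.zeros(3)
    for x_t in sequence:
        h = np.tanh(W_x @ x_t + W_h @ h)
    return h

print(feedforward(rng.normal(size=4)))
print(recurrent([rng.normal(size=4) for _ in range(5)]))
```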
In the image below, we can see the two architectures side by side, with the ‘loop’ of the RNN illustrated:

4. Types of RNNs
Throughout the years, many types of RNNs have been proposed to tackle the challenges of the basic architecture. Now, let’s discuss the two most commonly used variations.
4.1. LSTM
When an RNN processes very long sequences, the problem of vanishing gradients appears, meaning that the gradients of the loss function approach zero, making the network hard to train.
A Long Short-Term Memory network (LSTM) is a variation of the RNN specifically designed to deal with the problem of vanishing gradients. It uses a memory cell that is able to maintain useful information for long periods of time without the gradients of the network decaying significantly.
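As a quick illustration, the snippet below runs a batch of toy sequences through PyTorch’s built-in LSTM layer; the sizes are arbitrary and chosen only for demonstration:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

x = torch.randn(2, 50, 8)     # 2 sequences, 50 time steps, 8 features
output, (h_n, c_n) = lstm(x)  # c_n is the long-term memory cell state

print(output.shape)           # torch.Size([2, 50, 16]): hidden state at every step
print(c_n.shape)              # torch.Size([1, 2, 16]): final memory cell per sequence
```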
4.2. GRU
Another common architecture is the Gated Recurrent Unit (GRU), which is similar to the LSTM but much simpler in structure and significantly faster to compute.
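In code, a GRU is essentially a drop-in replacement for the LSTM above; note that it returns a single hidden state with no separate memory cell. As before, the dimensions are illustrative only:

```python
import torch
import torch.nn as nn

gru = nn.GRU(input_size=8, hidden_size=16, batch_first=True)

x = torch.randn(2, 50, 8)  # 2 sequences, 50 time steps, 8 features
output, h_n = gru(x)       # a single hidden state, no separate memory cell

print(output.shape)        # torch.Size([2, 50, 16])
```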
5. Applications
Now that we have gained an understanding of the architectures of RNNs, we’ll talk about their most common applications.
5.1. Natural Language Processing
The most common application of RNNs is in processing text sequences. For example, we can use an RNN to generate text by predicting, at each step, the next word of a text sequence. This task can prove helpful for chatbot applications and online language translators.
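Here’s a minimal sketch of such a next-word predictor, assuming a hypothetical vocabulary of 1,000 word ids and arbitrary layer sizes: an embedding layer feeds a recurrent layer, and a linear head scores every word in the vocabulary as the candidate next word:

```python
import torch
import torch.nn as nn

class NextWordModel(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        h, _ = self.rnn(self.embed(token_ids))
        return self.head(h)  # next-word scores at every position

model = NextWordModel()
tokens = torch.randint(0, 1000, (1, 12))   # a sequence of 12 word ids
next_word = model(tokens)[0, -1].argmax()  # most likely next word id
```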
5.2. Series Forecasting
Another useful application of RNNs is in time series forecasting, where our goal is to predict some future outcomes given the previous ones. For example, in weather forecasting, we can use an RNN to learn historical patterns in previous weather data and employ them to make better predictions of future weather.
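A common recipe, sketched below on a synthetic sine-wave series standing in for real data: slide a fixed-size window over the series and train the network to map each window of past values to the value that follows it:

```python
import torch
import torch.nn as nn

# Synthetic univariate series standing in for, e.g., daily temperatures.
series = torch.sin(torch.linspace(0, 20, 200))

window = 10
# Each sample: `window` past values -> the value that follows them.
X = torch.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

rnn = nn.GRU(input_size=1, hidden_size=16, batch_first=True)
head = nn.Linear(16, 1)

out, _ = rnn(X.unsqueeze(-1))        # shape: (samples, window, 16)
pred = head(out[:, -1]).squeeze(-1)  # forecast from the last hidden state
loss = nn.functional.mse_loss(pred, y)
```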
5.3. Video Processing
Videos can be considered another type of sequence since they are sequences of image frames. So, RNNs can also be used in video processing tasks like video classification, where our goal is to classify a given video into a category.
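One typical pattern, sketched here with random tensors standing in for real frame features: encode each frame into a feature vector (for example, with a CNN), run the sequence of features through an RNN, and classify from the final hidden state. The feature size and the number of classes are assumptions for illustration:

```python
import torch
import torch.nn as nn

# Pretend each of 30 frames was already encoded into a 128-d feature
# vector by some CNN; random tensors stand in for those features here.
frame_features = torch.randn(1, 30, 128)

rnn = nn.GRU(input_size=128, hidden_size=64, batch_first=True)
classifier = nn.Linear(64, 5)  # e.g., 5 video categories

_, h_n = rnn(frame_features)
logits = classifier(h_n[-1])   # classify from the final hidden state
predicted_class = logits.argmax(dim=-1)
```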
5.4. Reinforcement Learning
Finally, RNNs are also used in cases where we want to make decisions in a dynamic environment, as in reinforcement learning. For example, we can control a robot using an RNN that maintains the current state of the environment in its memory.
6. Conclusion
In this article, we presented RNNs. First, we talked about their architecture and their differences from traditional networks. Then, we briefly discussed two types of RNNs and some of their applications.