1. Introduction

In this tutorial, we’ll study the nonlinear activation functions most commonly used in backpropagation algorithms and other learning procedures.

We analyzed the reasons for using nonlinear functions in a previous article.

2. Feed-Forward Neural Networks

Backpropagation algorithms operate on fully connected Feed-Forward Neural Networks (FFNNs):

[Figure: feed-forward neural network]

with units that have the structure:

[Figure: artificial neuron]

The activation function \phi transforms the weighted sum of the inputs:

    \[\Sigma = w_{0} + \sum_{i=1}^{N} w_{i} x_{i}\]
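
so that the unit's output is \phi(\Sigma). As a quick illustration, here is a minimal Python sketch of a single unit; the input values, weights, and the choice of a sigmoid for \phi are arbitrary assumptions made only for this example:

    import math

    def unit_output(x, w, phi):
        """Return phi(Sigma) for a single unit, where Sigma = w_0 + sum_i w_i * x_i."""
        # w[0] is the bias weight w_0; the remaining weights pair with the inputs
        sigma = w[0] + sum(w_i * x_i for w_i, x_i in zip(w[1:], x))
        return phi(sigma)

    # Sigmoid chosen here only as one common nonlinear activation
    def sigmoid(s):
        return 1.0 / (1.0 + math.exp(-s))

    print(unit_output(x=[0.5, -1.0, 2.0], w=[0.1, 0.4, -0.3, 0.2], phi=sigmoid))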

We discuss FFNNs in more detail in our article on linear models.
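
To see how such units fit together in a fully connected feed-forward pass, the sketch below stacks them into layers; the layer sizes, weight values, and sigmoid activation are arbitrary assumptions chosen only for illustration:

    import math

    def sigmoid(s):
        return 1.0 / (1.0 + math.exp(-s))

    def layer_forward(x, weights, phi):
        """Apply every unit of one fully connected layer to the same input vector x."""
        # Each row of `weights` holds one unit's [w_0, w_1, ..., w_N]
        return [phi(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))) for w in weights]

    def feed_forward(x, layers, phi):
        """Propagate x through the layers; each layer's output feeds the next."""
        for weights in layers:
            x = layer_forward(x, weights, phi)
        return x

    # A tiny 2-3-1 network with arbitrary example weights
    layers = [
        [[0.1, 0.5, -0.2], [0.0, 0.3, 0.8], [-0.1, -0.6, 0.4]],  # hidden layer: 3 units
        [[0.2, 0.7, -0.5, 0.1]],                                  # output layer: 1 unit
    ]
    print(feed_forward([1.0, -2.0], layers, sigmoid))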