A recurrent neural network (RNN) unfolds over time: the same set of weights is applied at every step of a sequence, with a hidden state carrying information from one step to the next. It is used for sequential inputs where time (or position) is the main factor distinguishing the elements of the sequence.
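As a minimal sketch of that unfolding (shapes and weight names here are arbitrary, chosen only for illustration), a vanilla RNN is just a loop that reuses the same weights at every time step:

```python
import numpy as np

def rnn_forward(inputs, W_xh, W_hh, b_h):
    """Unfold a vanilla RNN over time: the same weights are reused at
    every step, and the hidden state carries information forward."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x_t in inputs:                       # one step per sequence element
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        states.append(h)
    return states

# Toy sequence of three 2-d inputs with a 3-d hidden state.
rng = np.random.default_rng(0)
xs = [rng.normal(size=2) for _ in range(3)]
W_xh = rng.normal(size=(3, 2))
W_hh = rng.normal(size=(3, 3))
b_h = np.zeros(3)
states = rnn_forward(xs, W_xh, W_hh, b_h)
print(len(states))  # one hidden state per time step
```

Note that `W_xh`, `W_hh`, and `b_h` never change inside the loop; "unfolding over time" means exactly this reuse of one parameter set across all steps.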
A recursive neural network, by contrast, is a hierarchical network: there is no time aspect to the input, but the input must be processed hierarchically, in a tree fashion, with the same weights applied at every node of the tree.
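The tree-shaped processing can be sketched as follows (a hypothetical binary tree of vector-valued leaves; the dimension `D` and composition weights `W`, `b` are illustrative assumptions, not a specific published model):

```python
import numpy as np

rng = np.random.default_rng(1)
D = 4
W = rng.normal(size=(D, 2 * D))   # shared composition weights for every node
b = np.zeros(D)

def encode(node):
    """Recursively encode a binary tree: a leaf is already a vector; an
    internal node is a nonlinear composition of its two children."""
    if isinstance(node, np.ndarray):
        return node
    left, right = node
    child_vec = np.concatenate([encode(left), encode(right)])
    return np.tanh(W @ child_vec + b)

# Tree ((a, b), c): the composition order follows the hierarchy, not time.
a, b_leaf, c = (rng.normal(size=D) for _ in range(3))
root = encode(((a, b_leaf), c))
print(root.shape)
```

The contrast with the recurrent case is the traversal: an RNN walks a flat sequence left to right, while a recursive network follows the structure of the tree, combining siblings before parents.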
CNNs (convolutional neural networks) use a variation of multilayer perceptrons designed to require minimal preprocessing. They are also known as shift-invariant or space-invariant artificial neural networks (SIANN), after their shared-weights architecture and translation-invariance characteristics. Convolutional networks were inspired by biological processes: the connectivity pattern between neurons resembles the organization of the animal visual cortex.
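The shared-weights idea can be shown with a small 1-D example (the signal and kernel values below are made up for illustration): one small kernel is slid across every position of the input, and shifting the input shifts the feature map by the same amount.

```python
import numpy as np

def conv1d_valid(signal, kernel):
    """'Valid' 1-D convolution (strictly, cross-correlation, as in CNNs):
    the same small kernel is applied at every position, which is what
    'shared weights' means."""
    k = len(kernel)
    return np.array([signal[i:i + k] @ kernel
                     for i in range(len(signal) - k + 1)])

signal = np.array([0., 0., 1., 2., 1., 0., 0., 0.])
kernel = np.array([1., 0., -1.])          # a simple edge-like detector
out = conv1d_valid(signal, kernel)
shifted = conv1d_valid(np.roll(signal, 1), kernel)
# Shifting the input shifts the output the same way (translation equivariance).
print(np.allclose(out[:-1], shifted[1:]))
```

This is why CNNs tolerate translated inputs so well: the detector does not need to relearn a pattern at every location, because every location uses the same weights.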
Feedforward Neural Network: This is one of the simplest forms of neural network, in which data travels in one direction only: it enters through the input nodes and exits at the output nodes. The network may or may not have hidden layers. It has a forward-propagated wave only, with no backpropagation, and usually uses a classifying activation function.
The sum of the products of the inputs and their weights is calculated and fed to the output. The output is compared against a threshold (usually 0): if the sum is above it, the neuron fires and emits the activated value (usually 1); otherwise, the deactivated value is emitted (usually -1).
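The firing rule above is a single threshold unit; a minimal sketch, with illustrative weights and inputs chosen so that one example fires and the other does not:

```python
import numpy as np

def perceptron(x, w, threshold=0.0):
    """Single feedforward unit: weighted sum of inputs, then a hard
    threshold. Emits the activated value (1) above the threshold and
    the deactivated value (-1) otherwise."""
    activation = np.dot(x, w)
    return 1 if activation > threshold else -1

# A unit whose weights favor the first input (values are made up).
w = np.array([0.9, -0.4])
print(perceptron(np.array([1.0, 0.5]), w))   # 0.9 - 0.2 = 0.7 > 0, fires: 1
print(perceptron(np.array([0.2, 1.0]), w))   # 0.18 - 0.4 < 0, does not fire: -1
```

Everything here runs in the forward direction only; a trainable network would add a rule for updating `w`, but the forward pass is exactly this weighted sum and threshold.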
Applications of feedforward neural networks are found in computer vision and speech recognition, where classifying the target classes is complicated. These kinds of neural networks are responsive to noisy data and easy to maintain.