The use of graphics processing units (GPUs), which can parallelise computational tasks to a very high degree, has brought the field of neural networks and its associated machine learning techniques, which have been around for almost half a century, back into focus.

Several architectures, such as the convolutional neural network (CNN) and the feed-forward NN, make it possible to estimate a model function f that connects an input value x with a given output y. The input could, for example, be a stack of blurred images coming from a microscope, whereas the output comes from deblurred or ground-truth data. A learning procedure such as the so-called back-propagation method tries to minimize a given cost or error function, which could for instance be the RMS error between the target data y and the data propagated through the NN, y' = f(x). By varying the model parameters, or weights, inside the NN, this error is reduced.
A neural network is therefore able to approximate any given function within certain bounds without knowing the function explicitly.
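As a minimal sketch of this idea, the following Python/NumPy snippet trains a tiny feed-forward network by back-propagation to minimize the mean squared error between its output y' = f(x) and a target y. The toy data (a sine curve), the 1-16-1 layer sizes and the learning rate are illustrative assumptions, not anything prescribed by the text above.

# Minimal feed-forward NN trained with back-propagation (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = sin(x) without knowing the formula explicitly.
x = rng.uniform(-np.pi, np.pi, size=(200, 1))
y = np.sin(x)

# Model parameters (weights) of an assumed 1-16-1 network.
W1 = rng.normal(scale=0.5, size=(1, 16))
b1 = np.zeros((1, 16))
W2 = rng.normal(scale=0.5, size=(16, 1))
b2 = np.zeros((1, 1))

lr = 0.05  # assumed learning rate
for step in range(5000):
    # Forward pass: propagate x through the network to obtain y' = f(x).
    h = np.tanh(x @ W1 + b1)
    y_pred = h @ W2 + b2

    # Error function: mean squared error between prediction and target.
    err = y_pred - y
    loss = np.mean(err ** 2)

    # Backward pass: gradients of the error with respect to every weight.
    grad_y = 2 * err / len(x)
    grad_W2 = h.T @ grad_y
    grad_b2 = grad_y.sum(axis=0, keepdims=True)
    grad_h = grad_y @ W2.T * (1 - h ** 2)   # derivative of tanh
    grad_W1 = x.T @ grad_h
    grad_b1 = grad_h.sum(axis=0, keepdims=True)

    # Vary the weights in the direction that reduces the error.
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1
    W2 -= lr * grad_W2
    b2 -= lr * grad_b2

print(f"final mean squared error: {loss:.4f}")

After a few thousand update steps the error shrinks and the network reproduces the sine curve on the training interval, which is exactly the "estimate a function without knowing it" behaviour described above.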

Applications

The field of so-called “Big Data” tries to find relationships and correlations in a given dataset.
