Boltzmann Machine

A Boltzmann machine is a type of stochastic recurrent neural network invented by Geoffrey Hinton and Terry Sejnowski. Boltzmann machines can be seen as the stochastic, generative counterpart of Hopfield networks. They were among the first neural networks capable of learning internal representations, and they can represent and (given sufficient time) solve difficult combinatorial problems. However, because of a number of issues discussed below, Boltzmann machines with unconstrained connectivity have not proven useful for practical problems in machine learning or inference. They remain theoretically intriguing because of the locality and Hebbian nature of their training algorithm, their parallelism, and the resemblance of their dynamics to simple physical processes. If the connectivity is constrained, the learning can be made efficient enough to be useful for practical problems.

They are named after the Boltzmann distribution in statistical mechanics, which is used in their sampling function.
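Concretely, in the standard unit-update rule a unit switches on with a logistic probability derived from the Boltzmann distribution over the network's energy. The sketch below illustrates one such Gibbs-sampling update; the names (a symmetric, zero-diagonal weight matrix W, bias vector b, binary state vector s, temperature T) are illustrative assumptions, not definitions taken from this article.

```python
import numpy as np

def sample_unit(s, W, b, i, T=1.0, rng=None):
    """One Gibbs-sampling update of unit i in a Boltzmann machine.

    W is assumed symmetric with zero diagonal, b is the bias vector,
    s the current binary (0/1) state vector, and T the temperature.
    """
    rng = rng or np.random.default_rng()
    # Energy gap between unit i being on and off: sum_j W[i, j] * s[j] + b[i].
    delta_E = W[i] @ s + b[i]
    # Boltzmann (logistic) probability that the unit switches on.
    p_on = 1.0 / (1.0 + np.exp(-delta_E / T))
    s[i] = 1 if rng.random() < p_on else 0
    return s

# Illustrative usage: repeatedly update randomly chosen units so the
# network's state distribution approaches the Boltzmann distribution.
rng = np.random.default_rng(0)
n = 5
W = rng.normal(size=(n, n))
W = (W + W.T) / 2          # make the weights symmetric
np.fill_diagonal(W, 0.0)   # no self-connections
b = np.zeros(n)
s = rng.integers(0, 2, size=n)
for _ in range(100):
    sample_unit(s, W, b, rng.integers(n), rng=rng)
```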

