The Boltzmann Machine by Geoffrey Hinton (1984)
The stochastic dynamics
For example, the quadratic energy function in (4) can be replaced by an energy function that has a typical term of the form s_i s_j s_k w_{ijk}.
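For concreteness, here is a minimal NumPy sketch of the two energy functions; the function names, symmetry conventions, and normalizing factors are illustrative assumptions, not from the source:

```python
import numpy as np

def quadratic_energy(s, b, W):
    """Standard quadratic energy: E = -sum_i b_i s_i - sum_{i<j} w_ij s_i s_j."""
    # W is symmetric with a zero diagonal; the factor 1/2 corrects for
    # counting each pair (i, j) twice in the full double sum.
    return -(s @ b) - 0.5 * (s @ W @ s)

def third_order_energy(s, W3):
    """Higher-order generalization with typical term -w_ijk s_i s_j s_k."""
    # W3 is symmetric under any permutation of its indices and zero when
    # indices repeat; the factor 1/6 corrects for the 3! orderings of a triple.
    return -np.einsum('i,j,k,ijk->', s, s, s, W3) / 6.0
```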
conditional Boltzmann machine
The binary stochastic units used in Boltzmann machines can be generalized to "softmax" units that have more than two discrete values; Gaussian units whose output is simply their total input plus Gaussian noise; binomial units; Poisson units; and any other type of unit that falls in the exponential family (Welling et al. 2005).
This family is characterized by the fact that the adjustable parameters have linear effects on the log probabilities.
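A small sketch of how this generalization looks in code, assuming the total inputs have already been computed (the names and two-function framing are mine): the softmax unit makes the "linear effect on the log probabilities" explicit, because the inputs enter directly as logits, and the binary unit is just the two-value special case with inputs (0, z).

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_binary_unit(z):
    """Binary stochastic unit: turns on with probability logistic(z)."""
    return int(rng.random() < 1.0 / (1.0 + np.exp(-z)))

def sample_softmax_unit(z):
    """Softmax unit: picks one of K discrete values, where z[k] is the
    total input favoring value k.  The parameters act linearly on the
    log probabilities, the defining exponential-family property."""
    logits = z - z.max()                      # subtract max for stability
    p = np.exp(logits) / np.exp(logits).sum()
    return rng.choice(len(p), p=p)
```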
A Boltzmann machine is a network of symmetrically connected, neuron-like units that make stochastic decisions about whether to be on or off.
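A minimal sketch of that stochastic decision rule, under the usual assumptions (symmetric weights with zero diagonal, logistic acceptance probability); the variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_update(i, s, W, b):
    """Let unit i make its stochastic on/off decision.

    With symmetric weights W (zero diagonal) and biases b, the energy gap
    from turning unit i on is z = b[i] + sum_j s[j] * W[i, j], and the
    unit adopts state 1 with probability logistic(z)."""
    z = b[i] + s @ W[i]
    s[i] = int(rng.random() < 1.0 / (1.0 + np.exp(-z)))
    return s
```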
Boltzmann machines were designed to model both the settling and the learning, and were based on two seminal ideas that appeared in 1982. The brain achieves both the settling and the learning using spiking neurons which, over a period of a few milliseconds, have a state of 1 or 0. These neurons have intrinsic noise caused by the quantal release of vesicles of neurotransmitter at the synapses between them.
Restricted Boltzmann Machines
A restricted Boltzmann machine consists of a layer of visible units and a layer of hidden units, with no visible-visible or hidden-hidden connections.
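Because all connections run between the two layers, the units within a layer are conditionally independent given the other layer, so an entire layer can be sampled in one parallel step. A minimal NumPy sketch, assuming W has shape (visible, hidden); the names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def sample_hidden(v, W, b_h):
    """All hidden units can be sampled in parallel given the visible
    layer, because there are no hidden-hidden connections."""
    p_h = sigmoid(v @ W + b_h)
    return (rng.random(p_h.shape) < p_h).astype(float), p_h

def sample_visible(h, W, b_v):
    """Likewise for the visible layer given the hidden layer."""
    p_v = sigmoid(h @ W.T + b_v)
    return (rng.random(p_v.shape) < p_v).astype(float), p_v
```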
Learning Deep Networks
After learning one hidden layer, the activity vectors of the hidden units, when they are being driven by the real data, can be treated as "data" for training another restricted Boltzmann machine.
The whole network can then be viewed as a single, multilayer generative model, and each additional hidden layer improves a lower bound on the log probability that the model assigns to the training data.
Learning one hidden layer at a time is a very effective way to learn deep neural networks with many hidden layers and millions of weights.
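A compact sketch of the greedy layer-wise procedure, assuming one-step contrastive divergence (CD-1) for each restricted Boltzmann machine and using the hidden-unit probabilities as the next layer's data; the hyperparameters and names are illustrative, not Hinton's exact recipe:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, epochs=10, lr=0.1):
    """Crude CD-1 training of one RBM (no momentum or weight decay)."""
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)
    for _ in range(epochs):
        v0 = data
        p_h0 = sigmoid(v0 @ W + b_h)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        p_v1 = sigmoid(h0 @ W.T + b_v)            # one reconstruction step
        p_h1 = sigmoid(p_v1 @ W + b_h)
        W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(data)
        b_v += lr * (v0 - p_v1).mean(axis=0)
        b_h += lr * (p_h0 - p_h1).mean(axis=0)
    return W, b_h

def train_stack(data, layer_sizes):
    """Greedy layer-wise training: each layer's hidden activities, driven
    by the data, become the 'data' for the next RBM."""
    layers = []
    for n_hidden in layer_sizes:
        W, b_h = train_rbm(data, n_hidden)
        layers.append((W, b_h))
        data = sigmoid(data @ W + b_h)   # probabilities as next-layer data
    return layers
```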