Brain-State-in-a-Box Network
The Brain-State-in-a-Box (BSB) neural network is a nonlinear auto-associative network that can be extended to hetero-association with two or more layers. It is similar to the Hopfield network. It was proposed by J.A. Anderson, J.W. Silverstein, S.A. Ritz and R.S. Jones in 1977.
Some important points to remember about BSB Network −
It is a fully connected network with the maximum number of nodes depending upon the dimensionality n of the input space.
All the neurons are updated simultaneously.
Neurons take values between -1 and +1.
Mathematical Formulations
The node function used in BSB network is a ramp function, which can be defined as follows −
$$f(net)\:=\:min(1,\:max(-1,\:net))$$
This ramp function is bounded and continuous.
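As a sketch, the ramp function above can be written directly in NumPy (the name `ramp` is illustrative, not part of any standard API):

```python
import numpy as np

def ramp(net):
    # BSB ramp activation: f(net) = min(1, max(-1, net)),
    # i.e. the net input clipped to the interval [-1, +1]
    return np.minimum(1.0, np.maximum(-1.0, net))
```

Because it works element-wise, the same function can be applied to a scalar net input or to a whole state vector at once.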
Each node updates its state according to the following relation −
$$x_{i}(t\:+\:1)\:=\:f\left(\displaystyle\sum\limits_{j=1}^n w_{i,j}x_{j}(t)\right)$$
Here, $x_{i}(t)$ is the state of the ith node at time t.
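The synchronous update of all nodes can be sketched as a single matrix-vector product followed by the ramp clipping (a minimal illustration; the function name `bsb_update` is an assumption):

```python
import numpy as np

def bsb_update(W, x):
    # One synchronous BSB step: x(t+1) = f(W x(t)),
    # where f clips each component to [-1, +1] (the ramp function)
    return np.clip(W @ x, -1.0, 1.0)
```

Since every neuron is updated simultaneously, one call advances the whole state vector by a single time step; iterating the call drives the state toward a corner of the hypercube [-1, +1]^n.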
The weight from the ith node to the jth node can be computed with the following relation −
$$w_{i,j}\:=\:\frac{1}{P}\displaystyle\sum\limits_{p=1}^P v_{p,i}\:v_{p,j}$$
Here, P is the number of training patterns, which are bipolar.
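The weight formula above is a Hebbian-style average of outer products of the stored patterns, which can be sketched as follows (assuming the patterns are given as a P × n array of ±1 values; `bsb_weights` is an illustrative name):

```python
import numpy as np

def bsb_weights(patterns):
    # patterns: P x n array of bipolar (+1/-1) training vectors v_p.
    # w_{i,j} = (1/P) * sum_p v_{p,i} * v_{p,j}, i.e. the average
    # outer product of the patterns with themselves.
    V = np.asarray(patterns, dtype=float)
    P = V.shape[0]
    return (V.T @ V) / P
```

The resulting weight matrix is symmetric, so each stored pattern is a stable corner that the iterated update tends to recover from a nearby starting state.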