The dynamics of neural networks and the dynamics of classical spin systems
are strikingly similar in several respects. The storage of information
(firing patterns) or of stimulus-response schemes in neural networks,
for instance, can formally be described as the construction of attractors
in the dynamics of spin systems. This construction is achieved by giving
suitable values to the exchange couplings between the spins, which take the role of
synapses in neural networks. In this construction, the excitatory or
inhibitory nature of a synapse corresponds to a ferromagnetic or
antiferromagnetic spin-spin interaction. The search for suitable values of synaptic
interactions in a neural net can be done iteratively through a process of
learning or training.
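The construction described above can be illustrated with a minimal Hopfield-type sketch (a standard model of this class, not taken from the papers below; sizes and noise level are illustrative assumptions): random patterns are stored as attractors by choosing Hebbian exchange couplings, and zero-temperature spin dynamics then retrieves a stored pattern from a corrupted cue.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: N binary spins, p random patterns to store.
N, p = 200, 5
patterns = rng.choice([-1, 1], size=(p, N))

# Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-coupling.
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def recall(s, sweeps=10):
    """Zero-temperature asynchronous dynamics: each spin aligns with
    its local field h_i = sum_j J_ij s_j."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Corrupt pattern 0 by flipping 20% of its spins ...
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)
probe[flip] *= -1

# ... and let the dynamics relax back towards the stored attractor.
fixed = recall(probe)
overlap = fixed @ patterns[0] / N  # close to 1 for successful retrieval
```

With the loading p/N well below the saturation capacity, the corrupted cue falls into the basin of attraction of the stored pattern and the overlap returns to (nearly) one.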
In the course of the last ten years or so, we have addressed questions
concerned with the storage capacity of neural networks, with the neural
code (low-activity and hierarchically organized patterns), with
the storage and representation of sequences, with analog or graded-response
neuron systems, with learning algorithms, with preprocessing in feed-forward
systems, with the role of dream sleep (unlearning), and more.
Recent talks and a list of publications can be found below.
Learning with incomplete information in the Committee Machine
U.M. Bergmann, R. Kühn, and I.O. Stamatescu,
Biol. Cybern. 99, 401-410 (2009)
Learning with incomplete information and the mathematical
structure behind it
R. Kühn and I.O. Stamatescu,
Biol. Cybern. 97, 99-112 (2007)
Representation and Coding of Stimuli by a Population of Neurons I:
The Stationary Regime
preprint, submitted to J. Comp. Neurosci.
Adaptivity and Learning - An Interdisciplinary Debate,
edited by R. Kühn, R. Menzel, W. Menzel, U. Ratsch, M.M. Richter,
and I.O. Stamatescu, (Springer, Berlin, 2003)
Learning Structured Data from Unspecific Reinforcement,
M. Biehl, R. Kühn, and I.O. Stamatescu,
J. Phys. A 33, 6843-6857 (2000)
A Two Step Algorithm for Learning from Unspecific Reinforcement,
R. Kühn and I.O. Stamatescu,
J. Phys. A 32, 5749-5762 (1999)
Neural Networks, H. Horner and R. Kühn, in:
Intelligence and Artificial Intelligence, an Interdisciplinary Debate,
edited by U. Ratsch, M. Richter, and I.O. Stamatescu (Springer, Heidelberg 1998)
Averaging and Finite Size Analysis for Disorder: The Hopfield Model,
T. Stiefvater, K. R. Müller, and R. Kühn, Physica A 232, 61-73 (1996)
Multiplicity of Metastable Retrieval Phases in Networks of Multistate
Neurons, S. Bös and R. Kühn, J. Stat. Phys. 76, 1495-1504
Replica Symmetry Breaking in Attractor Neural Network Models,
H. Steffan and R. Kühn, Z. Phys. B 95, 249-260 (1994)
Storage Capacity of a Two-Layer Perceptron with Fixed Preprocessing
in the First Layer, A. Bethge, R. Kühn, and H. Horner,
J. Phys. A 27, 1929-1937 (1994).
Multifractality in Forgetful Memories, U. Behn, J.L. van Hemmen,
R. Kühn, A. Lange and V.A. Zagrebnov, Physica D 68, 401-415
Optimal Capacities for Graded-Response Perceptrons, D. Bollé,
R. Kühn, and J. van Mourik, J. Phys. A 26, 3149-3158 (1993)
Statistical Mechanics for Neural Networks with Continuous-Time
Dynamics, R. Kühn and S. Bös, J. Phys. A 26, 831-857 (1993).
Statistical Mechanics for Networks of Graded-Response Neurons,
R. Kühn, S. Bös and J.L. van Hemmen, Phys. Rev. A 43,
Increasing the Efficiency of a Neural Network Through Unlearning,
J.L. van Hemmen, L.B. Ioffe, R. Kühn, and M. Vaas, in: Proceedings
of the STATPHYS 17 Conference, edited by C. Tsallis, Physica A 163,
Increased Storage Capacity for Hierarchically Structured Information
in a Neural Network of Ising Type, L.B. Ioffe, R. Kühn, and J.L.
van Hemmen, J. Phys. A 22, L1037-L1041 (1989).
Hebbian learning reconsidered: Representation of Static and Dynamic
Objects in Associative Neural Nets, A. Herz, B. Sulzer, R. Kühn
and J.L. van Hemmen, Biol. Cybern. 60, 457-467 (1989).
Complex Temporal Association in Neural Networks, R. Kühn,
J.L. van Hemmen, and U. Riedel, J. Phys. A 22, 3123-3135 (1989).
The Hebb Rule: Representation of Static and Dynamic Objects in
an Associative Neural Network, A. Herz, B. Sulzer, R. Kühn, and J.L.
van Hemmen, Europhys. Lett. 7, 663-669 (1988)
Forgetful Memories, J.L. van Hemmen, G. Keller, and R.
Kühn, Europhys. Lett. 5, 663-668 (1988)
Temporal Sequences and Chaos in Neural Nets, U. Riedel, R.
Kühn, and J.L. van Hemmen, Phys. Rev. A 38, 1105-1108 (1988).
Martingale Approach to Neural Networks with Hierarchically
Structured Information, S. Bös, R. Kühn, and J.L. van Hemmen, Z. Phys.
B 71, 261-271 (1988)
Nonlinear Neural Networks: II. Information Processing, J.L.
van Hemmen, D. Grensing, A. Huber, and R. Kühn, J. Stat. Phys. 50,
Nonlinear Neural Networks: I. General Theory, J.L. van Hemmen,
D. Grensing, A. Huber, and R. Kühn, J. Stat. Phys. 50, 231-257
Storing Patterns in a Spin-Glass Model of Neural Networks Near
Saturation, D. Grensing, R. Kühn, and J.L. van Hemmen, J. Phys. A
20, 2935-2947 (1987)
Nonlinear Neural Networks, J.L. van Hemmen and R. Kühn,
Phys. Rev. Lett. 57, 913-916 (1986)