Frontiers of Electrical and Electronic Engineering

ISSN 2095-2732

ISSN 2095-2740(Online)

CN 10-1028/TM

Front Elect Electr Eng    2012, Vol. 7 Issue (2) : 200-207    https://doi.org/10.1007/s11460-011-0176-5
RESEARCH ARTICLE
An extended SHESN with leaky integrator neuron and inhibitory connection for Mackey-Glass prediction
Bo YANG, Zhidong DENG()
State Key Laboratory of Intelligent Technology and Systems, Tsinghua National Laboratory for Information Science and Technology, Department of Computer Science and Technology, Beijing 100084, China
Abstract

Echo state network (ESN), proposed by Jaeger in 2001, has remarkable capabilities for approximating the dynamics of complex systems such as the Mackey-Glass problem. Compared with the ESN, the scale-free highly-clustered ESN (SHESN), whose state reservoir exhibits both the small-world phenomenon and a scale-free feature, achieves even stronger approximation of dynamics and a better echo state property. In this paper, we extend the state reservoir of the SHESN with leaky integrator neurons and inhibitory connections, inspired by advances in neurophysiology. We apply the extended SHESN, called e-SHESN, to the Mackey-Glass prediction problem. The experimental results show that the e-SHESN considerably outperforms the SHESN in predicting the Mackey-Glass chaotic time series, while the interesting complex-network characteristics of the state reservoir, including the small-world property and the scale-free feature, remain unchanged. In addition, we unveil that the original SHESN may be unstable in some cases, and show that the proposed e-SHESN model addresses this flaw by enhancing network stability. Specifically, replacing linear regression with ridge regression further improves the stability of the e-SHESN.
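The leaky integrator extension described in the abstract can be illustrated as a reservoir update in which each neuron's state is a leaky blend of its previous state and its new activation. This is only a minimal sketch: the reservoir size, leaking rate, weight ranges, and the external input drive below are illustrative assumptions (Fig.1's network actually uses output feedback and no input units), not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100            # reservoir size (illustrative; far smaller than the paper's)
leak = 0.3         # leaking rate of the leaky integrator neuron (assumed value)
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius below 1
W_in = rng.uniform(-1.0, 1.0, (N, 1))             # input weights (hypothetical)

def step(x, u):
    """One leaky-integrator update: the old state decays while the
    new tanh activation leaks in at rate `leak`."""
    return (1.0 - leak) * x + leak * np.tanh(W @ x + W_in * u)

x = np.zeros((N, 1))
for u in np.sin(0.1 * np.arange(200)):            # drive with a toy input signal
    x = step(x, u)
```

Because the update is a convex combination of the bounded tanh activation and the previous state, every neuron's state stays inside (-1, 1), which is one reason the leaky formulation tends to damp instabilities.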

Keywords echo state network (ESN)      e-SHESN      Mackey-Glass problem      small-world phenomenon      scale-free distribution      ridge regression     
Corresponding Author(s): DENG Zhidong, Email: michael@tsinghua.edu.cn
Issue Date: 05 June 2012
 Cite this article:   
Bo YANG,Zhidong DENG. An extended SHESN with leaky integrator neuron and inhibitory connection for Mackey-Glass prediction[J]. Front Elect Electr Eng, 2012, 7(2): 200-207.
 URL:  
https://academic.hep.com.cn/fee/EN/10.1007/s11460-011-0176-5
https://academic.hep.com.cn/fee/EN/Y2012/V7/I2/200
Fig.1  e-SHESN network with no input units
Fig.2  1000 neurons are placed on a 300×300 grid plane according to the naturally incremental growth rules. There are 80% excitatory neurons (circle) and 20% inhibitory neurons (cross) in the reservoir
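The 80/20 split between excitatory and inhibitory neurons in Fig.2 can be imitated, as a rough sketch, by fixing the sign of each neuron's outgoing weights in the reservoir matrix. The 20% ratio comes from the caption; the weight distribution and the matrix orientation (columns as outgoing weights) are assumptions made here for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 1000                                   # reservoir size, as in Fig.2
inhibitory = rng.random(N) < 0.2           # ~20% inhibitory neurons
W = np.abs(rng.standard_normal((N, N)))    # unsigned connection strengths
W[:, inhibitory] *= -1.0                   # inhibitory neurons: all outgoing
                                           # weights forced negative
```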
Fig.3  Log-log plot of outdegree of neurons versus rank. The correlation coefficient is 0.9912 with a p-value of 0. The absolute value of the fitted line's slope is 0.578
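The power-law check behind Fig.3 (fit a line to log(outdegree) versus log(rank), then read off the slope and the correlation coefficient) can be sketched on synthetic data. The Zipf-distributed degrees below are placeholders standing in for the reservoir's actual outdegrees, which are not available here.

```python
import numpy as np

# Hypothetical outdegrees drawn from a power-law (Zipf) distribution,
# rank-ordered from largest to smallest as in a rank plot.
rng = np.random.default_rng(1)
deg = np.sort(rng.zipf(2.0, 1000))[::-1].astype(float)
rank = np.arange(1, deg.size + 1)

# Linear fit in log-log space: log(deg) ~ slope * log(rank) + intercept.
# A scale-free rank plot shows up as a straight line with negative slope
# and a correlation coefficient close to -1.
slope, intercept = np.polyfit(np.log(rank), np.log(deg), 1)
r = np.corrcoef(np.log(rank), np.log(deg))[0, 1]
```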
name         τ    points   transformation
data set 1   17   4000     tanh(x-1)
data set 2   30   4000     0.3tanh(x-1)+0.2
Tab.1  Data sets for Mackey-Glass sequence approximation and prediction
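The data sets in Tab.1 come from the Mackey-Glass delay differential equation, dx/dt = 0.2·x(t-τ)/(1+x(t-τ)^10) - 0.1·x(t), with delay τ = 17 or 30. A simple Euler discretization with step dt = 1 and a constant initial history of 1.2 is sketched below; these integration details are common defaults and assumptions here, not necessarily the paper's exact setup.

```python
import numpy as np

def mackey_glass(n, tau=17, dt=1.0):
    """Euler integration of dx/dt = 0.2*x(t-tau)/(1+x(t-tau)**10) - 0.1*x(t),
    starting from a constant history of 1.2. Returns n samples."""
    hist = int(tau / dt)
    x = [1.2] * (hist + 1)                 # constant initial history
    for _ in range(n - 1):
        x_t, x_tau = x[-1], x[-1 - hist]
        x.append(x_t + dt * (0.2 * x_tau / (1.0 + x_tau ** 10) - 0.1 * x_t))
    return np.array(x[hist:])

series = mackey_glass(4000, tau=17)        # data set 1: tau = 17, 4000 points
scaled = np.tanh(series - 1.0)             # transformation listed in Tab.1
```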
Fig.4  500-step teacher subsequence (solid) and training network output subsequences (dashed). (a) e-SHESN runs with data set 1; (b) e-SHESN runs with data set 2
Fig.5  300-step teacher subsequence (solid) and training network output subsequences (dashed). (a) e-SHESN runs with data set 1; (b) e-SHESN runs with data set 2
Fig.6  Attractors of teacher and network output sequence. Network runs with data set 2. (a) Attractor of teacher sequence; (b) attractor of network output sequence
Fig.7  Network output becomes unstable. Output plot (dashed) diverges from actual Mackey-Glass sequence (solid)
                                  SHESN      SHESN (with leaky integration neurons)   e-SHESN (without leaky integration neurons)
average frequency                 13/50      11/50                                    2/50
average onset of unstable state   343/1000   352/1000                                 551/1000
Tab.2  Comparison of the two networks on Mackey-Glass prediction (using linear regression method)
                                  SHESN    e-SHESN   e-SHESN (without leaky integration neurons)
average frequency                 0/50     0/50      0/50
average onset of unstable state   0/1000   0/1000    0/1000
Tab.3  Comparison of the two networks on Mackey-Glass prediction (using ridge regression method)
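Tab.2 and Tab.3 contrast readouts trained with plain linear regression against ridge regression, which adds a small regularization term λI to the normal equations and, per Tab.3, eliminates the unstable runs. A minimal sketch of such a readout solve is below; the state and output dimensions, data, and λ value are illustrative assumptions.

```python
import numpy as np

def ridge_readout(X, Y, lam=1e-6):
    """Solve W_out = Y X^T (X X^T + lam*I)^(-1).
    With lam = 0 this reduces to the plain linear-regression readout,
    which Tab.2 shows can leave the trained network unstable."""
    n = X.shape[0]
    return Y @ X.T @ np.linalg.inv(X @ X.T + lam * np.eye(n))

rng = np.random.default_rng(2)
X = rng.standard_normal((50, 1000))        # collected reservoir states (toy)
W_true = rng.standard_normal((1, 50))
Y = W_true @ X                             # teacher outputs for this toy case
W_out = ridge_readout(X, Y, lam=1e-6)      # recovers W_true up to tiny shrinkage
```

The regularizer keeps the learned output weights small, which matters when the output is fed back into the reservoir: large readout weights amplify state noise and can push the autonomous run off the attractor.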
1 Seung H S. Learning in spiking neural networks by reinforcement of stochastic synaptic transmission. Neuron , 2003, 40(6): 1063-1073
doi: 10.1016/S0896-6273(03)00761-X pmid:14687542
2 Izhikevich E M. Simple model of spiking neurons. IEEE Transactions on Neural Networks , 2003, 14(6): 1569-1572
doi: 10.1109/TNN.2003.820440 pmid:18244602
3 Pavlidis N G, Tasoulis D K, Plagianakos V P, Vrahatis M N. Spiking neural network training using evolutionary algorithms. In: Proceedings of IEEE International Joint Conference on Neural Networks . 2005, 4: 2190-2194
4 Oja M, Kaski S, Kohonen T. Bibliography of self-organizing map (SOM) papers: 1998-2001 addendum. Neural Computing Surveys , 2002, 3(1): 1-156
5 Kohonen T. Self-Organization and Associative Memory. 3rd ed . New York, NY: Springer-Verlag, 1989
6 Bodén M. A guide to recurrent neural networks and backpropagation. SICS Technical Report T2002:03 , 2002
7 Jaeger H. The “echo state” approach to analyzing and training recurrent neural networks. GMD Technical Report 148 , 2001
8 Maass W, Natschläger T, Markram H. Real-time computing without stable states: A new framework for neural computation based on perturbations. Neural Computation , 2002, 14(11): 2531-2560
doi: 10.1162/089976602760407955 pmid:12433288
9 Jaeger H, Haas H. Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication. Science , 2004, 304(5667): 78-80
doi: 10.1126/science.1091277 pmid:15064413
10 Fette G, Eggert J. Short term memory and pattern matching with simple echo state networks. Lecture notes in Computer Science , 2005, 3696: 13-18
11 Jaeger H. A tutorial on training recurrent neural networks, covering BPPT, RTRL, EKF and the “echo state network” approach. GMD Report 159 , 2002
12 Mackey M C, Glass L. Oscillation and chaos in physiological control systems. Science , 1977, 197(4300): 287-289
doi: 10.1126/science.267326 pmid:267326
13 Deng Z D, Zhang Y. Collective behavior of a small-world recurrent neural system with scale-free distribution. IEEE Transactions on Neural Networks , 2007, 18(5): 1364-1375
doi: 10.1109/TNN.2007.894082 pmid:18220186
14 Jaeger H. Short term memory in echo state networks. GMD Technical Report 152 , 2002
15 Barabasi A L, Albert R. Emergence of scaling in random networks. Science , 1999, 286(5439): 509-512
doi: 10.1126/science.286.5439.509 pmid:10521342
16 Medina A, Matta I, Byers J. On the origin of power laws in Internet topologies. ACM SIGCOMM Computer Communication Review , 2000, 30(2): 18-28
doi: 10.1145/505680.505683
17 Watts D J, Strogatz S H. Collective dynamics of ‘small-world’ networks. Nature , 1998, 393(6684): 440-442
doi: 10.1038/30918 pmid:9623998
18 Crovella M, Harchol-Balter M, Murta C. Task assignment in a distributed system: Improving performance by unbalancing load. In: Proceedings of ACM Conference on Measurement and Modeling of Computer Systems . 1998, 268-269
19 Dayan P, Abbott L F. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Cambridge: The MIT Press, 2001
20 Kohonen T. Self-Organization and Associative Memory. 3rd ed . New York, NY: Springer-Verlag, 1989
21 Connors B W, Gutnick M J. Intrinsic firing patterns of diverse neocortical neurons. Trends in Neurosciences , 1990, 13(3): 99-104
doi: 10.1016/0166-2236(90)90185-D pmid:1691879
22 Abbott L F. Lapicque’s introduction of the integrate-and-fire model neuron (1907). Brain Research Bulletin , 1999, 50(5-6): 303-304
doi: 10.1016/S0361-9230(99)00161-6 pmid:10643408
23 Wyffels F, Schrauwen B, Stroobandt D. Stable output feedback in reservoir computing using ridge regression. Lecture Notes in Computer Science , 2008, 5163: 808-817