Memory Dynamics in Attractor Networks





Computational Intelligence and Neuroscience, Volume 2015, Article ID 191745, 7 pages

Research Article

Centre for Brain Inspired Computing Research CBICR, Department of Precision Instrument, Tsinghua University, Beijing 100084, China

Department of Advanced Concepts and Nanotechnology ACN, Data Storage Institute, A*STAR, 5 Engineer Drive 1, Singapore 117608

School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore 639798

Received 25 November 2014; Revised 30 January 2015; Accepted 30 January 2015

Academic Editor: Klaus Obermayer

Copyright © 2015 Guoqi Li et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Attractor networks, which can be represented by neurons and their synaptic connections, are widely believed to underlie biological memory systems and have been used extensively in recent years to model the storage and retrieval of memory. In this paper, we propose a new energy function that is nonnegative and attains zero only at the desired memory patterns. An attractor network is designed based on the proposed energy function. It is shown that the desired memory patterns are stored as the stable equilibrium points of the attractor network. To retrieve a memory pattern, an initial stimulus input is presented to the network, and its states converge to one of the stable equilibrium points. Consequently, the existence of spurious points, that is, local maxima, saddle points, or other local minima corresponding to undesired memory patterns, can be avoided. The simulation results show the effectiveness of the proposed method.
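The abstract does not give the paper's actual energy function, but the key property it describes, a nonnegative energy that is zero only at the stored patterns, with retrieval by descending the energy from a stimulus, can be illustrated with a minimal sketch. The nearest-pattern squared-distance energy used below is an assumption chosen for illustration, not the authors' construction:

```python
import numpy as np

def make_energy(patterns):
    """Nonnegative energy that is zero exactly at the stored patterns.

    Illustrative choice: E(x) = min_p ||x - p||^2. This is NOT the
    energy function from the paper, only one function with the
    properties the abstract describes.
    """
    def energy(x):
        return min(float(np.sum((x - p) ** 2)) for p in patterns)
    return energy

def retrieve(x0, patterns, lr=0.1, steps=200):
    """Retrieve a memory by (sub)gradient descent on the energy.

    The stimulus x0 flows toward the nearest stored pattern, which is
    a stable equilibrium (energy zero), so descent cannot terminate at
    a spurious state in this toy construction.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        # Subgradient of min_p ||x - p||^2: gradient of the nearest term.
        nearest = min(patterns, key=lambda p: np.sum((x - p) ** 2))
        x -= lr * 2.0 * (x - nearest)
    return x
```

For example, with stored patterns (1, 1) and (-1, -1), a stimulus such as (0.8, 0.6) converges to (1, 1); each descent step contracts the distance to the nearest pattern by a constant factor, so the state approaches the equilibrium geometrically.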





Authors: Guoqi Li, Kiruthika Ramanathan, Ning Ning, Luping Shi, and Changyun Wen

Source: https://www.hindawi.com/






