Towards Cortex Isomorphic Attractor Neural Networks
Abstract: In this thesis we model the mammalian cerebral cortex with attractor neural networks and study parallel implementations of these models. First, we review the size, structure, and scaling laws of the cerebral cortex of five mammals: mouse, rat, cat, macaque, and human. Characteristics of the cortex such as time scales, activity rates, and connectivity are also investigated. Based on how the cortex is vertically structured and modularized, we propose a generic model of cortex. In this model we make the assumption that the cortex, to a first approximation, operates as a fixed-point attractor memory. We review the field of attractor neural networks and focus on a special type called Potts neural networks.

Second, we implement the generic model of cortex with a BCPNN (Bayesian Confidence Propagating Neural Network). The cortical BCPNN model is formulated as an attractor neural network and is mainly used as an autoassociative memory. Based on the literature review and simulation experiments, we analyze the model with regard to storage capacity and scaling characteristics. The analysis of the model provides design principles and constraints for cortex-sized attractor neural networks.

Finally, we study parallel implementations of the BCPNN. We discuss the computational requirements of the cortical BCPNN model and some related issues. We analyze different levels of parallelism in the BCPNN and the associated communication requirements. An important result is that communication will not be limiting provided that we use spiking units (neurons). We take a closer look at how to implement the BCPNN on cluster computers. We also provide a brief review of attractor neural networks implemented in hardware and on parallel computers.
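The fixed-point attractor memory idea can be illustrated with a minimal sketch. The code below is a classical Hopfield-style autoassociative memory, not the thesis's BCPNN formulation; all sizes, the Hebbian outer-product storage rule, and the corruption level are illustrative assumptions. Patterns are stored in a symmetric weight matrix, and a corrupted cue is iterated until the network dynamics reach a fixed point, i.e. the stored pattern is retrieved.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store P random bipolar patterns in an N-unit network
# (Hebbian outer-product rule; illustrative sizes, not from the thesis).
N, P = 64, 5
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N          # symmetric weight matrix
np.fill_diagonal(W, 0.0)                 # no self-connections

def recall(cue, max_steps=50):
    """Iterate the network dynamics until a fixed point is reached."""
    s = cue.copy()
    for _ in range(max_steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1            # break ties deterministically
        if np.array_equal(s_new, s):     # fixed point: retrieval done
            break
        s = s_new
    return s

# Corrupt ~10% of the bits of a stored pattern, then recall it.
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
retrieved = recall(cue)
print(int(retrieved @ patterns[0]), "of", N, "bits agree")
```

At this low loading (P/N well below the classical 0.138 capacity limit), recall from a mildly corrupted cue converges to the stored pattern; the thesis analyzes the analogous storage capacity and scaling questions for the cortical BCPNN.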
The main contributions of this thesis are: (i) A review of the size, modularization, and computational structure of the mammalian cerebral cortex; (ii) A generic neural network model of the mammalian cortex; (iii) A thorough review of attractor neural networks and their properties; (iv) The computational requirements and constraints of a cortex-sized BCPNN; (v) Efficient implementation of large-scale BCPNNs on parallel computers; (vi) A fixed-point arithmetic implementation of the BCPNN learning rule.

Keywords: Attractor Neural Networks, Cerebral Cortex, Minicolumns, Hypercolumns, Potts Neural Networks, BCPNN, and Parallel Computers
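Contribution (vi) concerns fixed-point arithmetic. The sketch below shows only the general technique, assuming nothing about the thesis's actual rule or constants: probability estimates of the kind the BCPNN learning rule maintains are kept as integers scaled by a fixed power of two, and the exponentially weighted running average is computed with integer multiply and shift instead of floating point. The fraction width, learning rate, and activity stream are all invented for illustration.

```python
# Fixed-point sketch of tracking a unit's activity probability,
# as needed by a BCPNN-style learning rule (illustrative only).
FRAC_BITS = 16
ONE = 1 << FRAC_BITS            # fixed-point representation of 1.0

def to_fixed(x):
    return int(round(x * ONE))

def to_float(q):
    return q / ONE

def ewma_update(p_q, z_q, alpha_q):
    """Fixed-point exponentially weighted moving average:
    p <- p + alpha * (z - p), all quantities scaled by 2**FRAC_BITS."""
    return p_q + ((alpha_q * (z_q - p_q)) >> FRAC_BITS)

alpha_q = to_fixed(0.01)        # learning-rate constant (assumed value)
p_q = to_fixed(0.5)             # running estimate of activity probability

# Feed a stream in which the unit is active 80% of the time.
stream = ([ONE] * 4 + [0]) * 200
for z_q in stream:
    p_q = ewma_update(p_q, z_q, alpha_q)

# The estimate approaches 0.8; a BCPNN weight is then formed from such
# estimates, e.g. w_ij = log(p_ij / (p_i * p_j)).
print(to_float(p_q))
```

The shift-based update avoids floating point entirely, at the cost of a small truncation bias; analyzing such precision trade-offs is what a fixed-point implementation of the learning rule has to get right.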