The Digital Synaptic Neural Substrate

This book describes a new computational approach to creativity. With chess as the domain of investigation, the authors show experimentally how a computer can be imbued with the 'spark' of creativity that enables it to compose chess problems or puzzles that are both challenging and aesthetically appealing to humans. This new approach, called the Digital Synaptic Neural Substrate (DSNS), mimics the brain's ability to combine fragments of seemingly unrelated information from different domains (such as chess, photographs and music) to inspire itself to create new objects in any of them. Representing the cutting edge in computational creativity research, this book will be useful to students, educators and researchers in the field, as well as artificial intelligence (AI) practitioners in general.
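As a rough illustration of the cross-domain idea described here, the sketch below blends attribute vectors from two domains into target attributes for a new object. The attribute names and the simple averaging rule are hypothetical stand-ins for illustration, not the published DSNS procedure.

```python
# Illustrative sketch only: combining attribute values from objects in
# different domains (e.g., a chess problem and a photograph) to derive
# target attributes for a new object. Attribute names and the averaging
# rule are assumptions, not the published DSNS algorithm.

def blend_attributes(obj_a, obj_b):
    """Average the attribute values shared by two objects from
    (possibly) different domains, yielding targets for a new object."""
    shared = obj_a.keys() & obj_b.keys()
    return {key: (obj_a[key] + obj_b[key]) / 2 for key in shared}

chess_problem = {"complexity": 0.7, "sparseness": 0.4, "symmetry": 0.2}
photograph = {"complexity": 0.3, "sparseness": 0.8, "brightness": 0.9}

# Targets that a composer could then try to satisfy in a new chess problem.
targets = blend_attributes(chess_problem, photograph)
print(targets)
```

Only the attributes common to both objects survive the blend; domain-specific attributes (symmetry, brightness) are dropped.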

Chesthetica's Book of Chess Constructs, Volume 2

Chesthetica's Book of Chess Constructs, Volume 2 features 60 original three-move, four-move, five-move and study-like chess problems created by the world's most advanced automatic chess problem composer using the Digital Synaptic Neural Substrate (DSNS) computational creativity approach. This book is suitable for intermediate players and above interested in puzzles that are challenging yet not too esoteric. It also serves as a useful reference to enthusiasts and scientists in the field of artificial intelligence (AI).

Chesthetica's Book of Chess Constructs, Volume 4

Chesthetica's Book of Chess Constructs, Volume 4 features 80 original chess problems by the world's most advanced automatic chess problem composer. Chesthetica incorporates the 'Digital Synaptic Neural Substrate' (DSNS) computational creativity approach. This book contains three-movers, four-movers, five-movers and study-like constructs chosen by the author for your analysis and enjoyment. The solutions are also provided in the last chapter. Accessible to researchers, enthusiasts and players of all levels, only basic knowledge of chess play and notation is assumed.

Chesthetica's Book of Chess Constructs, Volume 1

Chesthetica's Book of Chess Constructs, Volume 1 features 50 original three-move chess problems created by the world's most advanced automatic chess problem composer using various artificial intelligence (AI) approaches including, primarily, the new Digital Synaptic Neural Substrate (DSNS) technology. This book is suitable for beginners and intermediate players interested in puzzles that are not too complicated and resemble what might occur in actual games yet are aesthetically pleasing or interesting. It can also serve as a useful reference to researchers and scientists in the field of AI, specifically computational creativity.

Digital Information Processing and Communications

In this way, the new substrate will undergo some changes (in the receptors' conformations), inducing modified binding affinity degrees of the neuron. Consequently, changes of the binding affinity degrees induce modified synaptic ...

Digital Information Processing and Communications

This two-volume-set (CCIS 188 and CCIS 189) constitutes the refereed proceedings of the International Conference on Digital Information Processing and Communications, ICDIPC 2011, held in Ostrava, Czech Republic, in July 2011. The 91 revised full papers of both volumes presented together with 4 invited talks were carefully reviewed and selected from 235 submissions. The papers are organized in topical sections on network security; Web applications; data mining; neural networks; distributed and parallel processing; biometrics technologies; e-learning; information ethics; image processing; information and data management; software engineering; data compression; networks; computer security; hardware and systems; multimedia; ad hoc network; artificial intelligence; signal processing; cloud computing; forensics; security; software and systems; mobile networking; and some miscellaneous topics in digital information and communications.

Digital brain atlases

For making the neural circuit model, we need not only the site of synaptic connection but also the channel ... Motion sensitive descending interneurons, ocellar LD neurons and neck motoneurons in the bee: a neural substrate for visual ...

The Digital Image and Reality

... and what alterations they passively make when they take up residence in the neural substrate of our memory. ... of 'virtual' media experience as much as of 'real' experience, cemented in the synaptic connections of the brain.

The Digital Image and Reality

The media technologies that surround and suffuse our everyday life profoundly affect our relation to reality. Philosophers since Plato and Aristotle have sought to understand the complex influence of apparently simple tools of expression on our understanding and experience of the world, time, space, materiality and energy. The Digital Image and Reality takes up this crucial philosophical task for our digital era. This rich yet accessible work argues that when new visual technologies arrive to represent and simulate reality, they give rise to nothing less than a radically different sensual image of the world. Through engaging with post-cinematic content and the new digital formats in which it appears, Strutt uncovers and explores how digital image-making is integral to emergent modes of metaphysical reflection - to speculative futurism, optimistic nihilism, and ethical plasticity. Ultimately, he prompts the reader to ask whether the impact of digital image processes might go even beyond our subjective consciousness of reality, towards the synthesis of objective actuality itself.

Digital Endocasts

The cerebellar circuits have been found to exhibit long-term synaptic plasticity, indicating that ... These cerebro-cerebellar connections provide a neural substrate by which the cerebellum could contribute to higher cognitive functions ...

Digital Endocasts

This book is dedicated to a specific component of paleoneurology, probably the most essential one: endocasts. A series of original papers collected here focuses on describing methods and techniques that are dedicated to reconstruct and study fossil endocasts through computed tools. The book is particularly oriented toward hominid paleoneurology, although it also includes chapters on different taxa to provide a more general view of current perspectives and problems in evolutionary neuroanatomy. The first part of the book concerns techniques and tools to cast endocranial anatomy. The second part deals with computed morphometrics, and the third part is devoted to comparative neurobiology. Those who want to approach the field in general terms will find this book especially helpful, as will those researchers working with endocranial anatomy and brain evolution. The book will also be useful for researchers and graduate students in anthropology, bioarchaeology, medicine, and related fields.

From Neuron to Cognition via Computational Neuroscience

Operant matching is a generic outcome of synaptic plasticity based on the covariance between reward and neural activity. ... Tests on a cell assembly theory of the action of the brain, using a large digital computer.

From Neuron to Cognition via Computational Neuroscience

A comprehensive, integrated, and accessible textbook presenting core neuroscientific topics from a computational perspective, tracing a path from cells and circuits to behavior and cognition. This textbook presents a wide range of subjects in neuroscience from a computational perspective. It offers a comprehensive, integrated introduction to core topics, using computational tools to trace a path from neurons and circuits to behavior and cognition. Moreover, the chapters show how computational neuroscience—methods for modeling the causal interactions underlying neural systems—complements empirical research in advancing the understanding of brain and behavior. The chapters—all by leaders in the field, and carefully integrated by the editors—cover such subjects as action and motor control; neuroplasticity, neuromodulation, and reinforcement learning; vision; and language—the core of human cognition. The book can be used for advanced undergraduate or graduate level courses. It presents all necessary background in neuroscience beyond basic facts about neurons and synapses and general ideas about the structure and function of the human brain. Students should be familiar with differential equations and probability theory, and be able to pick up the basics of programming in MATLAB and/or Python. Slides, exercises, and other ancillary materials are freely available online, and many of the models described in the chapters are documented in the brain operation database, BODB (which is also described in a book chapter). Contributors Michael A. Arbib, Joseph Ayers, James Bednar, Andrej Bicanski, James J. Bonaiuto, Nicolas Brunel, Jean-Marie Cabelguen, Carmen Canavier, Angelo Cangelosi, Richard P. Cooper, Carlos R. Cortes, Nathaniel Daw, Paul Dean, Peter Ford Dominey, Pierre Enel, Jean-Marc Fellous, Stefano Fusi, Wulfram Gerstner, Frank Grasso, Jacqueline A. Griego, Ziad M. Hafed, Michael E. 
Hasselmo, Auke Ijspeert, Stephanie Jones, Daniel Kersten, Jeremie Knuesel, Owen Lewis, William W. Lytton, Tomaso Poggio, John Porrill, Tony J. Prescott, John Rinzel, Edmund Rolls, Jonathan Rubin, Nicolas Schweighofer, Mohamed A. Sherif, Malle A. Tagamets, Paul F. M. J. Verschure, Nathan Vierling-Claasen, Xiao-Jing Wang, Christopher Williams, Ransom Winder, Alan L. Yuille

Emerging Memory and Computing Devices in the Era of Intelligent Machines

Building block of a programmable neuromorphic substrate: A digital neurosynaptic core. In Proceedings of the 2012 International Joint Conference on Neural Networks (IJCNN), Brisbane, Australia, 10–15 June 2012; pp. 1–8.

Emerging Memory and Computing Devices in the Era of Intelligent Machines

Computing systems are undergoing a transformation from logic-centric towards memory-centric architectures, where overall performance and energy efficiency at the system level are determined by the density, performance, functionality and efficiency of the memory, rather than the logic sub-system. This is driven by the requirements of data-intensive applications in artificial intelligence, autonomous systems, and edge computing. We are at an exciting time in the semiconductor industry where several innovative device and technology concepts are being developed to respond to these demands, and capture shares of the fast growing market for AI-related hardware. This special issue is devoted to highlighting, discussing and presenting the latest advancements in this area, drawing on the best work on emerging memory devices including magnetic, resistive, phase change, and other types of memory. The special issue is interested in work that presents concepts, ideas, and recent progress ranging from materials, to memory devices, physics of switching mechanisms, circuits, and system applications, as well as progress in modeling and design tools. Contributions that bridge across several of these layers are especially encouraged.

Space Time Computing with Temporal Neural Networks

“Building block of a programmable neuromorphic substrate: A digital neurosynaptic core.” The 2012 International Joint Conference on Neural Networks (IJCNN) (2012): 1–8. DOI: 10.1109/ijcnn.2012.6252637.

Space Time Computing with Temporal Neural Networks

Understanding and implementing the brain's computational paradigm is the one true grand challenge facing computer researchers. Not only are the brain's computational capabilities far beyond those of conventional computers, its energy efficiency is truly remarkable. This book, written from the perspective of a computer designer and targeted at computer researchers, is intended to give both background and lay out a course of action for studying the brain's computational paradigm. It contains a mix of concepts and ideas drawn from computational neuroscience, combined with those of the author. As background, relevant biological features are described in terms of their computational and communication properties. The brain's neocortex is constructed of massively interconnected neurons that compute and communicate via voltage spikes, and a strong argument can be made that precise spike timing is an essential element of the paradigm. Drawing from the biological features, a mathematics-based computational paradigm is constructed. The key feature is spiking neurons that perform communication and processing in space-time, with emphasis on time. In these paradigms, time is used as a freely available resource for both communication and computation. Neuron models are first discussed in general, and one is chosen for detailed development. Using the model, single-neuron computation is first explored. Neuron inputs are encoded as spike patterns, and the neuron is trained to identify input pattern similarities. Individual neurons are building blocks for constructing larger ensembles, referred to as "columns". These columns are trained in an unsupervised manner and operate collectively to perform the basic cognitive function of pattern clustering. Similar input patterns are mapped to a much smaller set of similar output patterns, thereby dividing the input patterns into identifiable clusters. Larger cognitive systems are formed by combining columns into a hierarchical architecture. 
These higher level architectures are the subject of ongoing study, and progress to date is described in detail in later chapters. Simulation plays a major role in model development, and the simulation infrastructure developed by the author is described.
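The space-time paradigm sketched above rests on neurons that compute with the timing of voltage spikes. As a point of reference, here is a minimal leaky integrate-and-fire (LIF) simulation showing how closely spaced input spikes can cross the firing threshold while an isolated spike cannot; the book develops its own neuron model, and the parameter values below are illustrative assumptions only.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch. Weight, time
# constant, and threshold are illustrative assumptions, not the model
# developed in the book.

def lif_spike_times(input_spikes, weight=0.6, tau=20.0, threshold=1.0, t_max=100):
    """Simulate one LIF neuron driven by input spike times (in ms steps).
    Returns the times at which the neuron fires."""
    v, out = 0.0, []
    spikes = set(input_spikes)
    for t in range(t_max):
        v *= (1.0 - 1.0 / tau)   # membrane leak
        if t in spikes:
            v += weight          # integrate an input spike
        if v >= threshold:       # fire and reset
            out.append(t)
            v = 0.0
    return out

# The closely spaced inputs at 5-6 ms push the membrane over threshold;
# the isolated input at 50 ms arrives after the potential has leaked away.
print(lif_spike_times([5, 6, 7, 50]))  # → [6]
```

The key point is that the same total input produces different outputs depending on its timing, which is what makes precise spike timing a computational resource.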

Advances in Neural Information Processing Systems 12

We have developed and tested an analog/digital VLSI system that models the coordination of biological segmental ... The neural substrates for these control mechanisms are called central pattern generators (CPG).

Advances in Neural Information Processing Systems 12

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes computer science, neuroscience, statistics, physics, cognitive science, and many branches of engineering, including signal processing and control theory. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. These proceedings contain all of the papers that were presented.

Neural Information Processing and VLSI

analog values can be provided as the synapse output in proportion to the stored digital weights [27, 28, 29, 30]. ... formed by an MOS transistor decays due to the leakage current through the diffusion-substrate junction.

Neural Information Processing and VLSI

Neural Information Processing and VLSI provides a unified treatment of this important subject for use in classrooms, industry, and research laboratories, in order to develop advanced artificial and biologically-inspired neural networks using compact analog and digital VLSI parallel processing techniques. Neural Information Processing and VLSI systematically presents various neural network paradigms, computing architectures, and the associated electronic/optical implementations using efficient VLSI design methodologies. Conventional digital machines cannot perform computationally-intensive tasks with satisfactory performance in such areas as intelligent perception, including visual and auditory signal processing, recognition, understanding, and logical reasoning (where the human being and even a small living animal can do a superb job). Recent research advances in artificial and biological neural networks have established an important foundation for high-performance information processing with more efficient use of computing resources. The secret lies in design optimization at various levels of computing and communication in intelligent machines. Each neural network system consists of massively parallel, distributed signal processors, with every processor performing very simple operations, thus consuming little power. The large computational capabilities of these systems, in the range of some hundred giga to several tera operations per second, are derived from collectively parallel processing and efficient data routing through well-structured interconnection networks. Deep-submicron very large-scale integration (VLSI) technologies can integrate tens of millions of transistors in a single silicon chip for complex signal processing and information manipulation. The book is suitable for those interested in efficient neurocomputing as well as those curious about neural network system applications. 
It has been especially prepared for use as a text for advanced undergraduate and first year graduate students, and is an excellent reference book for researchers and scientists working in the fields covered.

Event Based Neuromorphic Systems

... Alvarez R, Cassidy A, Chandra S, Esser S, Imam N, Risk W, Rubin D, Manohar R, and Modha D. 2012. Building block of a programmable neuromorphic substrate: a digital neurosynaptic core. Proc. IEEE Int. Joint Conf. Neural Netw ...

Event Based Neuromorphic Systems

Neuromorphic electronic engineering takes its inspiration from the functioning of nervous systems to build more power efficient electronic sensors and processors. Event-based neuromorphic systems are inspired by the brain's efficient data-driven communication design, which is key to its quick responses and remarkable capabilities. This cross-disciplinary text establishes how circuit building blocks are combined in architectures to construct complete systems. These include vision and auditory sensors as well as neuronal processing and learning circuits that implement models of nervous systems. Techniques for building multi-chip scalable systems are considered throughout the book, including methods for dealing with transistor mismatch, extensive discussions of communication and interfacing, and making systems that operate in the real world. The book also provides historical context that helps relate the architectures and circuits to each other and that guides readers to the extensive literature. Chapters are written by founding experts and have been extensively edited for overall coherence. This pioneering text is an indispensable resource for practicing neuromorphic electronic engineers, advanced electrical engineering and computer science students and researchers interested in neuromorphic systems. Key features: Summarises the latest design approaches, applications, and future challenges in the field of neuromorphic engineering. Presents examples of practical applications of neuromorphic design principles. Covers address-event communication, retinas, cochleas, locomotion, learning theory, neurons, synapses, floating gate circuits, hardware and software infrastructure, algorithms, and future challenges.

Analog VLSI Neural Networks

If the function of a network is known in advance and there is no need to change the synaptic weight during operation, resistive weights give a reasonable solution. An A/D converter based on a neural network approach is a good example of ...

Analog VLSI Neural Networks

This book brings together in one place important contributions and state-of-the-art research in the rapidly advancing area of analog VLSI neural networks. The book serves as an excellent reference, providing insights into some of the most important issues in analog VLSI neural networks research efforts.

Neural Computation in Embodied Closed Loop Systems for the Generation of Complex Behavior: From Biology to Technology

Neuromorphic electronic circuits can implement dynamics of neurons and synapses using digital (Furber et al., ... neural networks, which face the same problem of using an unreliable computing substrate that consists of noisy neurons and ...

Neural Computation in Embodied Closed Loop Systems for the Generation of Complex Behavior: From Biology to Technology

How can neural and morphological computations be effectively combined and realized in embodied closed-loop systems (e.g., robots) such that they can become more like living creatures in their level of performance? Understanding this will lead to new technologies and a variety of applications. To tackle this research question, we bring together experts from different fields (including Biology, Computational Neuroscience, Robotics, and Artificial Intelligence) to share their recent findings and ideas and to update our research community. This eBook collects 17 cutting-edge research articles, covering neural and morphological computations as well as the transfer of results to real-world applications, such as prosthesis and orthosis control and neuromorphic hardware implementation.

Neural Models of Plasticity

Hebb's proposal for the neural substrate of learning has some elements that make it implementational, inasmuch as he specified the conditions under which synapses are to be modified. However, he did not specify exactly which synapses, ...

Neural Models of Plasticity

Neural Models of Plasticity: Experimental and Theoretical Approaches is an outgrowth of a conference held at Woods Hole, Massachusetts, in the spring of 1987. The purpose of that conference was to review recent developments in both areas and to foster communication between researchers pursuing theoretical approaches and those pursuing more empirical ones. Contributions have been solicited from individuals who represent both ends of the spectrum of approaches, as well as from those using a combination of the two. These indicate that our knowledge of the plastic capabilities of the nervous system is growing rapidly, owing to advances in the understanding of basic subcellular and molecular mechanisms of plasticity, and to the computational capabilities and plastic properties that emerge from neural networks and assemblies. The book contains 19 chapters and opens with a study on the role of a neuromodulator in associative learning of the marine mollusk Hermissenda. Subsequent chapters examine topics such as learning and memory in Aplysia; the Hebb rule for synaptic plasticity; olfactory processing and associative memory in the mollusk Limax maximus; simulation of a classically conditioned response; and the neural substrates of memory, focusing on the role of the hippocampus.

Neuromorphic Engineering Systems and Applications

However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetic, which do not directly map onto a dynamical neural substrate.

Neuromorphic Engineering Systems and Applications

Neuromorphic engineering has just reached its 25th year as a discipline. In the first two decades neuromorphic engineers focused on building models of sensors, such as silicon cochleas and retinas, and building blocks such as silicon neurons and synapses. These designs have honed our skills in implementing sensors and neural networks in VLSI using analog and mixed mode circuits. Over the last decade the address event representation has been used to interface devices and computers from different designers and even different groups. This facility has been essential for our ability to combine sensors, neural networks, and actuators into neuromorphic systems. More recently, several big projects have emerged to build very large scale neuromorphic systems. The Telluride Neuromorphic Engineering Workshop (since 1994) and the CapoCaccia Cognitive Neuromorphic Engineering Workshop (since 2009) have been instrumental not only in creating a strongly connected research community, but also in introducing different groups to each other’s hardware. Many neuromorphic systems are first created at one of these workshops. With this special research topic, we showcase the state-of-the-art in neuromorphic systems.
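The excerpt from this volume contrasts Contrastive Divergence (CD) training of restricted Boltzmann machines with dynamical neural substrates. For reference, a minimal CD-1 weight update looks roughly like the sketch below; the layer sizes, learning rate, and single Gibbs step are illustrative assumptions, and real implementations add biases and mini-batches.

```python
import numpy as np

# Minimal Contrastive Divergence (CD-1) update for a tiny RBM, showing
# the discrete, exact-arithmetic updates the excerpt refers to.
# Sizes and learning rate are illustrative; biases are omitted.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, v0, lr=0.1):
    """One CD-1 weight update from a visible data vector v0."""
    h0 = sigmoid(v0 @ W)                                 # hidden probabilities
    h_sample = (rng.random(h0.shape) < h0).astype(float) # stochastic hidden state
    v1 = sigmoid(h_sample @ W.T)                         # one-step reconstruction
    h1 = sigmoid(v1 @ W)
    # Positive phase (data) minus negative phase (reconstruction).
    return W + lr * (np.outer(v0, h0) - np.outer(v1, h1))

W = rng.normal(0, 0.1, size=(4, 3))   # 4 visible units, 3 hidden units
v = np.array([1.0, 0.0, 1.0, 0.0])
W = cd1_step(W, v)
print(W.shape)  # → (4, 3)
```

Because each update requires sampling, matrix products and an exact outer-product difference, mapping this rule onto noisy, continuously evolving analog hardware is nontrivial, which is the issue the excerpt raises.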

Artificial Neural Networks in Pattern Recognition

... serious issue in large digital designs, one can envisage trainable systems working as well on imperfect, or even partly damaged, substrates. A mixed-signal analog/digital Very-Large-Scale Integration neural network architecture has ...

Artificial Neural Networks in Pattern Recognition

This book constitutes the refereed proceedings of the Second IAPR Workshop on Artificial Neural Networks in Pattern Recognition, ANNPR 2006, held in Ulm, Germany in August/September 2006. The 26 revised papers presented were carefully reviewed and selected from 49 submissions. The papers are organized in topical sections on unsupervised learning, semi-supervised learning, supervised learning, support vector learning, multiple classifier systems, visual object recognition, and data mining in bioinformatics.