From the propagation of neural activity through synapses, to the integration of signals in the dendritic arbor, to the processes determining action potential generation, virtually all aspects of neural processing are plastic. This plasticity underlies the remarkable versatility and robustness of cortical circuits: it enables the brain to learn regularities in its sensory inputs, to remember the past, and to recover function after injury. While much of the research into learning and memory has focused on forms of Hebbian plasticity at excitatory synapses (LTD/LTP, STDP), several other plasticity mechanisms have been characterized experimentally, including the plasticity of inhibitory circuits (Kullmann, 2012), synaptic scaling (Turrigiano, 2011) and intrinsic plasticity (Zhang and Linden, 2003). However, our current understanding of the computational roles of these plasticity mechanisms remains rudimentary at best. They have traditionally been assumed to serve a homeostatic purpose, counterbalancing the destabilizing effects of Hebbian learning, but recent work suggests that they can have a profound impact on circuit function (Savin, 2010; Vogels, 2011; Keck, 2012). Hence, theoretical investigation into the functional implications of these mechanisms may shed new light on the computational principles at work in neural circuits. This Research Topic of Frontiers in Computational Neuroscience aims to bring together recent advances in theoretical modeling of different plasticity mechanisms and of their contributions to circuit function. Topics of interest include the computational roles of inhibitory plasticity, metaplasticity, synaptic scaling, intrinsic plasticity, and plasticity within the dendritic arbor, and in particular the interplay between homeostatic and Hebbian plasticity and their joint contribution to network function.
Computational neuroscience. --- Neuroplasticity. --- Intrinsic Plasticity --- structural plasticity --- heterosynaptic plasticity --- Homeostasis --- reward-modulated learning --- synaptic plasticity --- STDP --- inhibitory plasticity --- metaplasticity --- short-term plasticity
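To make the interplay described in the abstract above concrete, the following is a minimal sketch (not drawn from any particular article in the collection) of a rate-based neuron whose excitatory weights grow under a Hebbian rule and are kept in check by multiplicative synaptic scaling toward a target firing rate; the learning rates, target rate, threshold-linear transfer function, and input statistics are all illustrative assumptions.

# Minimal sketch: Hebbian growth counterbalanced by multiplicative synaptic scaling.
# All parameter values below are illustrative assumptions, not fits to data.
import numpy as np

rng = np.random.default_rng(0)
n_inputs = 50
w = rng.uniform(0.1, 0.5, n_inputs)   # excitatory synaptic weights
eta_hebb = 1e-4                        # Hebbian learning rate
eta_scale = 1e-2                       # homeostatic (scaling) rate
r_target = 5.0                         # assumed target postsynaptic rate (a.u.)

for step in range(10_000):
    x = rng.poisson(2.0, n_inputs)     # presynaptic rates for this interval (assumed Poisson)
    y = max(w @ x - 40.0, 0.0)         # postsynaptic rate, threshold-linear with assumed bias 40

    w += eta_hebb * y * x              # Hebbian term alone is unstable: weights only grow
    w *= 1.0 + eta_scale * (r_target - y) / r_target   # multiplicative synaptic scaling
    w = np.clip(w, 0.0, None)          # keep weights excitatory

print("mean weight:", w.mean(), "final rate:", float(y))

Run on its own, the Hebbian term only ever increases the weights; the slower multiplicative scaling step shrinks or grows all weights together whenever the output rate drifts from the target, which is the stabilizing role traditionally ascribed to homeostatic mechanisms in this literature.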
Artificial Intelligence (AI) has found many applications in the past decade thanks to ever-increasing computing power. Artificial Neural Networks are inspired by the structure of the brain and consist of artificial neurons interconnected through artificial synapses. Training these systems requires huge amounts of data; once trained, a network can recognize unforeseen data and provide useful information. So-called Spiking Neural Networks behave more like the brain itself and are very energy efficient. To date, both spiking and conventional neural networks have been implemented in software running on conventional computing units. This approach, however, requires high computing power and a large physical footprint, and is energy inefficient. There is therefore increasing interest in developing AI tools implemented directly in hardware. The first hardware demonstrations have been based on CMOS circuits for the neurons and dedicated communication protocols for the synapses. To further increase training speed and energy efficiency while decreasing system size, the combination of CMOS neurons with memristor synapses is being explored. The memristor is a resistor with memory that behaves similarly to a biological synapse. This book explores the state of the art of neuromorphic circuits implementing neural networks with memristors for AI applications.
graphene oxide --- artificial neural network --- simulation --- neural networks --- STDP --- neuromorphics --- spiking neural network --- artificial intelligence --- hierarchical temporal memory --- synaptic weight --- optimization --- transistor-like devices --- multiscale modeling --- memristor crossbar --- spike-timing-dependent plasticity --- memristor-CMOS hybrid circuit --- pavlov --- wire resistance --- AI --- neocortex --- synapse --- character recognition --- resistive switching --- electronic synapses --- defect-tolerant spatial pooling --- emulator --- compact model --- deep learning networks --- artificial synapse --- circuit design --- memristors --- neuromorphic engineering --- memristive devices --- OxRAM --- neural network hardware --- sensory and hippocampal responses --- neuromorphic hardware --- boost-factor adjustment --- RRAM --- variability --- Flash memories --- neuromorphic --- reinforcement learning --- laser --- memristor --- hardware-based deep learning ICs --- temporal pooling --- self-organization maps --- crossbar array --- pattern recognition --- strongly correlated oxides --- vertical RRAM --- autocovariance --- neuromorphic computing --- synaptic device --- cortical neurons --- time series modeling --- spiking neural networks --- neuromorphic systems --- synaptic plasticity
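As a point of reference for the spike-timing-dependent plasticity (STDP) invoked throughout these chapters, here is a minimal sketch of the standard pair-based STDP rule that is often mapped onto memristive conductances; the amplitudes, time constants, example spike times, and the clipping of the weight to [0, 1] are illustrative assumptions, not the behavior of any specific device described in the book.

# Minimal sketch: pair-based STDP weight update with exponential timing windows.
# A_plus, A_minus, tau_plus, tau_minus and the spike times are illustrative assumptions.
import numpy as np

A_plus, A_minus = 0.01, 0.012          # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0       # timing windows (ms)

def stdp_dw(t_pre, t_post):
    """Weight change for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt >= 0:                        # pre before post -> potentiation (LTP)
        return A_plus * np.exp(-dt / tau_plus)
    else:                              # post before pre -> depression (LTD)
        return -A_minus * np.exp(dt / tau_minus)

w = 0.5                                # normalized conductance of one synapse
for t_pre, t_post in [(10, 15), (40, 38), (70, 90)]:
    w = float(np.clip(w + stdp_dw(t_pre, t_post), 0.0, 1.0))
    print(f"dt = {t_post - t_pre:+d} ms -> w = {w:.4f}")

In a memristive implementation, the weight variable would correspond to the normalized device conductance, and the exponential timing windows would typically be approximated by the overlap of pre- and postsynaptic voltage pulses applied across the device.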
What is the future of CMOS? Sustaining increasing transistor densities along the path of Moore's Law has become increasingly challenging given limited power budgets, interconnect bandwidths, and fabrication capabilities. In the last decade alone, transistor design has been thoroughly made over: the planar transistors of ten years ago have given way to today's FinFETs, which hardly resemble their bulky ancestors. FinFETs could potentially take us to the 5-nm node, but what comes after that? From gate-all-around devices to single-electron transistors and two-dimensional semiconductors, a torrent of research is under way to design the next transistor generation, engineer the optimal materials, improve the fabrication technology, and properly model future devices. We invite investigators and scientists in the field to showcase their work in this Special Issue through research papers, short communications, and review articles that focus on trends in micro- and nanotechnology, from fundamental research to applications.
MOSFET --- n/a --- total ionizing dose (TID) --- low power consumption --- process simulation --- two-dimensional material --- negative-capacitance --- power consumption --- technology computer aided design (TCAD) --- thin-film transistors (TFTs) --- band-to-band tunneling (BTBT) --- nanowires --- inversion channel --- metal oxide semiconductor field effect transistor (MOSFET) --- spike-timing-dependent plasticity (STDP) --- field effect transistor --- segregation --- systematic variations --- Sentaurus TCAD --- indium selenide --- nanosheets --- technology computer-aided design (TCAD) --- high-κ dielectric --- subthreshold bias range --- statistical variations --- fin field effect transistor (FinFET) --- compact models --- non-equilibrium Green's function --- etching simulation --- highly miniaturized transistor structure --- compact model --- silicon nanowire --- surface potential --- Silicon-Germanium source/drain (SiGe S/D) --- nanowire --- plasma-aided molecular beam epitaxy (MBE) --- phonon scattering --- mobility --- silicon-on-insulator --- drain engineered --- device simulation --- variability --- semi-floating gate --- synaptic transistor --- neuromorphic system --- theoretical model --- CMOS --- ferroelectrics --- tunnel field-effect transistor (TFET) --- SiGe --- metal gate granularity --- buried channel --- ON-state --- bulk NMOS devices --- ambipolar --- piezoelectrics --- tunnel field effect transistor (TFET) --- FinFETs --- polarization --- field-effect transistor --- line edge roughness --- random discrete dopants --- radiation hardened by design (RHBD) --- low energy --- flux calculation --- doping incorporation --- low voltage --- topography simulation --- MOS devices --- low-frequency noise --- high-k --- layout --- level set --- process variations --- subthreshold --- metal gate stack --- electrostatic discharge (ESD) --- non-equilibrium Green's function
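For orientation on what "modeling future devices" starts from, the sketch below evaluates the textbook long-channel square-law model of MOSFET drain current; production compact models for FinFET and gate-all-around nodes (for example, surface-potential-based models) are far more elaborate, and the threshold voltage, process transconductance, W/L ratio, and channel-length-modulation parameter used here are illustrative assumptions rather than any specific technology.

# Minimal sketch: ideal long-channel NMOS square-law I-V model.
# Parameter defaults are illustrative assumptions, not a real process node.
def drain_current(v_gs, v_ds, v_th=0.4, k_prime=200e-6, w_over_l=10.0, lam=0.05):
    """Drain current (A) of an ideal long-channel NMOS transistor."""
    v_ov = v_gs - v_th                      # overdrive voltage
    if v_ov <= 0:
        return 0.0                          # cutoff (subthreshold conduction ignored)
    if v_ds < v_ov:                         # triode region
        return k_prime * w_over_l * (v_ov * v_ds - 0.5 * v_ds**2)
    # saturation, with simple channel-length modulation
    return 0.5 * k_prime * w_over_l * v_ov**2 * (1 + lam * (v_ds - v_ov))

for v_gs in (0.3, 0.6, 0.9):
    print(f"Vgs = {v_gs:.1f} V -> Id = {drain_current(v_gs, 0.9) * 1e6:.1f} uA")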