
EP4566001A1 - Neuromorphic circuit physically realizing a neural network and associated production and inference method - Google Patents

Neuromorphic circuit physically realizing a neural network and associated production and inference method

Info

Publication number
EP4566001A1
EP4566001A1 (application EP23749094.1A)
Authority
EP
European Patent Office
Prior art keywords
excitation
component
modes
neural network
configuration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP23749094.1A
Other languages
English (en)
French (fr)
Inventor
Paolo Bortolotti
Abdelmadjid Anane
Vincent Cros
Joo-Von Kim
Grégoire DE LOUBENS
Alfredo De Rossi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Centre National de la Recherche Scientifique CNRS
Thales SA
Commissariat à l'Énergie Atomique et aux Énergies Alternatives CEA
Universite Paris Saclay
Original Assignee
Centre National de la Recherche Scientifique CNRS
Commissariat à l'Énergie Atomique CEA
Thales SA
Commissariat à l'Énergie Atomique et aux Énergies Alternatives CEA
Universite Paris Saclay
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Centre National de la Recherche Scientifique CNRS, Commissariat à l'Énergie Atomique CEA, Thales SA, Commissariat à l'Énergie Atomique et aux Énergies Alternatives CEA, Université Paris Saclay filed Critical Centre National de la Recherche Scientifique CNRS
Publication of EP4566001A1 publication Critical patent/EP4566001A1/de
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/06Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/067Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using optical means
    • G06N3/0675Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using optical means using electro-optical, acousto-optical or opto-electronic means
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F3/00Optical logic elements; Optical bistable devices
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/049Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/06Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
    • G06N3/065Analogue means
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent

Definitions

  • Neuromorphic circuit physically realizing a neural network and associated production and inference method
  • the present invention relates to a neuromorphic circuit physically realizing a neural network. It also relates to an associated production method and an associated inference method.
  • a CPU is a central processor, the acronym CPU coming from the English term “Central Processing Unit”, while a GPU is a graphics processor, the acronym GPU coming from the English term “Graphics Processing Unit”.
  • a neural network is generally composed of a succession of layers of neurons, each of which takes its inputs from the outputs of the previous layer. More precisely, each layer includes neurons taking their inputs from the outputs of the neurons of the previous layer. Each layer is connected by a plurality of synapses. A synaptic weight is associated with each synapse. It is a real number, which takes positive and negative values. For each layer, the input of a neuron is the weighted sum of the outputs of the neurons of the previous layer, the weighting being done by the synaptic weights.
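The layer computation just described (weighted sum of the previous layer's outputs followed by a non-linearity) can be sketched numerically; the layer sizes, weights, and tanh activation below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def dense_layer(inputs, weights, biases):
    """One layer: each neuron's input is the weighted sum of the
    previous layer's outputs, passed through a non-linear activation."""
    z = weights @ inputs + biases   # weighted sum; weights may be negative
    return np.tanh(z)               # non-linear, differentiable activation

# Hypothetical layer: 2 neurons reading 3 upstream outputs.
x = np.array([0.5, -1.0, 2.0])            # outputs of the previous layer
W = np.array([[0.2, -0.4, 0.1],           # synaptic weights (real-valued,
              [-0.3, 0.5, 0.6]])          # positive or negative)
b = np.array([0.0, 0.1])
y = dense_layer(x, W, b)
print(y)
```

Stacking such layers, each reading the outputs of the previous one, gives the ordered succession of layers described above.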
  • Von Neumann bottleneck
  • a Von Neumann bottleneck problem appears because implementing a deep neural network (more than three layers and up to several dozen) involves using both the memory or memories and the processor, while these elements are spatially separated. This results in congestion of the communication bus between the memory or memories and the processor, both while the trained neural network is used to carry out a task and, even more so, while the neural network is trained, that is to say while its synaptic weights are adjusted to solve the task in question with maximum performance.
  • CMOS complementary metal oxide semiconductor
  • a neural network based on optical type technologies is also known.
  • CMOS neural networks are also known, in which the synapses are synapses using memristors.
  • memristor or memristance
  • the name is a portmanteau formed from the two English words memory and resistor.
  • a memristor is a non-volatile memory component, the value of its electrical resistance changing with the application of a voltage for a certain duration and remaining at that value in the absence of voltage.
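The memristor behaviour described above (resistance drifting while a voltage is applied, and holding its value without one) can be sketched with a toy state model; the drift constant and resistance bounds are arbitrary illustrative choices, not a device equation:

```python
def memristor_step(R, V, dt, k=1e4, R_min=100.0, R_max=10_000.0):
    """Toy non-volatile memristor: resistance drifts in proportion to
    the applied voltage and its duration, and is clamped to bounds."""
    R = R + k * V * dt                 # drift proportional to V * dt
    return min(max(R, R_min), R_max)   # clamp to physical limits

R = 5000.0
for _ in range(10):                    # apply +1 V for 10 ms total
    R = memristor_step(R, 1.0, 1e-3)
R_written = R                          # state written by the voltage
for _ in range(10):                    # no voltage: state is retained
    R = memristor_step(R, 0.0, 1e-3)
assert R == R_written                  # non-volatility: value persists
print(R)
```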
  • each neuron occupies several tens of micrometers on a side.
  • each synapse also occupies several tens of micrometers on a side. The result is that, on a limited surface corresponding for example to an electronic chip, the number of neurons and synapses that can be integrated is limited, which results in a reduction in the performance of the neural network.
  • the description describes a neuromorphic circuit physically producing a neural network, the neural network comprising a set of neurons connected by synapses, the neuromorphic circuit comprising:
  • each component able to be excited according to a plurality of specific excitation modes with a respective population, the component thus being adapted to present several excitation configurations, each excitation configuration being defined by the specific excitation modes excited and their respective population, each pair of excitation modes being coupled by a respective coupling, each specific excitation mode being a neuron of the neural network and each coupling being a synapse of the neural network,
  • the configuration unit being able to excite the component to obtain an excitation configuration chosen as a function of a desired neural network architecture, the desired architecture comprising input neurons and output neurons, and
  • the component interrogation unit being able to selectively modify the populations of first specific excitation modes and to measure the populations of second specific excitation modes, the first specific excitation modes being the excitation modes of the input neurons and the second specific excitation modes being the excitation modes of the output neurons.
  • the neuromorphic circuit has one or more of the following characteristics, taken in isolation or in all technically possible combinations:
  • the component is capable of being excited in a regime in which the amplitude of a coupling between each pair of specific excitation modes in each excitation configuration depends on the respective populations of each of the two specific excitation modes;
  • the at least one component has two regimes, a first regime in which the amplitude of the coupling between each pair of specific excitation modes is independent of the population of each of the two specific excitation modes and a second regime in which the amplitude of the coupling between each pair of specific excitation modes depends on the population of each of the two specific excitation modes, the configuration unit being suitable for exciting the component in the second regime;
  • the configuration unit is capable of exciting the specific excitation modes according to an initial configuration different from the chosen excitation configuration, the component relaxing towards the chosen excitation configuration;
  • the at least one component is a layer of ferromagnetic element
  • the ferromagnetic element is a yttrium iron garnet
  • the at least one component is chosen from the list consisting of: a magnetic microstructure, a cavity, a metamaterial, and a superconducting resonator;
  • the configuration unit is chosen from an optical excitation unit, and an electrical excitation unit.
  • the interrogation unit is chosen from: an optical excitation unit, and an electrical excitation unit.
  • the description also describes a method for physically producing a neural network having a desired architecture, the neural network comprising a set of neurons connected by synapses, the physical production method comprising:
  • each excitation configuration being defined by the excited specific excitation modes and their respective population, each pair of excitation modes being coupled by a respective coupling, each specific excitation mode being a neuron of the neural network and each coupling being a synapse of the neural network, the configuration step being implemented by a configuration unit of the at least one component, the configuration unit forming part of the neuromorphic circuit, and comprising the excitation of the at least one component to obtain an excitation configuration chosen as a function of the desired architecture.
  • the description also relates to a method for inferring a neural network having a desired architecture, the neural network comprising a set of neurons connected by synapses, at least one component of a neuromorphic circuit physically realizing the neural network having been configured, the at least one component being capable of being excited according to a plurality of specific excitation modes with a respective population, the component thus being capable of presenting several excitation configurations, each excitation configuration being defined by the excited specific excitation modes and their respective population, each pair of excitation modes being coupled by a respective coupling, each specific excitation mode being a neuron of the neural network and each coupling being a synapse of the neural network, the configuration having been implemented by a configuration unit of the at least one component, the configuration unit forming part of the neuromorphic circuit, the configuration comprising the excitation of the at least one component to obtain an excitation configuration chosen as a function of the desired architecture, the inference method comprising:
  • the measuring step being carried out after reaching a stable state for the at least one component, the measurement step being implemented by the interrogation unit.
  • FIG. 1 is a schematic representation of a neuromorphic circuit
  • FIG. 2 is a schematic representation of an example of a neural network
  • FIG. 3 is a flowchart of an example of implementation of an inference method using the neuromorphic circuit of Figure 1, and
  • FIG. 4 is an example of an experimental implementation of the neuromorphic circuit of Figure 1.
  • a neuromorphic circuit 10 is described.
  • the neuromorphic circuit 10 is a circuit physically producing a network of neurons 12.
  • the neuromorphic circuit 10 is a physical circuit adapted to carry out the operations of a neural network 12.
  • the neuromorphic circuit 10 is suitable for implementing a neural network 12 as shown schematically in Figure 2.
  • the neural network 12 described is a network comprising an ordered succession of layers 14 of neurons 16, each of which takes its inputs from the outputs of the previous layer 14.
  • a neuron or nerve cell
  • Neurons ensure the transmission of a bioelectric signal called nerve impulses.
  • Neurons have two physiological properties: excitability, that is to say the capacity to respond to stimulations and to convert them into nerve impulses, and conductivity, that is to say the capacity to transmit impulses.
  • excitability, that is to say the capacity to respond to stimulations and to convert them into nerve impulses
  • conductivity, that is to say the capacity to transmit impulses
  • activation a mathematical function, called activation, which has the property of being non-linear (to be able to transform the input in a useful way) and preferably of being differentiable (to allow learning by backpropagation of the gradient).
  • the activation function in this case can be represented as a function giving the variation of the average pulse emission frequency with the input current.
  • each layer 14 comprises neurons 16 taking their inputs from the outputs of the neurons 16 of the previous layer 14.
  • the neural network 12 described is a network comprising a single hidden layer of neurons 18.
  • this number of hidden layers of neurons is not limiting.
  • the uniqueness of the hidden layer of neurons 18 means that the neural network 12 includes an input layer 20 followed by the hidden layer of neurons 18, itself followed by an output layer 22.
  • the layers are indexable by an integer index i, the first layer corresponding to input layer 20 and the last to output layer 22.
  • Each layer 14 is connected by a plurality of synapses 24.
  • the synapse designates a functional contact zone which is established between two neurons 16. Depending on its behavior, the biological synapse can excite or inhibit the downstream neuron in response to the upstream neuron.
  • a positive synaptic weight corresponds to an excitatory synapse while a negative synaptic weight corresponds to an inhibitory synapse.
  • Biological neural networks learn by modifying synaptic transmissions throughout the network.
  • formal neural networks can be trained to perform tasks by modifying synaptic weights according to a learning rule.
  • a synapse 24 is a component performing a function equivalent to a synaptic weight of modifiable value.
  • a synaptic weight is therefore associated with each synapse 24. For example, it is a real number, which takes positive as well as negative values.
  • the input of a neuron 16 is the weighted sum of the outputs of the neurons 16 of the previous layer 14, the weighting being done by the synaptic weights.
  • each layer 14 of neurons 16 is fully connected.
  • a fully connected neuron layer is one in which the neurons in the layer are each connected to all the neurons in the previous layer. Such a layer is more often referred to by the English term “fully connected”.
  • the neural network 12 is a spiking neural network.
  • a spiking neural network is often called by the acronym SNN which refers to the English name “Spiking Neural Network”.
  • a neuron is a dynamic element varying in time as described above and characterized here by its pulse emission frequency.
  • a neuron 16 called pre-synaptic, upstream, emits an impulse
  • the synapse 24 weights this impulse and transmits it to the neuron 16, called postsynaptic, downstream, which possibly in turn emits an impulse.
  • the stimulation transmitted by synapse 24 is a stimulation of a part of the downstream neuron 16, called membrane and presenting a potential. If this membrane potential charges beyond a so-called activation threshold, neuron 16 emits an impulse.
  • synapse 24 performs a multiplication between the synaptic weight and the input activation.
  • the input activation of downstream neuron 16 is the output signal sent by upstream neuron 16.
  • the downstream neuron 16 increases its membrane potential, compares it to a threshold and emits an output pulse when the membrane potential exceeds this threshold.
  • an upstream neuron 16 is permanently activated (like an input neuron) in order to add biases to the membrane potential of the downstream neuron 16 which enrich the expressivity of the function learned by the neural network 12.
  • a neuron 16 is a “bias neuron”.
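The spiking behaviour described above (a membrane potential integrating weighted inputs, a threshold comparison, a spike and reset on crossing) can be sketched as a leaky integrate-and-fire step; the leak factor, threshold, and synaptic weights below are illustrative assumptions:

```python
def lif_step(v, i_in, threshold=1.0, leak=0.9):
    """One step of a leaky integrate-and-fire neuron: integrate the
    weighted input into the membrane potential, compare with the
    activation threshold, and emit a spike (with reset) if crossed."""
    v = leak * v + i_in
    if v >= threshold:
        return 0.0, 1          # reset membrane potential, emit a spike
    return v, 0                # sub-threshold: no spike

v, spikes = 0.0, []
weights = [0.4, -0.1, 0.5, 0.6]   # hypothetical synaptic weights, one
for w in weights:                  # upstream impulse arriving per step
    v, s = lif_step(v, w)
    spikes.append(s)
print(spikes)
```

A permanently active upstream "bias neuron" would simply contribute a constant term to `i_in` at every step.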
  • the neurons 16 are connected by a synapse 24 which is bidirectional.
  • the neuromorphic circuit 10 comprises a component 26, a configuration unit 28 of the component 26 and an interrogation unit 30 of the component 26.
  • component 26 is unique but configurations could be considered where component 26 is not unique (see the “other specific examples” paragraph).
  • Component 26 can be excited according to a plurality of specific excitation modes with a respective population.
  • in the example described here, component 26 has 7 specific excitation modes.
  • Excitation modes or excited states are quantized. This means that the excitation modes can be referenced by an integer i, the excitation modes usually being ordered by increasing energy.
  • the notion of excitation modes also makes it possible to define a reciprocal space in which each excitation mode is characterized by a respective eigenvector. Therefore, in the following, each excitation mode is denoted kᵢ, with reference to its eigenvector.
  • the population of an excitation mode kᵢ can be represented by a quantity nᵢ playing the role of a chemical potential.
  • an excitation mode kᵢ with its respective population can thus be denoted kᵢ(nᵢ).
  • component 26 has two regimes, namely a linear regime (first regime) and a non-linear regime (second regime).
  • in the first, linear regime, the amplitude of the coupling between each pair of specific excitation modes k₁ to k₇ is independent of the population of each of the two modes.
  • the population of the excitation modes k₁ to k₇ in the linear regime does not affect the energy of those modes. It follows that the excitation modes k₁ to k₇ can be considered as quasi-orthogonal in the reciprocal space.
  • in the second, non-linear regime, the amplitude of the coupling between each pair of specific excitation modes k₁ to k₇ depends on the population of each of the two modes.
  • the amplitude of the coupling can also be affected by the population of another excitation mode among k₁ to k₇.
  • the component 26 described thus presents several excitation configurations, each excitation configuration being defined by the specific excitation modes k₁ to k₇ excited and their respective population, and corresponding to operation in the non-linear regime.
  • an excitation configuration corresponds to a set of values kᵢ(nᵢ) for all values of i.
  • each excitation mode k₁ to k₇ corresponds to a neuron, and each coupling between two excitation modes corresponds to a synapse between the two neurons associated with the two modes considered.
  • the neurons of the neural network 12 are produced by the specific excitation modes k₁ to k₇ of component 26, while the synapses are produced by the non-linear coupling matrix A of these excitation modes.
  • Component 26 is thus a programmable component capable of creating the architecture of numerous neural networks.
  • architecture is meant the structure of the neural network 12, that is to say in particular, the position of each layer of neurons, the number of layers, the number of neurons in each layer and the links (synapses) between each neuron.
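The central idea (eigenmodes as neurons, population-dependent couplings as synapses) can be sketched abstractly; the coupling form, update rule, and constants below are hypothetical illustrations of the second regime, not the patent's physics:

```python
import numpy as np

def coupling(n, A0, g=0.1):
    """Second-regime coupling: the amplitude between modes i and j
    depends on the populations of both modes (hypothetical form)."""
    return A0 * (1.0 + g * np.outer(n, n))

def relax_step(n, A0, dt=0.01):
    """Populations evolve through the couplings; the coupling matrix
    plays the role of the synaptic weighting between mode-neurons."""
    A = coupling(n, A0)
    return np.clip(n + dt * (A @ n - n), 0.0, 1.0)

rng = np.random.default_rng(1)
A0 = rng.uniform(-0.5, 0.5, size=(7, 7))  # 7 modes, as in the example
A0 = (A0 + A0.T) / 2                      # couplings are bidirectional
n = rng.uniform(0.0, 1.0, size=7)         # initial mode populations n_i
for _ in range(200):
    n = relax_step(n, A0)                 # relax toward quasi-equilibrium
print(n)
```

In this picture, reprogramming the network amounts to changing which modes are populated and how strongly, rather than rewiring anything in real space.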
  • the configuration unit 28 of the component 26 is capable of exciting the component 26 to obtain an excitation configuration chosen as a function of a desired neural network architecture.
  • the desired architecture is assumed to be known, knowing that it can in particular be obtained by carrying out training for a predetermined task or chosen by an expert as particularly appropriate for said task.
  • the desired architecture makes it possible to determine a matrix A which corresponds to values kᵢ(nᵢ) for all values of i, that is to say an excitation configuration.
  • the configuration unit 28 is capable of exciting the component 26 in the non-linear regime.
  • the configuration unit 28 of the component 26 is capable of exciting the specific excitation modes k₁ to k₇ according to an initial configuration different from the chosen excitation configuration.
  • Component 26 then naturally relaxes towards an equilibrium configuration, this equilibrium configuration being the chosen excitation configuration.
  • the configuration unit 28 establishes an initial configuration kᵢ(nᵢ,₀); component 26 can then relax until reaching a dynamic quasi-equilibrium corresponding to kᵢ(nᵢ), which is the desired configuration.
  • the initial configuration is chosen to be simpler to obtain, for example because it involves populating fewer of the excitation modes k₁ to k₇.
  • the interrogation unit 30 serves to interrogate the component 26, that is to say, to send input data to the neural network to obtain at least one output data.
  • the interrogation unit 30 comprises two subunits 32 and 34.
  • the first subunit 32 is capable of selectively modifying the populations of the first specific excitation modes, the first specific excitation modes being the excitation modes of the input neurons (those of the input layer 20).
  • first subunit 32 is a subunit for sending input data to component 26.
  • the second subunit 34 is suitable for measuring the populations of the second specific excitation modes, the second specific excitation modes being the excitation modes of the output neurons (those of the output layer 22).
  • the second subunit 34 is a subunit for obtaining at least one output data from component 26.
  • This method comprises three steps: a configuration step E40, a modification step E42 and a measurement step E44.
  • configuration unit 28 excites the component 26 to obtain a chosen excitation configuration.
  • the configuration unit 28 receives a control signal indicating the excitation signal to apply (for example, a radio frequency signal with multiple frequencies as proposed in the embodiment of the following paragraph).
  • This excitation signal has been previously calculated, for example by a simulation tool, to make it possible to physically create a desired neural network architecture.
  • the interrogation unit 30 selectively modifies populations of first specific excitation modes.
  • This selective modification corresponds to the fact that it is desired that the neural network performs an inference on input data.
  • the input data is converted into an excitation signal for example by the aforementioned simulation tool and a control unit sends a command indicating to the interrogation unit 30 that it must send said excitation signal.
  • Component 26 then evolves naturally until reaching a stable state after such excitation.
  • the interrogation unit 30 measures the populations of second specific excitation modes.
  • the interrogation unit 30 obtains the output data of the neural network, that is to say its prediction for the input data injected during the modification step E42.
  • This output data is obtained by converting the signal measured by the interrogation unit 30 by the inverse operation of that which was used to obtain the excitation signal for the input data.
  • An inference of the desired neural network is thus carried out in three steps.
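The three steps can be sketched as a pipeline; the dictionary standing in for the component and the thresholded-sum read-out are placeholders for the physical device, not the patent's implementation:

```python
def configure(component, weights):
    """Step E40: excite the component into the chosen configuration,
    here modelled as simply storing a weight matrix."""
    component["weights"] = weights

def write_input(component, x):
    """Step E42: selectively modify the input-mode populations."""
    component["input"] = x

def read_output(component):
    """Step E44: after the component reaches a stable state, measure
    the output-mode populations (modelled as a thresholded sum)."""
    W, x = component["weights"], component["input"]
    return [max(0.0, sum(w * xi for w, xi in zip(row, x))) for row in W]

component = {}
configure(component, [[0.5, -1.0], [1.0, 1.0]])   # hypothetical weights
write_input(component, [2.0, 1.0])                # input data
y = read_output(component)                        # network's prediction
print(y)
```

Note that only step E40 depends on the desired architecture; once configured, repeated inferences only cycle through steps E42 and E44.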
  • the present neuromorphic circuit 10 corresponds to a physical realization of a neural network which is qualitatively different from all the physical realizations known to date. Instead of designing and structuring individual nonlinear elements (neurons) and their interconnections (synapses) in real space, it is proposed to realize these elements in a reciprocal space of high dimensionality. To do this, it is sufficient to populate the specific excitation modes of component 26 in the presence of strong non-linearities to couple these excitation modes.
  • the connectivity of the neural network 12 can potentially be increased by several orders of magnitude, since it can use the full spectrum of eigenmodes in a given system. Values as large as 10⁶ can be obtained.
  • the neuromorphic circuit 10 is a neuromorphic circuit that is more economical in energy consumption.
  • a neuromorphic circuit 10 realizing highly interconnected neurons can be used for solving problems beyond neuromorphic computing.
  • it could be envisaged to use this neuromorphic circuit 10 to solve difficult non-deterministic polynomial-time problems. These problems are often referred to as “NP-hard”.
  • An example is to solve a problem that can be projected onto an Ising model. Such potential could make such a neuromorphic circuit 10 a serious competitor to quantum computer achievements.
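As an illustration of projecting a problem onto an Ising model, the sketch below brute-forces the ground state of a tiny hypothetical instance (the couplings J and fields h are made up):

```python
from itertools import product

def ising_energy(spins, J, h):
    """E = -sum_{i<j} J_ij s_i s_j - sum_i h_i s_i for spins s_i = +/-1."""
    n = len(spins)
    pair = sum(J[i][j] * spins[i] * spins[j]
               for i in range(n) for j in range(i + 1, n))
    field = sum(h[i] * spins[i] for i in range(n))
    return -pair - field

# Tiny hypothetical instance: brute-force the lowest-energy configuration.
J = [[0, 1, -1],
     [0, 0, 1],
     [0, 0, 0]]
h = [0.0, 0.0, 0.0]
best = min(product([-1, 1], repeat=3), key=lambda s: ising_energy(s, J, h))
print(best, ising_energy(best, J, h))
```

A physical system that relaxes to its lowest-energy excitation configuration would perform this minimisation natively instead of by enumeration.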
  • the neuromorphic circuit 10 is easily reconfigurable. The same neuromorphic circuit 10 can be used to perform different operations. To do this, it is sufficient to modify the signals of the configuration unit 28 and the interrogation unit 30 depending on the intended application.
  • component 26 is made of a ferromagnetic element.
  • the ferromagnetic element is yttrium iron garnet.
  • Yttrium iron garnet is more commonly referred to as YIG and denotes the compound with the chemical formula Y₃Fe₅O₁₂.
  • YIG here refers to the corresponding English name “Yttrium Iron Garnet”.
  • Component 26 is here a thin layer.
  • Such a layer is obtained by growth of a layer or a multilayer of which at least one of the layers is magnetic.
  • The size and geometry of component 26 are chosen to obtain an appropriate energy separation between modes. Typically, for a single layer of YIG forming a cylinder, the base of the cylinder measures a few micrometers across.
  • the shape of the base is circular (as visible in Figure 4) or rectangular.
  • the thickness of the layer forming component 26 is typically a few tens of nanometers.
  • component 26 is a magnetic microstructure.
  • Component 26 is possibly covered with an overlayer 50 as is the case in the example illustrated.
  • the overlayer 50 serves to dynamically adjust the effective magnetic damping of the component 26.
  • the damping is, by definition, the characteristic lifetime of the different excitation states.
  • the overlayer 50 is, for example, made of platinum.
  • the overlayer 50 is controlled by a current generator.
  • such a current generator can deliver a direct current or a pulsed current (current in time slots).
  • the excitation modes therefore correspond here to a discrete frequency spectrum.
  • these excitation modes can be used to realize the neurons of the neural network.
  • these couplings can be used to realize the synapses of the neural network benefiting from high connectivity. Furthermore, it is possible to dynamically modify the synaptic weights of each synapse by controlling the population of each of the spin wave modes by application of an external signal.
  • the configuration unit 28 is, for example, an antenna.
  • the configuration unit 28 is capable of revealing non-linearities at very low excitation amplitude, typically for a few microteslas (−30 dBm) of alternating magnetic field at 10 GHz.
  • the configuration unit 28 uses radio frequency signals in the time and frequency domains to selectively populate the specific excitation modes, the spectral density of these signals respecting the previous excitation conditions.
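A configuration signal of that kind (one radio-frequency tone per specific excitation mode to populate, with the amplitude encoding the target population) might be synthesised as below; the frequencies, amplitudes, sample rate, and duration are illustrative assumptions:

```python
import numpy as np

def multitone(freqs_ghz, amps, duration_ns=100.0, fs_ghz=100.0):
    """Sum of tones: one frequency per eigenmode to populate, its
    amplitude setting the target population (hypothetical encoding).
    GHz frequencies times ns times give dimensionless cycle counts."""
    t = np.arange(0.0, duration_ns, 1.0 / fs_ghz)   # time base in ns
    return sum(a * np.sin(2 * np.pi * f * t)
               for f, a in zip(freqs_ghz, amps))

# Three hypothetical mode frequencies around 10 GHz.
sig = multitone([9.8, 10.0, 10.2], [1.0, 0.5, 0.25])
print(sig.shape)
```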
  • the interrogation unit 30 is either the same antenna as the configuration unit 28 or, as shown in Figure 4, another antenna.
  • YIG is particularly suitable here because the thin films can be nanostructured down to 300 nanometers (nm) without any negative impact on their magnetic properties.
  • YIG exhibits very low spin wave damping, with this damping reaching 8 × 10⁻⁵ in the geometries described in this section.
  • the materials are metal alloys of the elements, such as NiFe, CoFeB or CoNi.
  • the materials are Heusler compounds like CoFeMnSi.
  • it is also possible to use doped YIG or to substitute the yttrium (Y) of YIG with another atom, for example thulium (Tm) or bismuth (Bi).
  • Tm thulium
  • Bi bismuth
  • the neuromorphic circuit 10 is here a spintronic device. As such, it has the advantage of being compatible with CMOS circuits.

OTHER SPECIFIC EXAMPLES
  • photonic crystals which are optical cavities produced by nanostructuring are multimode elements.
  • the non-linear regime and the couplings between modes can be obtained by modulation of the dielectric constant by generation of electron-hole pairs by optical pumping.
  • metamaterials, which are versatile systems where resonances can be defined geometrically, can also be used. By using materials with low dissipation, operation in the non-linear regime would be obtained.
  • a superconducting resonator is an example of such a metamaterial.
  • non-accessible modes can nevertheless be used to increase the number of degrees of freedom and thus considerably increase the depth of the neural network produced.
  • as a variant, the configuration unit 28 and the interrogation unit 30 could each be an optical excitation unit or an electrical excitation unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Neurology (AREA)
  • Optics & Photonics (AREA)
  • Nonlinear Science (AREA)
  • Image Analysis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
EP23749094.1A 2022-08-02 2023-08-02 Neuromorphe schaltung zur physikalischen erzeugung eines neuronalen netzes und zugehöriges produktions- und inferenzverfahren Pending EP4566001A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR2208016A FR3138719A1 (fr) 2022-08-02 2022-08-02 Circuit neuromorphique réalisant physiquement un réseau de neurones et procédé de réalisation et d'inférence associés
PCT/EP2023/071366 WO2024028371A1 (fr) 2022-08-02 2023-08-02 Circuit neuromorphique réalisant physiquement un réseau de neurones et procédé de réalisation et d'inférence associés

Publications (1)

Publication Number Publication Date
EP4566001A1 true EP4566001A1 (de) 2025-06-11

Family

ID=85569596

Family Applications (1)

Application Number Title Priority Date Filing Date
EP23749094.1A Pending EP4566001A1 (de) 2022-08-02 2023-08-02 Neuromorphe schaltung zur physikalischen erzeugung eines neuronalen netzes und zugehöriges produktions- und inferenzverfahren

Country Status (3)

Country Link
EP (1) EP4566001A1 (de)
FR (1) FR3138719A1 (de)
WO (1) WO2024028371A1 (de)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3084505B1 (fr) * 2018-07-26 2021-09-10 Thales Sa Reseau de neurones comportant des resonateurs spintroniques

Also Published As

Publication number Publication date
WO2024028371A1 (fr) 2024-02-08
FR3138719A1 (fr) 2024-02-09

Similar Documents

Publication Publication Date Title
Papp et al. Nanoscale neural network using non-linear spin-wave interference
Marković et al. Physics for neuromorphic computing
EP2965269B1 (de) Künstliches neuron und memristor
FR3050050A1 (fr) Neurone artificiel
US12124946B2 (en) Bias scheme for single-device synaptic element
EP0454535B1 (de) Neuronales Klassifikationssystem and -verfahren
Guo et al. Photonic reservoir computing system for pattern recognition based on an array of four distributed feedback lasers
Alam et al. Deepqmlp: A scalable quantum-classical hybrid deep neural network architecture for classification
FR3119696A1 (fr) Circuit neuromorphique et procede d'entraînement associé
Riou et al. Reservoir computing leveraging the transient non-linear dynamics of spin-torque nano-oscillators
EP4566001A1 (de) Neuromorphe schaltung zur physikalischen erzeugung eines neuronalen netzes und zugehöriges produktions- und inferenzverfahren
Bradley et al. Pattern recognition using spiking antiferromagnetic neurons
Wang et al. Reconfigurable neuromorphic crossbars based on titanium oxide memristors
Tsirigotis et al. Unconventional integrated photonic accelerators for high-throughput convolutional neural networks
Hirose et al. Keynote speech: Information processing hardware, physical reservoir computing and complex-valued neural networks
Bennett et al. Exploiting the short-term to long-term plasticity transition in memristive nanodevice learning architectures
Leroux Artificial neural networks with radio-frequency spintronic nano-devices
Selesnick Neural waves and short-term memory in a neural net model
Laydevant Supervised learning in binary dynamical physical systems through energy minimization
Rybka et al. Image and audio data classification using bagging ensembles of spiking neural networks with memristive plasticity
Pan et al. LIGO Core-Collapse Supernova Detection using Convolution Neural Networks
EP4195061A1 (de) Algorithmusrechner aus speicher mit gemischten technologien
Mwamsojo Neuromorphic photonic systems for information processing
Sarantoglou et al. Reconfigurable Silicon Photonics Extreme Learning Machine with Random Non-linearities as Neural Processor and Physical Unclonable Function
FR3144686A1 (fr) Composant neuromorphique utilisant des textures magnétiques dénombrables

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20250131

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)