
Showing 52 open source projects for "backpropagation"

  • 1
    Hivemind

    Decentralized deep learning in PyTorch. Built to train models

    ...Its intended usage is training one large model on hundreds of computers from different universities, companies, and volunteers. Distributed training without a master node: a Distributed Hash Table allows connecting computers in a decentralized network. Fault-tolerant backpropagation: forward and backward passes succeed even if some nodes are unresponsive or take too long to respond. Decentralized parameter averaging: iteratively aggregate updates from multiple workers without the need to synchronize across the entire network (a toy sketch of this idea follows the entry). Train neural networks of arbitrary size: parts of their layers are distributed across the participants with the Decentralized Mixture-of-Experts. ...
    Downloads: 2 This Week
    Last Update:
    See Project
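    The sketch below is not Hivemind's API; it only illustrates the decentralized parameter-averaging idea described above, with each worker repeatedly averaging its parameters against a few randomly chosen peers instead of synchronizing with the whole network.
    ```python
    # Toy gossip averaging, NOT Hivemind's API: each worker averages its
    # parameter vector with a few random peers per round, so all workers
    # drift toward consensus without a global synchronization step.
    import random
    import numpy as np

    def gossip_average(worker_params, rounds=5, peers_per_round=2):
        params = [p.copy() for p in worker_params]
        for _ in range(rounds):
            for i in range(len(params)):
                peers = random.sample(range(len(params)), k=peers_per_round)
                group = [params[i]] + [params[j] for j in peers]
                params[i] = np.mean(group, axis=0)  # average within the small group
        return params

    workers = [np.random.randn(4) for _ in range(8)]
    averaged = gossip_average(workers)
    print(np.std([p[0] for p in averaged]))  # spread across workers shrinks per round
    ```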
  • 2
    Autograd

    Efficiently computes derivatives of numpy code

    ...It can handle a large subset of Python's features, including loops, ifs, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily (a minimal example follows this entry). The main intended application of Autograd is gradient-based optimization. For more information, check out the tutorial and the examples directory. ...
    Downloads: 2 This Week
    Last Update:
    See Project
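    A minimal example of the reverse-mode usage described above, adapted from the project's README; the tanh function and the finite-difference check are illustrative.
    ```python
    import autograd.numpy as np   # thinly wrapped NumPy
    from autograd import grad     # builds a gradient function via backpropagation

    def tanh(x):
        return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

    grad_tanh = grad(tanh)        # gradient of a scalar-valued function
    print(grad_tanh(1.0))
    print((tanh(1.0001) - tanh(0.9999)) / 0.0002)   # finite-difference sanity check
    ```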
  • 3
    Karpathy-Inspired Claude Code Guidelines

    A single CLAUDE.md file to improve Claude Code behavior

    ...The project organizes a progressive path through exercises, notebooks, code examples, and practical mini-projects that echo Karpathy’s approach to “learning by doing,” where students build core concepts from first principles rather than consuming superficial abstractions. It covers topics like implementing backpropagation from scratch (an illustrative toy of that exercise follows this entry), understanding convolutional and recurrent networks, building simple training loops, and exploring real datasets with hands-on code. This collection makes abstract theoretical ideas concrete by walking learners through real code and tangible outcomes, helping demystify parts of machine learning that often feel opaque in purely textbook settings.
    Downloads: 4 This Week
    Last Update:
    See Project
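    The repository's own materials are not reproduced here; the snippet below is just an illustrative "backpropagation from scratch" toy in the spirit described above, with the chain rule written out by hand for a one-hidden-layer network.
    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal((16, 2))
    y = (x[:, :1] * x[:, 1:] > 0).astype(float)       # toy XOR-like target
    W1 = rng.standard_normal((2, 8)) * 0.5
    W2 = rng.standard_normal((8, 1)) * 0.5

    for step in range(500):
        h = np.tanh(x @ W1)                            # forward pass
        pred = h @ W2
        loss = ((pred - y) ** 2).mean()
        d_pred = 2 * (pred - y) / len(x)               # backward pass, by hand
        d_W2 = h.T @ d_pred
        d_h = d_pred @ W2.T
        d_W1 = x.T @ (d_h * (1 - h ** 2))              # tanh'(z) = 1 - tanh(z)^2
        W1 -= 0.1 * d_W1
        W2 -= 0.1 * d_W2

    print(round(loss, 4))                              # loss decreases over training
    ```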
  • 4
    JAX

    Composable transformations of Python+NumPy programs

    ...It can differentiate through loops, branches, recursion, and closures, and it can take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation) via grad as well as forward-mode differentiation, and the two can be composed arbitrarily to any order (a minimal example follows this entry). What’s new is that JAX uses XLA to compile and run your NumPy programs on GPUs and TPUs. Compilation happens under the hood by default, with library calls getting just-in-time compiled and executed. But JAX also lets you just-in-time compile your own Python functions into XLA-optimized kernels using a one-function API, jit. ...
    Downloads: 0 This Week
    Last Update:
    See Project
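    A minimal sketch of composing grad and jit as described above; the small tanh-regression loss is illustrative, not taken from the JAX docs.
    ```python
    import jax.numpy as jnp
    from jax import grad, jit

    def loss(w, x, y):
        pred = jnp.tanh(x @ w)                 # a tiny linear-plus-tanh model
        return jnp.mean((pred - y) ** 2)

    grad_loss = jit(grad(loss))                # reverse-mode gradient, JIT-compiled via XLA
    w = jnp.zeros(3)
    x = jnp.ones((8, 3))
    y = jnp.ones(8)
    print(grad_loss(w, x, y))                  # gradient with respect to w, shape (3,)
    ```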
  • 5

    irixjoker-ai-utils

    An AI library, e.g. for IRIX emulation

    Includes object-oriented neural networks (for example, backpropagation and feed-forward)
    Downloads: 1 This Week
    Last Update:
    See Project
  • 6
    PyTorch Implementation of SDE Solvers

    Differentiable SDE solvers with GPU support and efficient sensitivity

    This library provides stochastic differential equation (SDE) solvers with GPU support and efficient backpropagation (a hedged usage sketch follows this entry). examples/demo.ipynb gives a short guide on how to solve SDEs, including subtle points such as fixing the randomness in the solver and the choice of noise types. examples/latent_sde.py learns a latent stochastic differential equation, as in Section 5 of [1]. The example fits an SDE to data, whilst regularizing it to be like an Ornstein-Uhlenbeck prior process.
    Downloads: 0 This Week
    Last Update:
    See Project
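    A hedged sketch of the sdeint entry point with diagonal noise; the drift and diffusion choices below (a toy Ornstein-Uhlenbeck-like process) are illustrative, and examples/demo.ipynb remains the authoritative guide.
    ```python
    import torch
    import torchsde

    class ToySDE(torch.nn.Module):
        noise_type = "diagonal"   # diffusion returns one noise scale per state dimension
        sde_type = "ito"

        def f(self, t, y):        # drift: pull the state back toward zero
            return -y

        def g(self, t, y):        # diffusion: constant noise magnitude
            return 0.3 * torch.ones_like(y)

    sde = ToySDE()
    y0 = torch.full((4, 1), 0.5)              # batch of 4 trajectories, state size 1
    ts = torch.linspace(0.0, 1.0, steps=20)
    ys = torchsde.sdeint(sde, y0, ts)         # shape (20, 4, 1), differentiable w.r.t. y0
    print(ys.shape)
    ```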
  • 7
    PyTorch Transfer-Learning-Library

    Transfer Learning Library for Domain Adaptation, Task Adaptation, etc.

    TLlib is an open-source and well-documented library for Transfer Learning. It is based on pure PyTorch with high performance and a friendly API. Our code is pythonic, and the design is consistent with torchvision. You can easily develop new algorithms or readily apply existing ones. We appreciate all contributions. If you are planning to contribute bug fixes, please do so without any further discussion. If you plan to contribute new features, utility functions, or extensions, please...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 8
    Deep Learning course

    Slides and Jupyter notebooks for the Deep Learning lectures

    Slides and Jupyter notebooks for the Deep Learning lectures of the Master Year 2 Data Science program at Institut Polytechnique de Paris (IP Paris). Note: press "P" to display the presenter's notes, which include some comments and additional references. This lecture series is built and maintained by Olivier Grisel and Charles Ollion.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 9
    Keras TCN

    Keras Temporal Convolutional Network

    ...MNIST, Adding Problem, Copy Memory, Word-level PTB...). Parallelism (convolutional layers), flexible receptive field size (it is possible to specify how far back the model can see), and stable gradients (no backpropagation through time, so fewer vanishing-gradient issues). The usual way is to import the TCN layer and use it inside a Keras model (a sketch follows this entry). The receptive field is defined as the maximum number of steps back in time from the current sample at time T that a filter from (block, layer, stack, TCN) can hit (effective history), plus 1. The receptive field of the TCN can be calculated. ...
    Downloads: 1 This Week
    Last Update:
    See Project
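    A hedged sketch of the usual import-the-layer pattern; the TCN layer and keras-tcn package come from this project, while the toy data and hyperparameters below are illustrative.
    ```python
    import numpy as np
    from tensorflow.keras import Sequential
    from tensorflow.keras.layers import Dense
    from tcn import TCN                            # pip install keras-tcn

    timesteps, features = 20, 1
    model = Sequential([
        TCN(input_shape=(timesteps, features)),   # temporal convolutional block
        Dense(1),                                  # e.g. a regression head
    ])
    model.compile(optimizer="adam", loss="mse")

    x = np.random.rand(64, timesteps, features)
    y = x.sum(axis=1)                              # toy "adding problem"-style target
    model.fit(x, y, epochs=1, verbose=0)
    ```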
  • 10
    Java Neural Network Framework Neuroph
    Neuroph is a lightweight Java neural network framework which can be used to develop common neural network architectures. A small number of basic classes corresponding to core NN concepts, together with a GUI editor, makes it easy to learn and use.
    Downloads: 71 This Week
    Last Update:
    See Project
  • 11

    Multidimensional Neural Network

    In fully connected backpropagation neural networks with many layers and many neurons per layer, there is a problem known as the vanishing gradient problem. One way to lower its magnitude is to use a network that is not fully connected; in that case, it has to be decided which neurons from the previous layer each neuron connects to. The simplest solution is to use a Cartesian coordinate system and treat layers as one-dimensional lines, two-dimensional rectangles, or three-, four-, five-, ... dimensional cuboids (a toy illustration follows this entry). ...
    Downloads: 0 This Week
    Last Update:
    See Project
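    Not this project's code: a toy illustration of the coordinate-based pruning idea above, where a neuron connects only to previous-layer neurons that lie nearby when both layers are placed on a one-dimensional coordinate line.
    ```python
    import numpy as np

    def local_connectivity_mask(n_prev, n_curr, radius=0.2):
        prev_pos = np.linspace(0.0, 1.0, n_prev)        # coordinates of previous layer
        curr_pos = np.linspace(0.0, 1.0, n_curr)        # coordinates of current layer
        dist = np.abs(curr_pos[:, None] - prev_pos[None, :])
        return (dist <= radius).astype(float)           # 1 = connected, 0 = pruned

    mask = local_connectivity_mask(8, 4)
    weights = np.random.randn(4, 8) * mask              # pruned connections stay at zero
    print(mask)
    ```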
  • 12
    Verify authenticity of handwritten signatures through digital image processing and neural networks.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 13

    NARX simulator with neural networks

    A simulator for NARX (Nonlinear AutoRegressive with eXogenous inputs)

    This project aims at creating a simulator for the NARX (Nonlinear AutoRegressive with eXogenous inputs) architecture with neural networks. The system can fall back to an MLP (multi-layer perceptron), TDNN (time delay neural network), BPTT (backpropagation through time), or a full NARX architecture. The system is intended to be used as a time series forecaster for educational purposes (a minimal sketch of the NARX idea follows this entry). This project is my personal master's thesis, developed in the Master of Artificial Intelligence at Universitat Polytecnica de Catalunya, Barcelona. The project presents artificially generated data as well as real-data tests from temperature, weather, and economic series. ...
    Downloads: 0 This Week
    Last Update:
    See Project
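    Not this project's code: a minimal sketch of the NARX idea, where the next output is predicted from delayed outputs y(t-1..t-d) and delayed exogenous inputs x(t-1..t-d) fed into a small neural network (here scikit-learn's MLP, used purely for illustration).
    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def make_narx_dataset(x, y, delay=3):
        rows, targets = [], []
        for t in range(delay, len(y)):
            rows.append(np.concatenate([y[t - delay:t], x[t - delay:t]]))  # lagged y and x
            targets.append(y[t])
        return np.array(rows), np.array(targets)

    t = np.linspace(0, 20, 400)
    x = np.sin(t)                                      # exogenous input
    y = 0.6 * np.concatenate([[0.0], x[:-1]]) + 0.1    # toy series driven by lagged x
    X, T = make_narx_dataset(x, y)
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, T)
    print(model.predict(X[:3]))                        # one-step-ahead forecasts
    ```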
  • 14
    Multiple Back-Propagation (with CUDA)

    Open source software for training neural networks

    Multiple Back-Propagation is an open source software application for training neural networks with the backpropagation and multiple back-propagation algorithms. Currently this project is also hosted at http://code.google.com/p/multiplebackpropagation
    Downloads: 1 This Week
    Last Update:
    See Project
  • 15
    CRFasRNN

    Semantic image segmentation method described in the ICCV 2015 paper

    CRF-RNN is a deep neural architecture that integrates fully connected Conditional Random Fields (CRFs) with Convolutional Neural Networks (CNNs) by reformulating mean-field CRF inference as a Recurrent Neural Network (a toy sketch of that mean-field loop follows this entry). This fusion enables end-to-end training via backpropagation for semantic image segmentation tasks, eliminating the need for separate, offline post-processing steps. Our work allows computers to recognize objects in images; what is distinctive about it is that we also recover the 2D outline of the objects. Currently we have trained this model to recognize 20 classes. This software allows you to test our algorithm on your own images – have a try and see if you can fool it; if you get some good examples you can send them to us. ...
    Downloads: 0 This Week
    Last Update:
    See Project
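    Not the CRF-RNN code: a toy sketch of the mean-field CRF inference loop that the paper unrolls into recurrent network steps, simplified here to a single Gaussian spatial kernel and a Potts compatibility term.
    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def mean_field(unary, iterations=5, sigma=3.0, compat=1.0):
        """unary: (num_classes, H, W) negative log scores, e.g. from a CNN."""
        q = np.exp(-unary)
        q /= q.sum(axis=0, keepdims=True)                       # initialize with a softmax
        for _ in range(iterations):
            msg = np.stack([gaussian_filter(q[c], sigma) for c in range(q.shape[0])])
            pairwise = compat * (msg.sum(axis=0, keepdims=True) - msg)  # Potts penalty
            q = np.exp(-unary - pairwise)
            q /= q.sum(axis=0, keepdims=True)                   # renormalize per pixel
        return q

    unary = np.random.rand(3, 32, 32)            # 3 classes, toy scores
    labels = mean_field(unary).argmax(axis=0)    # per-pixel label map
    print(labels.shape)
    ```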
  • 16
    DeepLearnToolbox

    Matlab/Octave toolbox for deep learning

    DeepLearnToolbox is a MATLAB / Octave toolbox for prototyping deep learning models. It provides implementations of feedforward neural networks, convolutional neural networks (CNNs), deep belief networks (DBNs), stacked autoencoders, convolutional autoencoders, and more. The toolbox includes example scripts for each method, enabling users to quickly experiment with architectures, training, and inference workflows. Although it's been flagged as deprecated and no longer actively maintained, it...
    Downloads: 3 This Week
    Last Update:
    See Project
  • 17

    C/C++ Neural Networks

    A C API for working with Neural Networks

    A free C library for working with FeedForward Neural Networks, Neurons and Perceptrons
    Downloads: 0 This Week
    Last Update:
    See Project
  • 18
    Neural Libs

    Neural network library for developers

    This project includes implementations of MLP, RBF, SOM, and Hopfield neural networks in several popular programming languages. The project also includes examples of using neural networks for function approximation and time series prediction. It includes a special program that makes it easy to test a neural network on training data and to optimize the network.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 19
    LensOSX

    LensOSX: the light, efficient network simulator

    Lens is the light, efficient network simulator, written by Doug Rohde. LensOSX is a native MacOSX port of Lens that runs on MacOSX 10.5 or higher, created by Harm Brouwer, Daniel de Kok and Hartmut Fitz.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 20
    NDN Backprop Neural Net Trainer implements the backpropagation functionality subset of the open source NeuronDotNet object library in a generic, user-friendly application.
    Downloads: 3 This Week
    Last Update:
    See Project
  • 21
    OSXtlearn

    OSXtlearn: tlearn for MacOSX

    tlearn is a backpropagation neural network simulator, written by Jeff Elman. xtlearn is a version of tlearn for the X Window System. OSXtlearn is xtlearn wrapped in a MacOSX application bundle that runs on MacOSX 10.5 or higher and requires XQuartz. OSXtlearn was created by Harm Brouwer.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 22
    eANN

    eANN is an implementation of several kinds of neural networks.

    eANN is an implementation of several kinds of neural networks, written with the intention of providing a (hopefully) easy to use, and easy to modify, OOP source code. It is possible to have several different sized networks running simultaneously, each functioning independently of the others or acting as inputs between them. It is also easy to modify the structure so that neurons (or even whole layers) can be created/pruned during simulation, allowing dynamic expansion/contraction of the network.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 23

    BPNNet

    Backpropagation neural network simulator

    Downloads: 0 This Week
    Last Update:
    See Project
  • 24

    DropoutMLP

    A Multi-Layer Perceptron with Dropout

    This project was derived from the following project on CCodeChamp by MR CODER. http://www.ccodechamp.com/c-program-of-multilayer-perceptron-net-using-backpropagation/ I rewrote the project in C++ and made it more object oriented. Then, I added the capability to use dropout on the hidden layers as specified by Geoffrey Hinton et al. in "Improving neural networks by preventing co-adaptation of feature detectors" (2012). http://www.cs.toronto.edu/~hinton/absps/dropout.pdf A minimal sketch of the dropout mechanism follows this entry. I found that it performed much worse with dropout. ...
    Downloads: 0 This Week
    Last Update:
    See Project
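    Not this project's C++ code: a minimal sketch of dropout on a hidden activation in the sense of Hinton et al. (2012), written here with the common "inverted dropout" rescaling so that no change is needed at test time.
    ```python
    import numpy as np

    def dropout(h, p_drop=0.5, training=True, rng=np.random.default_rng(0)):
        if not training:
            return h                                   # use the full network at test time
        mask = rng.random(h.shape) >= p_drop           # keep each unit with prob 1 - p_drop
        return h * mask / (1.0 - p_drop)               # rescale so expectations match

    hidden = np.random.rand(4, 8)                      # a batch of hidden activations
    print(dropout(hidden))
    ```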
  • 25
    cCNN

    A fast implementation of LeCun's convolutional neural network

    The code of this library is partially based on the myCNN MATLAB class written by Nikolay Chemurin.
    Downloads: 0 This Week
    Last Update:
    See Project