Rust port of Karpathy's micrograd & associated stuff.
Transparent calculations with uncertainties on the quantities involved (aka "error propagation"); calculation of derivatives.
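First-order error propagation of the kind this entry describes can be sketched in a few lines: each quantity carries a value and a standard uncertainty, and arithmetic combines the uncertainties through the derivatives of the operation. The `UQuantity` class below is a hypothetical minimal illustration (not the library's actual API), and it assumes the inputs are independent.

```python
import math

class UQuantity:
    """A value with a standard uncertainty, propagated to first order."""
    def __init__(self, value, sigma):
        self.value = value
        self.sigma = sigma  # standard uncertainty

    def __add__(self, other):
        # For independent x, y: sigma_f = sqrt(sx^2 + sy^2)
        return UQuantity(self.value + other.value,
                         math.hypot(self.sigma, other.sigma))

    def __mul__(self, other):
        # sigma_f^2 = (df/dx * sx)^2 + (df/dy * sy)^2 with f = x*y
        s = math.hypot(other.value * self.sigma, self.value * other.sigma)
        return UQuantity(self.value * other.value, s)

x = UQuantity(2.0, 0.1)
y = UQuantity(3.0, 0.2)
z = x * y
# z.value == 6.0, z.sigma == 0.5
```

Note that reusing the same quantity twice in one expression (e.g. `x * y + x`) introduces correlation that this independent-inputs formula ignores; full libraries track correlations explicitly.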
Differentiable underwater vehicle dynamics in body and NED frames (Euler and quaternion formulations).
Automatic differentiation made easier for C++.
toydl: toy deep learning algorithm implementations, backed by a self-implemented toy torch.
🚢 Portable development environment for Enzyme
Forward mode automatic differentiation for Fortran
Dualitic is a Python package for forward mode automatic differentiation using dual numbers.
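Forward-mode AD with dual numbers, as several entries above implement, rests on the identity f(a + bε) = f(a) + f'(a)bε with ε² = 0. A minimal sketch (the `Dual` class name and fields are illustrative, not any particular package's API):

```python
class Dual:
    """Dual number a + b*eps, where eps**2 == 0 carries the derivative."""
    def __init__(self, val, eps=0.0):
        self.val = val
        self.eps = eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.eps + other.eps)

    def __mul__(self, other):
        # Product rule: (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.eps + self.eps * other.val)

def f(x):
    return x * x * x + x  # f(x) = x^3 + x, so f'(x) = 3x^2 + 1

x = Dual(2.0, 1.0)  # seed the derivative with 1 to differentiate w.r.t. x
y = f(x)
# y.val == 10.0 (f(2)), y.eps == 13.0 (f'(2))
```

Evaluating `f` once on a seeded dual number yields both the value and the exact derivative, with no symbolic manipulation or finite-difference error.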
Drop-in autodiff for NumPy.
Lightweight automatic differentiation and error propagation library
A minimalist neural networks library built on a tiny autograd engine
A tiny autograd library made for educational purposes.
Deep learning in Rust, with shape checked tensors and neural networks
[wip] Lightweight automatic differentiation & deep learning framework implemented in pure Julia.
Simple neural network and automatic differentiation implementation
A brief (and inaccurate) history of derivatives, with a brief (and incomplete) Python implementation
micrograd (smol autodiff lib by @karpathy) ported into Haskell and Swift
A simple and highly extensible computational graph library written in C++ with support for autodiff.
Yet another tensor automatic differentiation framework
My implementation of Andrej Karpathy's micrograd library for backpropagation and simple neural net training
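The micrograd-style reverse-mode engines listed above all share the same core: each operation records its inputs and a closure that pushes gradients backward, and `backward()` walks the graph in reverse topological order. A minimal sketch in the spirit of Karpathy's `Value` class (a simplification, not the exact micrograd code):

```python
class Value:
    """Scalar with autograd: records children and a local backward rule."""
    def __init__(self, data, children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():  # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():  # product rule
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a = Value(2.0)
b = Value(3.0)
c = a * b + a  # dc/da = b + 1 = 4, dc/db = a = 2
c.backward()
# a.grad == 4.0, b.grad == 2.0
```

Accumulating with `+=` in each `_backward` is what lets a node that is used twice (like `a` here) receive gradient contributions from both paths.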