A minimal implementation of autograd and neural networks, inspired by micrograd and PyTorch. This library is educational and is not meant for production use.
Install:
go get github.com/NoahSchiro/minigrad@latest
Import in your code:
// Depending on your use case, you may only need one of these
import "github.com/NoahSchiro/minigrad/tensor"
import "github.com/NoahSchiro/minigrad/ndarray"
import "github.com/NoahSchiro/minigrad/nn"
For development, clone the repository:
git clone [email protected]:NoahSchiro/minigrad.git
Run tests:
go test ./...
Currently there is one example, for the XOR problem, which can be found in cmd/xor.go.
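XOR is the classic smoke test for a tiny network: it is not linearly separable, so a single linear layer cannot fit it and a hidden layer with a nonlinearity is required. As a point of reference, here is the mapping the example has to learn, written as plain Go independent of minigrad's actual API:

```go
package main

import "fmt"

// XOR truth table. A single linear layer cannot fit this mapping
// (it is not linearly separable), which is what makes XOR a useful
// smoke test for a small multi-layer network.
var xorData = []struct {
	in  [2]float64
	out float64
}{
	{[2]float64{0, 0}, 0},
	{[2]float64{0, 1}, 1},
	{[2]float64{1, 0}, 1},
	{[2]float64{1, 1}, 0},
}

func main() {
	for _, sample := range xorData {
		fmt.Printf("%v XOR %v = %v\n", sample.in[0], sample.in[1], sample.out)
	}
}
```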
Features and roadmap:
- Autograd engine
- Linear layers
- RNN, LSTM (targeting v2.1)
- CNN (targeting v2.2)
- Sigmoid
- ReLU
- SoftMax (targeting v2.1)
- SGD optimizer
- Adam optimizer (targeting v2.1)
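For readers new to the idea, the autograd engine is the core of a library like this: every operation records how to propagate gradients back to its inputs, and a backward pass walks the computation graph in reverse. The sketch below is a generic, micrograd-style scalar engine in Go, written purely for illustration; none of these names correspond to minigrad's actual types.

```go
package main

import "fmt"

// Value is a scalar node in a computation graph, in the style of
// micrograd. This is a generic illustration, not minigrad's API.
type Value struct {
	Data     float64
	Grad     float64
	backward func()
	parents  []*Value
}

func New(data float64) *Value { return &Value{Data: data} }

// Add builds a node whose backward pass routes the incoming
// gradient unchanged to both parents.
func (a *Value) Add(b *Value) *Value {
	out := &Value{Data: a.Data + b.Data, parents: []*Value{a, b}}
	out.backward = func() {
		a.Grad += out.Grad
		b.Grad += out.Grad
	}
	return out
}

// Mul builds a node whose backward pass applies the product rule.
func (a *Value) Mul(b *Value) *Value {
	out := &Value{Data: a.Data * b.Data, parents: []*Value{a, b}}
	out.backward = func() {
		a.Grad += b.Data * out.Grad
		b.Grad += a.Data * out.Grad
	}
	return out
}

// Backward runs reverse-mode autodiff starting from this node.
func (v *Value) Backward() {
	// Topologically sort the graph so each node's gradient is
	// complete before it is propagated to its parents.
	var topo []*Value
	visited := map[*Value]bool{}
	var build func(n *Value)
	build = func(n *Value) {
		if visited[n] {
			return
		}
		visited[n] = true
		for _, p := range n.parents {
			build(p)
		}
		topo = append(topo, n)
	}
	build(v)

	v.Grad = 1
	for i := len(topo) - 1; i >= 0; i-- {
		if topo[i].backward != nil {
			topo[i].backward()
		}
	}
}

func main() {
	// d = a*b + a; dd/da = b + 1 = 4, dd/db = a = 2
	a, b := New(2), New(3)
	d := a.Mul(b).Add(a)
	d.Backward()
	fmt.Println(d.Data, a.Grad, b.Grad) // 8 4 2
}
```

Everything in the feature list above builds on this mechanism: a linear layer is a composition of multiply and add nodes over tensors rather than scalars, and SGD simply nudges each parameter by -lr * Grad after a backward pass.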
The goal is to keep the library under 1,000 lines of code, excluding test files.
| Language | Files | Lines | Blanks | Comments | Code |
|---|---|---|---|---|---|
| Go | 12 | 1016 | 163 | 74 | 779 |