
minigrad

A minimal implementation of autograd and neural networks in Go, inspired by micrograd and PyTorch. This library is meant to be educational, not for production use.
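
How it works: as in micrograd, the core idea is a scalar value that records how it was computed, so gradients can be accumulated by replaying those operations in reverse. The self-contained Go sketch below illustrates that idea only; the Value, Add, Mul, and Backward names are hypothetical and are not minigrad's API.

package main

import "fmt"

// Value is one scalar node in a computation graph.
type Value struct {
	Data     float64
	Grad     float64
	backward func() // local gradient rule for the op that produced this node
	parents  []*Value
}

func New(x float64) *Value { return &Value{Data: x} }

func Add(a, b *Value) *Value {
	out := &Value{Data: a.Data + b.Data, parents: []*Value{a, b}}
	out.backward = func() {
		a.Grad += out.Grad
		b.Grad += out.Grad
	}
	return out
}

func Mul(a, b *Value) *Value {
	out := &Value{Data: a.Data * b.Data, parents: []*Value{a, b}}
	out.backward = func() {
		a.Grad += b.Data * out.Grad
		b.Grad += a.Data * out.Grad
	}
	return out
}

// Backward topologically sorts the graph rooted at out, seeds
// out.Grad = 1, and applies each node's local rule in reverse order.
func Backward(out *Value) {
	var topo []*Value
	visited := map[*Value]bool{}
	var build func(v *Value)
	build = func(v *Value) {
		if visited[v] {
			return
		}
		visited[v] = true
		for _, p := range v.parents {
			build(p)
		}
		topo = append(topo, v)
	}
	build(out)
	out.Grad = 1
	for i := len(topo) - 1; i >= 0; i-- {
		if topo[i].backward != nil {
			topo[i].backward()
		}
	}
}

func main() {
	// f = x*y + x, so df/dx = y + 1 = 4 and df/dy = x = 2.
	x, y := New(2), New(3)
	f := Add(Mul(x, y), x)
	Backward(f)
	fmt.Println(f.Data, x.Grad, y.Grad) // 8 4 2
}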

Install and dev work:

Install: go get github.com/NoahSchiro/minigrad@latest

Import in your code:

// You may only need one of these depending on your work
import "github.com/NoahSchiro/minigrad/tensor"
import "github.com/NoahSchiro/minigrad/ndarray"
import "github.com/NoahSchiro/minigrad/nn"

Dev work:

Clone: git clone [email protected]:NoahSchiro/minigrad.git

Run tests: go test ./...

Example:

Currently there is one example, the XOR problem, which can be found in cmd/xor.go. A library-free sketch of the same task follows below.
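
For reference, this plain-Go sketch trains a 2-4-1 sigmoid network on XOR with hand-written backprop. It does not use minigrad at all; it only shows the shape of the problem that cmd/xor.go solves with the library. The hyper-parameters (4 hidden units, learning rate 0.5, 20000 epochs) are illustrative, and convergence depends on the random seed.

package main

import (
	"fmt"
	"math"
	"math/rand"
)

func sigmoid(x float64) float64 { return 1.0 / (1.0 + math.Exp(-x)) }

func main() {
	rng := rand.New(rand.NewSource(1))

	const H = 4           // hidden units
	var w1 [H][2]float64  // input -> hidden weights
	var b1 [H]float64     // hidden biases
	var w2 [H]float64     // hidden -> output weights
	var b2 float64        // output bias
	for i := 0; i < H; i++ {
		w1[i][0] = rng.Float64()*2 - 1
		w1[i][1] = rng.Float64()*2 - 1
		w2[i] = rng.Float64()*2 - 1
	}

	// forward returns the hidden activations and the network output.
	forward := func(x [2]float64) ([H]float64, float64) {
		var h [H]float64
		z := b2
		for i := 0; i < H; i++ {
			h[i] = sigmoid(w1[i][0]*x[0] + w1[i][1]*x[1] + b1[i])
			z += w2[i] * h[i]
		}
		return h, sigmoid(z)
	}

	xs := [4][2]float64{{0, 0}, {0, 1}, {1, 0}, {1, 1}}
	ys := [4]float64{0, 1, 1, 0}
	lr := 0.5

	for epoch := 0; epoch < 20000; epoch++ {
		for k := range xs {
			x, y := xs[k], ys[k]
			h, out := forward(x)
			// Gradient of the squared error (out-y)^2 through the output sigmoid.
			dOut := 2 * (out - y) * out * (1 - out)
			for i := 0; i < H; i++ {
				dH := dOut * w2[i] * h[i] * (1 - h[i]) // uses w2 before it is updated
				w2[i] -= lr * dOut * h[i]
				w1[i][0] -= lr * dH * x[0]
				w1[i][1] -= lr * dH * x[1]
				b1[i] -= lr * dH
			}
			b2 -= lr * dOut
		}
	}

	for k := range xs {
		_, out := forward(xs[k])
		fmt.Printf("%v -> %.3f (want %v)\n", xs[k], out, ys[k])
	}
}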

Features and roadmap:

  • Autograd engine
  • Linear layers
  • RNN, LSTM (targeting v2.1)
  • CNN (targeting v2.2)
  • Sigmoid
  • ReLU
  • SoftMax (targeting v2.1)
  • SGD optimizer
  • Adam optimizer (targeting v2.1)

SCC Report

The goal is to keep the library under 1k lines of code, excluding test files.

Language   Files   Lines   Blanks   Comments   Code
Go         12      1016    163      74         779
