MicroTensor

A reverse-mode automatic differentiation engine built from scratch in C++, inspired by Andrej Karpathy's micrograd and extended with N-dimensional tensor support.

The demo trains a 2-layer MLP on XOR using only this engine; no ML frameworks are involved.

Demo

epoch 0  loss: 0.488825
epoch 100  loss: 0.397794
epoch 200  loss: 0.330343
epoch 300  loss: 0.281766
epoch 400  loss: 0.234652
epoch 500  loss: 0.177522
epoch 600  loss: 0.113335
epoch 700  loss: 0.0671726
epoch 800  loss: 0.0403612
epoch 900  loss: 0.0224028
epoch 1000  loss: 0.0111374

Final predictions:
Input [0,0] -> 0  (expected 0)
Input [0,1] -> 0.838601  (expected 1)
Input [1,0] -> 0.986735  (expected 1)
Input [1,1] -> 0.134124  (expected 0)

What it implements

  • N-dimensional tensors backed by a flat std::vector<float> with shape and stride tracking
  • Dynamic computation graph built during the forward pass
  • Reverse-mode automatic differentiation (backpropagation)
  • Elementwise add and multiply with backward pass
  • Matrix multiplication with backward pass
  • ReLU activation with backward pass
  • MSE loss
  • SGD optimizer

Project structure

MicroTensor/
├── tensor.h / tensor.cpp    # Tensor class, ops, backward traversal
├── sgd.h / sgd.cpp          # SGD optimizer
├── main.cpp                 # XOR training demo
└── README.md

Building

g++ main.cpp tensor.cpp sgd.cpp -o main
./main

Roadmap

  • N-dimensional tensor with strides
  • Elementwise add, multiply
  • Matrix multiplication
  • ReLU activation
  • MSE loss
  • Backward pass with topological sort
  • SGD optimizer
  • XOR demo
  • Bias addition with broadcasting
  • Sigmoid activation
  • Train on a larger dataset (MNIST)
  • CUDA backend

References

  • Andrej Karpathy, micrograd
