GradEngine (crate: `grad_engine`) is a Rust-based implementation of an automatic differentiation engine coupled with a computational graph visualizer.
The project implements the core components that power modern neural network training while giving users an inside view of how those components fit together to form a coherent system.
For a quick visualization, you can run the `small` example to generate a small computation graph:
```sh
git clone https://github.com/Hy-LeTuan/grad-engine.git
cd grad-engine

# the export arg exports the graph to /output
cargo run --example small -- export

# or try out the example from my blog
cargo run --example blog -- export

# create the animation
cd animation
manim -pqh main.py CreateAcyclicGraph
```

which will produce and play a video of the full creation and execution process of the computation graph.
Greatly inspired by PyTorch's design, GradEngine builds its computation graph dynamically, and while simplified compared to production systems, it demonstrates the essential mechanics of backpropagation and dynamic graph construction through the use of Computation Nodes.
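To make those mechanics concrete, here is a minimal, self-contained sketch of dynamic graph construction over plain `f64` scalars. None of these types come from `grad_engine`; they only illustrate the general technique of recording parents and a local backward rule as each operation runs:

```rust
use std::rc::Rc;

// Illustrative only, not GradEngine's real types: each operation allocates a
// node that records its parent nodes and a local backward rule, so the graph
// is built dynamically, as a side effect of running the forward pass.
struct Node {
    name: &'static str,
    value: f64,
    parents: Vec<Rc<Node>>,
    // given dL/d(self), produce dL/d(parent) for each parent (chain rule)
    backward: Box<dyn Fn(f64) -> Vec<f64>>,
}

fn leaf(name: &'static str, value: f64) -> Rc<Node> {
    Rc::new(Node { name, value, parents: vec![], backward: Box::new(|_: f64| vec![]) })
}

fn mul(a: Rc<Node>, b: Rc<Node>) -> Rc<Node> {
    let (av, bv) = (a.value, b.value);
    Rc::new(Node {
        name: "mul",
        value: av * bv,
        parents: vec![a, b],
        // d(a*b)/da = b and d(a*b)/db = a, each scaled by the incoming gradient
        backward: Box::new(move |g: f64| vec![g * bv, g * av]),
    })
}

// walk from the output toward the leaves, applying the chain rule at each node
fn backprop(node: &Node, grad: f64) {
    if node.parents.is_empty() {
        // a real engine accumulates the gradient here instead of printing it
        println!("d/d{} = {}", node.name, grad);
        return;
    }
    for (parent, g) in node.parents.iter().zip((node.backward)(grad)) {
        backprop(parent, g);
    }
}

fn main() {
    let (x, y) = (leaf("x", 2.0), leaf("y", 3.0));
    let z = mul(x, y); // the graph for z = x * y now exists
    backprop(&z, 1.0); // seed with dz/dz = 1; prints d/dx = 3, d/dy = 2
}
```

The leaf case is where a real engine accumulates gradients into tensors rather than printing them, which is the role the `GradAccum` nodes play in the visualizations further down.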
GradEngine started as part of a deeper dive into the internals of modern deep learning frameworks, aiming to build both understanding and tooling by learning from popular, highly optimized frameworks.
If you are interested in how the engine is laid out from scratch, GradEngine will be accompanied by a series of blog posts explaining both the internal mathematics of an automatic differentiation engine and the engineering choices I've made. This blog series will be available on my personal portfolio.
The project has two separate components, each with its own set of prerequisites and its own installation process. The components, and by extension their prerequisites, can be installed and run independently of each other.
The `grad_engine` crate requires:
- Rust edition 2024 (rustc 1.85.0 or later)
- Cargo edition 2024 (cargo 1.85.0 or later)
The visualizer, located in `animation`, requires:
- Python 3.11 or later
- Manim 0.19 or later, if you are installing it separately from pip
- Manim's dependencies, for which you can refer to Manim's installation guidelines:
  - CMake
  - pkgconfig
  - pangocairo
```sh
git clone https://github.com/Hy-LeTuan/grad-engine
cd grad-engine
cargo build

# to test the functionality of all backward nodes
cargo test
```

Before installing `animation`, I recommend creating a virtual environment first. Locally, I used conda to create the virtual environment, but Manim suggests using uv, which you can install here.
```sh
# manim is included in requirements.txt
cd animation
pip install -r requirements.txt

# if you're using uv
uv pip install -r requirements.txt
```

The project has two components, offering vastly different features:
- The `grad_engine` library, located in `/src`, gives you access to tensor creation, tensor operations, computation graph execution, and computation graph export.
- The visualizer, located in `/animation`, lets you create animations of how the computation graph is created and executed, based on JSON exports of the computation graphs.
I have built some example computation graphs in `/examples`, which you can run with
```sh
# cargo run --example [example_name]
cargo run --example small
cargo run --example large

# the export arg exports the computation graph to /output
cargo run --example small -- export
cargo run --example large -- export
```

and the computation graphs will be exported to their designated locations. You will also see a terminal-based visualization of the graph right after running these commands.
Visualize these examples through
```sh
cd animation
manim -pqh main.py CreateAcyclicGraph # for a high quality render
manim -pql main.py CreateAcyclicGraph # for a low quality render
```

Create any tensor through the `tensor!` macro:
```rust
// requires_grad determines whether the tensor is added to the computation graph

// creating a 1D tensor
let tensor = tensor!(1, 2, 3, 4, 5; requires_grad=true);

// creating a 2D tensor
let tensor = tensor!([1, 2, 3], [4, 5, 6]; requires_grad=false);

// creating a 3D tensor
let tensor = tensor!([[1, 2, 3], [4, 5, 6]], [[7, 8, 9], [10, 11, 12]]; requires_grad=true);
```

GradEngine offers a variety of tensor operations, which can be invoked directly through operators like `+` and `-`, through functions like `matmul`, or through methods like `ln`.
```rust
use grad_engine::tensor;
use grad_engine::ops::public_ops::matmul::matmul;

let x1 = tensor!(1, 2, 3, 4, 5; requires_grad=true);
let x2 = tensor!(0.1, 0.2, 0.3, 0.4, 0.5);

// references are used so that z1 does not take ownership of x1 and x2
let z1 = &x1 + &x2.ln();

// matrix multiplication
let z2 = matmul(&x1, &x2);
```

Any tensor that is not a leaf tensor can invoke the `backward()` method to start the backpropagation process. The computation graph is traversed during this process, and it is retained after `backward()` completes so that it can be exported.
```rust
use grad_engine::tensor;
use grad_engine::ops::public_ops::matmul::matmul;
use grad_engine::tensor_core::tensor::Tensor;

let x1 = tensor!(1, 2, 3, 4, 5; requires_grad=true);
let x2 = tensor!(0.1, 0.2, 0.3, 0.4, 0.5);

// matrix multiplication
let z = matmul(&x1, &x2);

// the second parameter is set to `true` to retain the computation graph
z.backward(Tensor::ones_like(&z, None), true);

// export the graph by calling export_graph_acyclic; it will be stored in /output
export_graph_acyclic(&z, None);
```

I made two modes of visualizing the computation graph: a compact visualization, which renders the graph top-down directly in the terminal, and a full visualization, which involves creating the animation in `animation`.
Visualize directly in the command line
```rust
use grad_engine::tensor;
use grad_engine::graph::visualize::visualizer::Visualizer;
use grad_engine::graph::visualize::visualizer::VisualizerTrait;

// ...

z.backward(Tensor::ones_like(&z, None), true);
Visualizer::visualize_graph(&z);
```

In the case of the `large.rs` example, the visualized graph would look something like this:
```
## Backward computation graph ##
------------------------------
AddBackward [ 2 child nodes ]
├── LnBackward [ 1 child nodes ]
│   └── SubBackward [ 2 child nodes ]
│       ├── MatmulBackward [ 2 child nodes ]
│       │   ├── AddBackward [ 1 child nodes ]
│       │   │   └── GradAccum [ Gradient accumulation ]
│       │   └── SubBackward [ 2 child nodes ]
│       │       ├── GradAccum [ Gradient accumulation ]
│       │       └── GradAccum [ Gradient accumulation ]
│       └── GradAccum [ Gradient accumulation ]
└── ExpBackward [ 1 child nodes ]
    └── GradAccum [ Gradient accumulation ]
------------------------------
```

Creating the animation of the graph is a bit more complicated, as you'd have to use the manim interface.
```sh
cd animation

# CreateAcyclicGraph is the scene name
manim -pqh main.py CreateAcyclicGraph
```

This will read the graph stored at `output/` and compute the animation. If your graph is stored somewhere else, you'd have to modify the path in `animation/graph/parse_utils.py`. The animation will be stored in `animation/media/videos/main/1080p60`, and you can then play it with any mp4 player.
Animation from running the `large` example:


