# DNN From Scratch
This project implements a deep neural network (DNN) from scratch in Rust, focusing on two key experiments: image classification on the MNIST dataset and signal-strength-based predictions using an RSSI dataset. The project demonstrates building and training a neural network without relying on external machine learning libraries.
## Features
- Custom Neural Network Implementation: Build and train DNNs using only Rust libraries and custom modules.
- Examples for Two Experiments:
  - MNIST Dataset: Handwritten digit classification.
  - RSSI Dataset: Analysis and predictions based on signal strength data.
- Modular Codebase: Cleanly separated concerns such as activation functions, loss computation, and neural network architecture.
- Visualization: Generate reports and plots to visualize experiment results.
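To give a flavor of what "from scratch" means here, an activation module in the spirit of `activation.rs` can be written with the standard library alone. The function names and signatures below are illustrative sketches, not the library's actual API:

```rust
/// Rectified Linear Unit: max(0, x), applied element-wise.
/// A minimal sketch; the real `activation.rs` API may differ.
fn relu(input: &[f64]) -> Vec<f64> {
    input.iter().map(|&x| x.max(0.0)).collect()
}

/// Derivative of ReLU, needed during backpropagation:
/// 1 where the input was positive, 0 elsewhere.
fn relu_derivative(input: &[f64]) -> Vec<f64> {
    input
        .iter()
        .map(|&x| if x > 0.0 { 1.0 } else { 0.0 })
        .collect()
}

fn main() {
    let x = vec![-1.5, 0.0, 2.0];
    println!("{:?}", relu(&x));            // [0.0, 0.0, 2.0]
    println!("{:?}", relu_derivative(&x)); // [0.0, 0.0, 1.0]
}
```

Keeping the forward pass and its derivative side by side like this is what lets the backward pass be assembled layer by layer without an autodiff framework.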
## Directory Structure
```
dnn-from-scratch/
├── README.md                    # Project overview and instructions
├── Cargo.toml                   # Project dependencies and configuration
├── LICENSE                      # License information
├── assets/                      # Datasets and auxiliary data
│   ├── mnist/
│   │   ├── x_test.npy           # MNIST test images
│   │   ├── x_train.npy          # MNIST training images
│   │   ├── y_test.npy           # MNIST test labels
│   │   └── y_train.npy          # MNIST training labels
│   └── rssi/
│       └── rssi-dataset.csv     # RSSI dataset
├── dnn_from_scratch/            # Core library for the neural network
│   ├── Cargo.toml               # Library-specific dependencies
│   └── src/
│       ├── activation.rs        # Activation functions
│       ├── fully_connected.rs   # Fully connected layer module
│       ├── lib.rs               # Entry point for the library
│       ├── loss.rs              # Loss functions
│       ├── neural_network.rs    # Neural network definition
│       ├── optimizer.rs         # Optimizer implementations (e.g., Adam)
│       ├── report.rs            # Reporting and result output
│       ├── utils.rs             # Utility functions for regression/classification
│       └── weights_initializer.rs # Weight initialization strategies
└── src/                         # Main application for experiments
    ├── main.rs                  # Entry point for the executable
    ├── mnist_experiment/        # MNIST experiment-related modules
    │   ├── dataset_setup.rs     # MNIST dataset preprocessing
    │   ├── mod.rs               # MNIST experiment module entry point
    │   └── plot.rs              # Plotting results for the MNIST experiment
    └── rssi_experiment/         # RSSI experiment-related modules
        ├── dataset_setup.rs     # RSSI dataset preprocessing
        ├── mod.rs               # RSSI experiment module entry point
        └── plot.rs              # Plotting results for the RSSI experiment
```
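As an example of what a module like `weights_initializer.rs` might contain, here is a sketch of Xavier/Glorot uniform initialization. The `Lcg` random source and every name below are hypothetical, chosen only to keep the example dependency-free:

```rust
/// Tiny linear congruential generator so the sketch needs no crates;
/// a real implementation would use a proper RNG.
struct Lcg(u64);

impl Lcg {
    /// Next pseudo-random f64, uniform in [0, 1).
    fn next_f64(&mut self) -> f64 {
        self.0 = self
            .0
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        (self.0 >> 11) as f64 / (1u64 << 53) as f64
    }
}

/// Xavier/Glorot uniform initialization: weights drawn from
/// U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)).
fn xavier_uniform(fan_in: usize, fan_out: usize, rng: &mut Lcg) -> Vec<f64> {
    let limit = (6.0 / (fan_in + fan_out) as f64).sqrt();
    (0..fan_in * fan_out)
        .map(|_| (rng.next_f64() * 2.0 - 1.0) * limit)
        .collect()
}

fn main() {
    let mut rng = Lcg(42);
    // e.g. a 784 -> 128 fully connected layer for MNIST.
    let w = xavier_uniform(784, 128, &mut rng);
    let limit = (6.0 / (784.0 + 128.0_f64)).sqrt();
    assert!(w.iter().all(|&x| x.abs() <= limit));
    println!("initialized {} weights, |w| <= {:.4}", w.len(), limit);
}
```

Scaling the initial weights to the layer's fan-in and fan-out keeps activations from exploding or vanishing in the first epochs, which matters most in a from-scratch setup with no batch normalization.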
## Getting Started
To build and run the experiments, follow these steps:

### Prerequisites

- Install Rust: [rust-lang.org](https://www.rust-lang.org)

### Clone the Repository

```sh
git clone https://github.com/akaTsunemori/dnn-from-scratch.git
cd dnn-from-scratch
```

### Build the Project

```sh
cargo build --release
```

### Run Experiments

```sh
cargo run --release
```
## Documentation
View the detailed API documentation online, or generate it locally:

```sh
cargo doc --release --workspace --no-deps --target-dir=docs
```
## Datasets
- MNIST Dataset:
  - Stored in `assets/mnist/`.
  - Preprocessed as `.npy` files for seamless integration.
- RSSI Dataset:
  - Found in `assets/rssi/rssi-dataset.csv`.
  - Contains signal strength data and coordinates (X, Y) for analysis.
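Since the project avoids external ML crates, even dataset loading can be done by hand. The sketch below parses RSSI-style CSV text with the standard library only; the column layout assumed here (target X, target Y, then RSSI readings) is illustrative, so check `assets/rssi/rssi-dataset.csv` for the real schema:

```rust
/// One row of an RSSI-style dataset. The field layout is an
/// assumption for this sketch, not the dataset's documented schema.
struct RssiSample {
    x: f64,
    y: f64,
    rssi: Vec<f64>,
}

/// Parse CSV text, skipping the header line.
fn parse_rssi_csv(text: &str) -> Vec<RssiSample> {
    text.lines()
        .skip(1) // header row
        .filter(|l| !l.trim().is_empty())
        .map(|line| {
            let fields: Vec<f64> = line
                .split(',')
                .map(|f| f.trim().parse().expect("numeric field"))
                .collect();
            RssiSample {
                x: fields[0],
                y: fields[1],
                rssi: fields[2..].to_vec(),
            }
        })
        .collect()
}

fn main() {
    // In the real project this text would come from the CSV file on disk.
    let text = "x,y,ap1,ap2\n1.0,2.0,-61.0,-70.0\n3.5,4.0,-55.0,-64.0\n";
    let samples = parse_rssi_csv(text);
    assert_eq!(samples.len(), 2);
    println!(
        "first sample: ({}, {}), {} readings",
        samples[0].x, samples[0].y, samples[0].rssi.len()
    );
}
```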
## Results & Reporting
Each experiment generates reports and plots showcasing:
- Training history.
- Model performance metrics (e.g., accuracy for MNIST, CDF of RMSE for RSSI).
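For the RSSI experiment, the CDF-of-RMSE metric can be computed along these lines. This is a standalone sketch, not the project's actual `report.rs` code, and it assumes the per-sample RMSE is the root mean of the squared errors over the two predicted coordinates:

```rust
/// Per-sample RMSE between predicted and true (X, Y) positions.
fn rmse(pred: &[(f64, f64)], truth: &[(f64, f64)]) -> Vec<f64> {
    pred.iter()
        .zip(truth)
        .map(|(p, t)| (((p.0 - t.0).powi(2) + (p.1 - t.1).powi(2)) / 2.0).sqrt())
        .collect()
}

/// Empirical CDF: for each sorted error value, the fraction of
/// samples with an error at or below it.
fn empirical_cdf(mut errors: Vec<f64>) -> Vec<(f64, f64)> {
    errors.sort_by(|a, b| a.partial_cmp(b).unwrap());
    let n = errors.len() as f64;
    errors
        .iter()
        .enumerate()
        .map(|(i, &e)| (e, (i + 1) as f64 / n))
        .collect()
}

fn main() {
    let pred = vec![(1.0, 1.0), (4.0, 4.0)];
    let truth = vec![(1.0, 2.0), (4.0, 4.0)];
    let cdf = empirical_cdf(rmse(&pred, &truth));
    println!("{:?}", cdf);
}
```

Plotting these (error, fraction) pairs yields the CDF curve: reading off, say, the 0.9 fraction gives the error bound that 90% of test samples stay under.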
Plots and reports are saved in the `output/` folder. A preview of the expected results for each experiment is shown below.
### MNIST Experiment

Preview of training history:

```
Epoch 1/100 | Train: Loss 2.0665, Accuracy 0.1013 | Test: Loss 1.9560, Accuracy 0.3710
(...)
Epoch 100/100 | Train: Loss 0.1369, Accuracy 0.9609 | Test: Loss 0.1517, Accuracy 0.9519
```

Output plot:


### RSSI Experiment

Preview of training history:

```
Epoch 1/2500 | Train: Loss 18046.3999, Error 134.3368 | Test: Loss 21310.2716, Error 145.9803
(...)
Epoch 2500/2500 | Train: Loss 5.5695, Error 2.3599 | Test: Loss 5.8578, Error 2.4202
```

Output plot:


## License
Licensed under the MIT License.
## Contributing
Contributions are welcome! Feel free to fork the repository and submit a pull request.
1. Fork the repo.
2. Create a new branch (`git checkout -b feature-name`).
3. Commit your changes (`git commit -m "Add feature"`).
4. Push to the branch (`git push origin feature-name`).
5. Open a pull request.
## Contact
Create an issue for any inquiries or support.
Enjoy building your neural networks from scratch!