Deep Learning Benchmarking



Project information
- Type: Software | Deep Learning
- Skills: Python | JAX | TensorFlow
- Models: MLP | CNN
- Code: https://github.com/WAT-ai/DL-framework-comparison
- Article: Deep Learning Framework Comparison
Building deep learning models in multiple frameworks across Python and Julia to compare performance and metrics between frameworks on standard datasets. To date, I have built a multi-layer perceptron (MLP) in JAX, trained it on the MNIST dataset, and evaluated its metrics.
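A minimal sketch of what an MLP classifier in JAX can look like, assuming a plain-`jax` implementation (parameter lists, `jax.grad`, and hand-rolled SGD rather than a higher-level library like Flax or Optax); the layer sizes, learning rate, and synthetic stand-in data are illustrative, not the project's actual configuration:

```python
import jax
import jax.numpy as jnp


def init_mlp(key, sizes):
    """He-initialized weights and zero biases for each layer."""
    params = []
    for m, n in zip(sizes[:-1], sizes[1:]):
        key, wkey = jax.random.split(key)
        params.append((jax.random.normal(wkey, (m, n)) * jnp.sqrt(2.0 / m),
                       jnp.zeros(n)))
    return params


def forward(params, x):
    """ReLU hidden layers, linear output layer (logits)."""
    for w, b in params[:-1]:
        x = jax.nn.relu(x @ w + b)
    w, b = params[-1]
    return x @ w + b


def loss_fn(params, x, y):
    """Mean cross-entropy against integer class labels."""
    log_probs = jax.nn.log_softmax(forward(params, x))
    return -jnp.mean(log_probs[jnp.arange(y.shape[0]), y])


@jax.jit
def train_step(params, x, y, lr=0.1):
    """One step of plain SGD on the cross-entropy loss."""
    grads = jax.grad(loss_fn)(params, x, y)
    return [(w - lr * gw, b - lr * gb)
            for (w, b), (gw, gb) in zip(params, grads)]


# Toy random data standing in for MNIST (784-dim inputs, 10 classes).
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (64, 784))
y = jax.random.randint(key, (64,), 0, 10)

params = init_mlp(key, [784, 128, 10])
for _ in range(10):
    params = train_step(params, x, y)
```

Because the parameters are an ordinary Python list of tuples, JAX treats them as a pytree, so `jax.grad` and `jax.jit` work on the whole structure without any extra registration.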