Learn How Industry-Standard Tools Are Built


Watch the video below to compare building a neural network in the Flamethrower framework vs. building the same network in PyTorch.

What You'll Build

Automatic Differentiation

You'll learn how automatic differentiation packages work, along with related concepts such as computational graphs, reverse-mode differentiation, backpropagation, nodes, variables, and tensors. You'll then implement your own automatic differentiation library, modeled on Autograd, completely from scratch!
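To give a feel for what's involved, here is a minimal sketch of reverse-mode automatic differentiation. The names here (`Variable`, `backward`, the `parents` list) are illustrative assumptions, not the API you'll build in the course:

```python
# Minimal reverse-mode autodiff sketch: each operation records its inputs
# and local gradients, building a computational graph; backward() then
# applies the chain rule from the output back to the leaves.
# All names are illustrative, not the course's actual API.

class Variable:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent node, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        # d(a + b)/da = 1, d(a + b)/db = 1
        return Variable(self.value + other.value,
                        parents=[(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a * b)/da = b, d(a * b)/db = a
        return Variable(self.value * other.value,
                        parents=[(self, other.value), (other, self.value)])

    def backward(self, grad=1.0):
        # Accumulate the incoming gradient, then propagate it to each
        # parent scaled by the recorded local gradient (chain rule).
        self.grad += grad
        for parent, local_grad in self.parents:
            parent.backward(grad * local_grad)

x = Variable(2.0)
y = Variable(3.0)
z = x * y + x      # z = x*y + x
z.backward()
print(x.grad)      # dz/dx = y + 1 = 4.0
print(y.grad)      # dz/dy = x = 2.0
```

Note that gradients accumulate additively (`self.grad += grad`), which is what makes a variable used in two places, like `x` above, receive the sum of both contributions.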

Neural Network Components

On top of your automatic differentiation library, you'll create a variety of neural network components that integrate with it. These will include neural network layers, batch normalization, dropout, regularizers, activation functions, and more, all from scratch!

Optimization Module

You'll implement an optimization module that pulls your deep learning library together. You'll learn how to code up gradient descent with and without momentum, implement loss functions, and create learning rate schedules. As before, this will all be done from scratch!
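To make those pieces concrete, here is a hedged sketch of SGD with momentum, a step-decay learning rate schedule, and a mean squared error loss. The class and function names are illustrative, not the module you'll build:

```python
import numpy as np

# Sketch of an optimization module; all names are illustrative
# assumptions, not the course's actual API.

class SGD:
    """Gradient descent with classical momentum:
    v <- momentum * v - lr * grad;  param <- param + v."""
    def __init__(self, lr=0.1, momentum=0.9):
        self.lr, self.momentum = lr, momentum
        self.velocity = {}  # one velocity buffer per parameter name

    def step(self, params, grads):
        for name in params:
            v = self.velocity.get(name, np.zeros_like(params[name]))
            v = self.momentum * v - self.lr * grads[name]
            self.velocity[name] = v
            params[name] = params[name] + v
        return params

def step_decay(base_lr, epoch, drop=0.5, every=10):
    """Learning rate schedule: multiply by `drop` every `every` epochs."""
    return base_lr * (drop ** (epoch // every))

def mse_loss(pred, target):
    """Mean squared error loss."""
    return np.mean((pred - target) ** 2)

# Toy usage: minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
params = {"w": np.array(0.0)}
opt = SGD(lr=0.1, momentum=0.9)
for _ in range(100):
    grads = {"w": 2.0 * (params["w"] - 3.0)}
    params = opt.step(params, grads)
print(float(params["w"]))  # converges toward 3.0
```

With momentum set to 0 this reduces to plain gradient descent, which is why the two variants are naturally implemented as one optimizer.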

Meet Your Instructor


Hi, my name is Daniel, and I'm a machine learning researcher and engineer. I received my degree in applied mathematics from UC Berkeley, where my coursework focused heavily on machine learning and natural language processing. I've since applied machine learning in industry across a variety of verticals, from marketing to healthcare.