In this course, you will learn the fundamental principles of deep learning by building your very own deep learning library, similar to PyTorch, completely from scratch. This course includes all three courses in the Flamethrower Core series, for 33% off the price of buying all three separately.

You'll start by learning how automatic differentiation works, and you'll build your own automatic differentiation engine based on the popular autograd library. From there, you'll be introduced to different neural network layer types, regularization strategies, loss functions, and more. As each of these concepts is introduced, you'll implement it in your library and test it out on real-world data. We'll integrate material from statistics, numerical analysis, and Bayesian methods to give you the most well-rounded perspective on deep learning available in any course.

By the time the course is finished, you'll have a solid grounding in both the theoretical and applied aspects of the field, and you'll have your own deep learning library to apply to problems of your own creation. View the entire course syllabus below, along with preview lessons. Be sure to click the drop-down arrow to see the syllabus in its entirety.
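To give a flavor of what the automatic differentiation module covers, here is a minimal sketch of a reverse-mode autodiff engine for scalars. The `Value` class and its method names are illustrative assumptions for this example, not the course's actual API.

```python
# A minimal sketch of reverse-mode automatic differentiation on scalars.
# Class and method names here are hypothetical, for illustration only.

class Value:
    """A scalar that records the operations that produced it, so gradients
    can be propagated backwards through the resulting computation graph."""

    def __init__(self, data, parents=(), local_grads=()):
        self.data = data                  # the scalar value
        self.grad = 0.0                   # d(output)/d(this value), filled in by backward()
        self._parents = parents           # Values this one was computed from
        self._local_grads = local_grads   # partial derivatives w.r.t. each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        # d(a + b)/da = 1 and d(a + b)/db = 1
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        # d(a * b)/da = b and d(a * b)/db = a
        return Value(self.data * other.data, (self, other), (other.data, self.data))

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        order, seen = [], set()

        def visit(v):
            if v not in seen:
                seen.add(v)
                for parent in v._parents:
                    visit(parent)
                order.append(v)

        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for parent, local in zip(v._parents, v._local_grads):
                parent.grad += local * v.grad


# Example: f(x, y) = x * y + x, so df/dx = y + 1 and df/dy = x.
x, y = Value(2.0), Value(3.0)
f = x * y + x
f.backward()
print(f.data, x.grad, y.grad)  # 8.0 4.0 2.0
```

This is the core idea behind the engine you'll build in the course: each operation records its inputs and local derivatives, and `backward()` walks the graph once to accumulate gradients for every input.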

Course Curriculum

  Course Introduction
  Why Do We Care So Much About Neural Networks?
  Automatic Differentiation
  Neural Network Module
  The Optimization Module
  Pulling it all Together
  Projects
  Resources, References, and Course Credits
