$480

91% off

Courses

4

Lessons

111

Enrolled

2,030

Deep Learning Prerequisites: Linear Regression in Python

$120 Value

Deep Learning Prerequisites: Logistic Regression in Python

$120 Value

Data Science: Deep Learning in Python

$120 Value

Data Science: Practical Deep Learning in Theano & TensorFlow

$120 Value

Access

Lifetime

Content

2 hours

Lessons

20

By Lazy Programmer | in Online Courses

Deep Learning is a set of powerful algorithms that are the force behind self-driving cars, image search, voice recognition, and many more applications we consider decidedly "futuristic." One of the central foundations of deep learning is linear regression: using probability theory to gain deeper insight into the "line of best fit." This is the first step toward building machines that, in effect, act like neurons in a neural network, *learning* as they're fed more information. In this course, you'll start with the basics of building a linear regression model in Python and progress into practical machine learning issues that provide the foundations for an exploration of Deep Learning.

*Like what you're learning? Try out **The Advanced Guide to Deep Learning and Artificial Intelligence** next.*

- Access 20 lectures & 2 hours of content 24/7
- Use a 1-D linear regression to prove Moore's Law
- Learn how to create a machine learning model that can learn from multiple inputs
- Apply multi-dimensional linear regression to predict a patient's systolic blood pressure given their age & weight
- Discuss generalization, overfitting, train-test splits, & other issues that may arise while performing data analysis
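To give a flavor of what "proving Moore's Law with a 1-D linear regression" looks like, here is a minimal sketch in NumPy. The data is hypothetical (synthetic transistor counts, not the course's dataset), and the closed-form slope/intercept formulas are the standard least-squares solution the course derives:

```python
import numpy as np

# Hypothetical data: transistor counts doubling every 2 years (Moore's Law).
years = np.array([1971, 1975, 1979, 1983, 1987, 1991], dtype=float)
counts = 2300.0 * 2 ** ((years - 1971) / 2)  # synthetic exponential growth

# Taking the log turns exponential growth into a straight line,
# so ordinary 1-D linear regression applies: log(count) = a*year + b.
x, y = years, np.log(counts)
denom = x.dot(x) - x.mean() * x.sum()
a = (x.dot(y) - y.mean() * x.sum()) / denom
b = (y.mean() * x.dot(x) - x.mean() * x.dot(y)) / denom

# R-squared measures how much of the variance the line explains.
residuals = y - (a * x + b)
r2 = 1 - residuals.dot(residuals) / (y - y.mean()).dot(y - y.mean())

# The fitted slope implies a doubling time of log(2)/a years.
doubling_time = np.log(2) / a
```

Because the synthetic data is exactly exponential, the fit recovers the 2-year doubling time with an R-squared of essentially 1; real-world data, of course, is noisier.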

The Lazy Programmer is a data scientist, big data engineer, and full stack software engineer. For his master's thesis he worked on brain-computer interfaces using machine learning. These assist non-verbal and non-mobile persons to communicate with their family and caregivers.

He has worked in online advertising and digital media as both a data scientist and big data engineer, building various high-throughput web services around that data. He has created big data pipelines using Hadoop/Pig/MapReduce and built machine learning models to predict click-through rate, news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, validating the results with A/B testing.

He has taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefited from his web programming expertise. He does all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies he has used are Python, Ruby/Rails, PHP, Bootstrap, jQuery (JavaScript), Backbone, and Angular. For storage/databases he has used MySQL, Postgres, Redis, MongoDB, and more.

Details & Requirements

- Length of time users can access this course: lifetime
- Access options: web streaming, mobile streaming
- Certification of completion not included
- Redemption deadline: redeem your code within 30 days of purchase
- Experience level required: all levels, but you must have some knowledge of calculus, linear algebra, probability, Python, and Numpy
- All code for this course is available for download *here*, in the directory linear_regression_class

Compatibility

- Internet required

- Introduction and Outline
- Introduction and Outline (3:36)
- What is machine learning? How does linear regression play a role? (5:13)
- Introduction to Moore's Law Problem (2:30)

- 1-D Linear Regression: Theory and Code
- Define the model in 1-D, derive the solution (14:52)
- Coding the 1-D solution in Python (7:38)
- Determine how good the model is - r-squared (5:51)
- R-squared in code (2:15)
- Demonstrating Moore's Law in Code (8:00)
- R-Squared Quiz

- Multiple linear regression and polynomial regression
- Define the multi-dimensional problem and derive the solution (17:07)
- How to solve multiple linear regression using only matrices (1:55)
- Coding the multi-dimensional solution in Python (7:29)
- Polynomial regression - extending linear regression (with Python code) (7:56)
- Predicting Systolic Blood Pressure from Age and Weight (5:45)
- R-Squared Quiz 2

- Practical machine learning issues
- Generalization error, train and test sets (2:49)
- Generalization and Overfitting Demonstration in Code (7:32)
- Categorical inputs (5:21)
- Brief overview of advanced linear regression and machine learning topics (5:15)
- Exercises, practice, and how to get good at this (3:54)
- One-hot encoding

- Appendix
- How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:22)

Access

Lifetime

Content

3 hours

Lessons

31

By Lazy Programmer | in Online Courses

Logistic regression is one of the most fundamental techniques in machine learning, data science, and statistics: it can be used to create a classification or labeling algorithm that closely resembles a biological neuron. Logistic regression units, by extension, are the basic building blocks of neural networks, the central architecture of deep learning. In this course, you'll master logistic regression through practical, real-world examples that showcase the vast applications of Deep Learning.

*Like what you're learning? Try out **The Advanced Guide to Deep Learning and Artificial Intelligence** next.*

- Access 31 lectures & 3 hours of content 24/7
- Code your own logistic regression module in Python
- Complete a course project that predicts user actions on a website given user data
- Use Deep Learning for facial expression recognition
- Understand how to make data-driven decisions
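As a taste of what "coding your own logistic regression module" involves, here is a minimal NumPy sketch: a logistic neuron trained by gradient descent on the cross-entropy error. The two-cloud dataset is hypothetical (not the course's e-commerce data), but the update rule is the standard one the course derives:

```python
import numpy as np

np.random.seed(0)

# Hypothetical synthetic data: two Gaussian clouds in 2-D, labeled 0 and 1.
N = 200
X = np.vstack([np.random.randn(N, 2) - 2, np.random.randn(N, 2) + 2])
T = np.concatenate([np.zeros(N), np.ones(N)])

def sigmoid(z):
    # The logistic "neuron": squashes any real number into (0, 1).
    return 1 / (1 + np.exp(-z))

# Gradient descent on cross-entropy; its gradient w.r.t. the weights
# works out to X^T (Y - T), which is what makes the update so simple.
w = np.random.randn(2) * 0.01
b = 0.0
lr = 0.1
for _ in range(1000):
    Y = sigmoid(X.dot(w) + b)
    w -= lr * X.T.dot(Y - T) / len(T)
    b -= lr * (Y - T).mean()

# Classify by thresholding the predicted probability at 0.5.
Y = sigmoid(X.dot(w) + b)
accuracy = np.mean((Y > 0.5) == T)
```

Since the clouds are well separated, the trained classifier reaches near-perfect accuracy on this toy data.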

The Lazy Programmer is a data scientist, big data engineer, and full stack software engineer. For his master's thesis he worked on brain-computer interfaces using machine learning. These assist non-verbal and non-mobile persons to communicate with their family and caregivers.

He has worked in online advertising and digital media as both a data scientist and big data engineer, building various high-throughput web services around that data. He has created big data pipelines using Hadoop/Pig/MapReduce and built machine learning models to predict click-through rate, news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, validating the results with A/B testing.

He has taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefited from his web programming expertise. He does all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies he has used are Python, Ruby/Rails, PHP, Bootstrap, jQuery (JavaScript), Backbone, and Angular. For storage/databases he has used MySQL, Postgres, Redis, MongoDB, and more.

Details & Requirements

- Length of time users can access this course: lifetime
- Access options: web streaming, mobile streaming
- Certification of completion not included
- Redemption deadline: redeem your code within 30 days of purchase
- Experience level required: all levels, but you must have some knowledge of calculus, linear algebra, probability, Python, and Numpy
- All code for this course is available for download *here*, in the directory logistic_regression_class

Compatibility

- Internet required

- Introduction and Outline
- Introduction and Outline (4:02)
- Review of the classification problem (2:53)
- Introduction to the E-Commerce Course Project (8:53)
- What can classification be used for?

- Basics: What is linear classification? What's the relation to neural networks?
- Linear Classification (4:59)
- Biological inspiration - the neuron (3:36)
- How do we calculate the output of a neuron / logistic classifier? - Theory (4:18)
- How do we calculate the output of a neuron / logistic classifier? - Code (4:30)
- E-Commerce Course Project: Pre-Processing the Data (5:24)
- E-Commerce Course Project: Making Predictions (3:01)
- Feedforward

- Solving for the optimal weights
- A closed-form solution to the Bayes classifier (5:59)
- What do all these symbols mean? X, Y, N, D, L, J, P(Y=1|X), etc. (3:38)
- The cross-entropy error function - Theory (2:46)
- The cross-entropy error function - Code (4:53)
- Visualizing the linear discriminant / Bayes classifier / Gaussian clouds (2:28)
- Can we use squared error instead of cross-entropy for the error if we're doing classification?
- Maximizing the likelihood (6:34)
- Updating the weights using gradient descent - Theory (6:20)
- Updating the weights using gradient descent - Code (3:09)
- E-Commerce Course Project: Training the Logistic Model (6:47)
- Softmax

- Practical concerns
- L2 Regularization - Theory (8:38)
- Regularization - Code (1:43)
- The donut problem (10:01)
- The XOR Problem (6:12)
- Neural Networks

- Checkpoint and applications: How to make sure you know your stuff
- Sentiment Analysis (5:13)
- Exercises + how to get good at this (2:48)

- Project: Facial Expression Recognition
- Facial Expression Recognition Problem Description (12:21)
- The class imbalance problem (6:01)
- Utilities walkthrough (5:45)
- Facial Expression Recognition in Code (10:41)

- Appendix
- How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:22)
- Gradient Descent Tutorial (4:30)

Access

Lifetime

Content

4 hours

Lessons

37

By Lazy Programmer | in Online Courses

Artificial neural networks are the architecture behind Apple's Siri recognizing your voice, Tesla's self-driving cars knowing where to turn, Google Translate learning new languages, and many more technological features you may have taken for granted. The data science that unites them all is Deep Learning. In this course, you'll build your very first neural network, going beyond basic models to build networks that automatically learn features.

*Like what you're learning? Try out **The Advanced Guide to Deep Learning and Artificial Intelligence** next.*

- Access 37 lectures & 4 hours of content 24/7
- Extend the binary classification model to multiple classes using the softmax function
- Code the important training method, backpropagation, in Numpy
- Implement a neural network using Google's TensorFlow library
- Predict user actions on a website given user data using a neural network
- Use Deep Learning for facial expression recognition
- Learn some of the newest developments in neural networks
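To illustrate the softmax extension and the feedforward pass the course builds, here is a minimal NumPy sketch of a one-hidden-layer network. The layer sizes and random weights are hypothetical, chosen only to show the shapes involved:

```python
import numpy as np

def softmax(a):
    # Subtracting the row-wise max is a standard numerical-stability trick:
    # it leaves the result unchanged but prevents overflow in exp.
    e = np.exp(a - a.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward(X, W1, b1, W2, b2):
    # One hidden layer with a tanh nonlinearity, softmax output —
    # the basic feedforward architecture built in the course.
    Z = np.tanh(X.dot(W1) + b1)
    return softmax(Z.dot(W2) + b2)

# Hypothetical shapes: 4 inputs, 5 hidden units, 3 output classes.
np.random.seed(1)
X = np.random.randn(10, 4)
W1, b1 = np.random.randn(4, 5), np.zeros(5)
W2, b2 = np.random.randn(5, 3), np.zeros(3)
P = forward(X, W1, b1, W2, b2)

# Each row of P is a probability distribution over the 3 classes.
```

Training these weights is exactly what backpropagation (covered in the course) is for; the forward pass above is the piece it differentiates through.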

The Lazy Programmer is a data scientist, big data engineer, and full stack software engineer. For his master's thesis he worked on brain-computer interfaces using machine learning. These assist non-verbal and non-mobile persons to communicate with their family and caregivers.

He has worked in online advertising and digital media as both a data scientist and big data engineer, building various high-throughput web services around that data. He has created big data pipelines using Hadoop/Pig/MapReduce and built machine learning models to predict click-through rate, news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, validating the results with A/B testing.

He has taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefited from his web programming expertise. He does all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies he has used are Python, Ruby/Rails, PHP, Bootstrap, jQuery (JavaScript), Backbone, and Angular. For storage/databases he has used MySQL, Postgres, Redis, MongoDB, and more.

Details & Requirements

- Length of time users can access this course: lifetime
- Access options: web streaming, mobile streaming
- Certification of completion not included
- Redemption deadline: redeem your code within 30 days of purchase
- Experience level required: intermediate, but you must have some knowledge of calculus, linear algebra, probability, Python, and Numpy
- All code for this course is available for download *here*, in the directory ann_class

Compatibility

- Internet required

- What is a neural network?
- Introduction and Outline (3:45)
- Neural Networks with No Math (4:20)
- Where does this course fit into your deep learning studies? (4:57)
- Introduction to the E-Commerce Course Project (8:53)

- Classifying more than 2 things at a time
- From Logistic Regression to Neural Networks (5:12)
- Softmax (2:54)
- Sigmoid vs. Softmax (1:30)
- Where to get the code for this course (1:30)
- Softmax in Code (3:39)
- Building an entire feedforward neural network in Python (6:23)
- E-Commerce Course Project: Pre-Processing the Data (5:24)
- E-Commerce Course Project: Making Predictions (3:55)
- Absence of non-linearities

- Training a neural network
- Backpropagation Intro (11:50)
- Backpropagation - what does the weight update depend on? (4:47)
- Backpropagation - recursiveness (4:38)
- Backpropagation in Code (17:07)
- The WRONG Way to Learn Backpropagation (3:52)
- E-Commerce Course Project: Training Logistic Regression with Softmax (8:11)
- E-Commerce Course Project: Training a Neural Network (6:19)
- Backpropagation for binary output

- Practical Machine Learning
- Donut and XOR Review (1:06)
- Donut and XOR Revisited (4:21)
- Common nonlinearities and their derivatives (1:26)
- Hyperparameters and Cross-Validation (4:11)
- Manually Choosing Learning Rate and Regularization Penalty (4:08)

- TensorFlow, exercises, practice, and what to learn next
- TensorFlow plug-and-play example (7:31)
- Visualizing what a neural network has learned using TensorFlow Playground (11:35)
- Where to go from here (3:41)
- You know more than you think you know (4:52)
- How to get good at deep learning + exercises (5:07)

- Project: Facial Expression Recognition
- Facial Expression Recognition Problem Description (12:21)
- The class imbalance problem
- Utilities walkthrough (5:45)
- Facial Expression Recognition in Code (Binary / Sigmoid) (12:13)
- Facial Expression Recognition in Code (Logistic Regression Softmax) (8:57)
- Facial Expression Recognition in Code (ANN Softmax) (10:44)

- Appendix
- How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:22)
- Gradient Descent Tutorial (4:30)

Access

Lifetime

Content

3 hours

Lessons

23

By Lazy Programmer | in Online Courses

The applications of Deep Learning are many and constantly growing, just like the neural networks that power it. In this course, you'll delve into advanced concepts of Deep Learning, starting with the basics of TensorFlow and Theano and learning how to build neural networks with these popular tools. Along the way, you'll learn how to visualize exactly what is happening within a model as it learns.

*Like what you're learning? Try out **The Advanced Guide to Deep Learning and Artificial Intelligence** next.*

- Access 23 lectures & 3 hours of content 24/7
- Discover batch & stochastic gradient descent, two techniques that allow you to train on a small sample of data at each iteration, greatly speeding up training time
- Discuss how momentum can carry you through local minima
- Learn adaptive learning rate techniques like AdaGrad & RMSprop
- Explore dropout regularization & other modern neural network techniques
- Understand the variables & expressions of TensorFlow & Theano
- Set up a GPU-instance on AWS & compare the speed of CPU vs GPU for training a deep neural network
- Look at the MNIST dataset & compare against known benchmarks
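To show the kind of update rule the momentum lectures cover, here is a minimal sketch on a toy quadratic loss. The loss, matrix, and hyperparameters are hypothetical (not the course's code), chosen so the effect is easy to verify:

```python
import numpy as np

# Toy quadratic loss L(w) = 0.5 * w^T A w, whose gradient is A w.
A = np.diag([10.0, 1.0])   # ill-conditioned: plain GD zig-zags here
w = np.array([1.0, 1.0])
v = np.zeros(2)
lr, mu = 0.05, 0.9         # learning rate and momentum coefficient

for _ in range(200):
    grad = A.dot(w)
    v = mu * v - lr * grad  # velocity accumulates a decaying sum of gradients
    w = w + v               # momentum step: move along the velocity

final_loss = 0.5 * w.dot(A).dot(w)
```

The accumulated velocity damps the oscillation along the steep direction while speeding up progress along the shallow one, so the loss drops to near zero; adaptive methods like AdaGrad and RMSprop (also covered) instead rescale the step size per parameter.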

He has worked in online advertising and digital media as both a data scientist and big data engineer, building various high-throughput web services around that data. He has created big data pipelines using Hadoop/Pig/MapReduce and built machine learning models to predict click-through rate, news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering, validating the results with A/B testing.

He has taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefited from his web programming expertise. He does all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies he has used are Python, Ruby/Rails, PHP, Bootstrap, jQuery (JavaScript), Backbone, and Angular. For storage/databases he has used MySQL, Postgres, Redis, MongoDB, and more.

Details & Requirements

- Length of time users can access this course: lifetime
- Access options: web streaming, mobile streaming
- Certification of completion not included
- Redemption deadline: redeem your code within 30 days of purchase
- Experience level required: all levels, but you must have some knowledge of calculus, linear algebra, probability, Python, and Numpy
- All code for this course is available for download *here*, in the directory ann_class2

Compatibility

- Internet required

- Outline, the MNIST dataset, and Linear (Logistic Regression) Benchmark
- Outline - what did you learn previously, and what will you learn in this course? (2:35)
- Where to get the MNIST dataset and Establishing a Linear Benchmark (4:31)

- Gradient Descent: Full vs Batch vs Stochastic
- What are full, batch, and stochastic gradient descent? (2:45)
- Full vs Batch vs Stochastic Gradient Descent in code (5:38)

- Momentum and adaptive learning rates
- Momentum (1:56)
- Code for training a neural network using momentum (6:41)
- Variable and adaptive learning rates (3:13)
- Constant learning rate vs. RMSProp in Code (4:05)
- Hyperparameter Optimization: Cross-validation, Grid Search, and Random Search (3:19)

- Theano
- Theano Basics: Variables, Functions, Expressions, Optimization (7:47)
- Building a neural network in Theano (9:17)

- TensorFlow
- TensorFlow Basics: Variables, Functions, Expressions, Optimization (7:27)
- Building a neural network in TensorFlow (9:43)

- Modern Regularization Techniques
- Dropout Regularization (11:38)

- GPU Speedup and Homework
- Setting up a GPU Instance on Amazon Web Services (7:06)
- Exercises and Concepts Still to be Covered (2:13)

- Project: Facial Expression Recognition
- Facial Expression Recognition Problem Description (12:21)
- The class imbalance problem (6:01)
- Utilities walkthrough (5:45)
- Class-Based ANN in Theano (19:09)
- Class-Based ANN in TensorFlow (15:28)

- Appendix
- Manually Choosing Learning Rate and Regularization Penalty (4:08)
- How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:22)

- Instant digital redemption