Become a Machine Learning Master.
Starting with the basics, ending at mastery. We've got you covered every step of the way.

Get a Certificate That Means Something.

The Best Learning Environment.

And finally, our Course Material.
-
Chapter 1: Introduction to Machine Learning
Get introduced to Machine Learning through the fundamentals of ML, Linear Regression, and more, all in Scikit-Learn. A short code sketch of these ideas follows the lesson list.
💻1: Introduction to Machine Learning ... We're so proud of this course.
💻2: Data Visualization ... Graph it up!
🧠3: Recall 1
💻4: Correlation Coefficient ... One of many useful statistical concepts.
🧠5: Recall 2
💻6: Inputs and Outputs ... Most models have inputs and predict outputs.
🧠7: Recall 3
💻8: Simple Linear Regression ... The simplest model.
🧠9: Recall 4
💻10: Visualizing SLR ... So dang useful.
🧠11: Recall 5
💻12: Model Parameters ... Very important concept.
🧠13: Recall 6
💻14: NumPy Vectorization ... Just a warmup. Feel free to skip!
💻15: Residuals ... This is how we define error in a regression model.
🧠16: Recall 7
💻17: Mean Squared Error ... The most common error function.
🧠18: Recall 8
✍️19: Final Assessment
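To give a feel for how these pieces look in code, here is a minimal sketch of the Chapter 1 workflow in NumPy and Scikit-Learn. The hours-studied / exam-score numbers and variable names are made up purely for illustration; the course's own examples may differ.

```python
# Minimal Chapter 1 sketch: inputs/outputs, correlation, Simple Linear
# Regression, residuals, and Mean Squared Error.
# The hours/scores data below is invented for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

hours = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])  # inputs (2-D for sklearn)
scores = np.array([52.0, 61.0, 70.0, 77.0, 88.0])      # outputs

# Correlation coefficient: how strongly the two variables move together.
r = np.corrcoef(hours.ravel(), scores)[0, 1]

# Simple Linear Regression: fit a line  score = slope * hours + intercept.
model = LinearRegression().fit(hours, scores)
predicted = model.predict(hours)

# Residuals are the gaps between actual and predicted values;
# Mean Squared Error is the average of their squares.
residuals = scores - predicted
mse = mean_squared_error(scores, predicted)

print(f"r = {r:.3f}, slope = {model.coef_[0]:.2f}, intercept = {model.intercept_:.2f}")
print(f"MSE = {mse:.2f}, residuals = {np.round(residuals, 2)}")
```

For Simple Linear Regression the model parameters are exactly the slope and the intercept printed above.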
-
Chapter 2: Machine Learning Fundamentals
Take your ML to the next level by incorporating feature engineering techniques such as Ordinal and One-Hot Encoding, and preprocessing methods such as Standardization. See the sketch after this chapter's lesson list for how these fit together in code.
💻1: The Design Matrix ... Our input array used to train the model.
🧠2: Recall 1
💻3: Multiple Linear Regression ... The most classic ML algorithm.
🧠4: Recall 2
💻5: Regression Metrics ... There's a bunch of them. MSE is the most important one.
🧠6: Recall 3
💻7: Categorical Variables ... We'll teach this properly, unlike many resources.
🧠8: Recall 4
💻9: Ordinal Categorical Variables ... Variables with an order to them.
💻10: Nominal Categorical Variables ... Variables without an order to them.
🧠11: Recall 5
💻12: Preprocessing Methods ... Functions that scale (transform) numerical columns.
🧠13: Recall 6
💻14: Bringing it all Together ... Let's make one heck of a model!
✍️15: Final Assessment
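Here is a minimal sketch of the Chapter 2 pipeline: a small design matrix with one numerical, one ordinal, and one nominal column, scaled and encoded before a Multiple Linear Regression fit. The column names, category values, and the Pipeline/ColumnTransformer layout are illustrative assumptions, not the course's exact code.

```python
# Minimal Chapter 2 sketch: design matrix, Ordinal/One-Hot encoding,
# Standardization, and Multiple Linear Regression in one pipeline.
# Column names and category values are invented for illustration only.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, OrdinalEncoder, StandardScaler

# Toy design matrix: one numerical, one ordinal, one nominal feature.
X = pd.DataFrame({
    "sqft": [800, 1200, 1500, 2000],
    "condition": ["poor", "fair", "good", "good"],        # ordered categories
    "neighborhood": ["north", "south", "north", "east"],  # no natural order
})
y = [150_000, 210_000, 250_000, 320_000]

preprocess = ColumnTransformer([
    ("scale", StandardScaler(), ["sqft"]),  # standardization of the numerical column
    ("ordinal", OrdinalEncoder(categories=[["poor", "fair", "good"]]), ["condition"]),
    ("onehot", OneHotEncoder(handle_unknown="ignore"), ["neighborhood"]),
])

model = Pipeline([("preprocess", preprocess), ("regression", LinearRegression())])
model.fit(X, y)
print(model.predict(X.head(2)))  # predictions for the first two rows
```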
-
Chapter 3: Non-Linear Machine Learning
Unleash your inner ML God by mastering advanced Statistical Learning concepts such as Underfitting vs Overfitting, and the infamous Bias-Variance Tradeoff. A short code sketch after the lesson list shows these ideas in action.
💻1: Non-Linear Models ... Straight is boring.
💻2: Linear vs Non-Linear Models ... Which is better? It depends.
🧠3: Recall 1
💻4: Training Set vs Test Set | Part 1 ... Make sure you understand this!
💻5: Training Set vs Test Set | Part 2 ... Make sure you understand this!
🧠6: Recall 2
💻7: Overfitting ... You've made a model that's TOO good.
💻8: Underfitting ... You've made a model that's not good enough.
🧠9: Recall 3
💻10: Picking the Better Model ... We care about the Test Set!
💻11: Finding the Best Model ... Hyperparameter Tuning.
🧠12: Recall 4
💻13: The Bias-Variance Tradeoff ... The Holy Grail of Machine Learning.
🧠14: Recall 5
✍️15: Final Assessment
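Here is a minimal sketch of the Chapter 3 loop: split the data into a training set and a test set, then compare an underfitting linear model, an overfitting non-linear one, and a tuned one on test error. The synthetic sine-curve data and the choice of a decision tree as the non-linear model are assumptions made for illustration.

```python
# Minimal Chapter 3 sketch: train/test split, underfitting vs overfitting,
# and picking the model with the lowest *test* error.
# The synthetic data and the decision-tree models are illustrative choices.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(120, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=120)  # non-linear signal + noise

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

candidates = [
    ("straight line (underfits here)", LinearRegression()),
    ("unrestricted tree (overfits)", DecisionTreeRegressor(random_state=0)),
    ("tuned tree (max_depth=3)", DecisionTreeRegressor(max_depth=3, random_state=0)),
]
for name, model in candidates:
    model.fit(X_train, y_train)
    train_mse = mean_squared_error(y_train, model.predict(X_train))
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    # Overfitting shows up as a tiny train error paired with a much larger test error.
    print(f"{name}: train MSE = {train_mse:.3f}, test MSE = {test_mse:.3f}")
```

Trying several values of max_depth and keeping the one with the best held-out error is the simplest form of hyperparameter tuning.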
-
Chapter 4: Classification Models
Unlock ultimate ML power by transferring all your knowledge over to the other side of machine learning: Classification. As before, a short code sketch follows the lesson list.
💻1: Regression vs Classification ... The difference lies in the output variable.
🧠2: Recall 1
💻3: Logistic Regression ... The most classic classification algorithm.
💻4: Visualizing Logistic Regression ... We'll actually graph the probabilities.
🧠5: Recall 2
💻6: Decision Boundaries ... They're so cool.
🧠7: Recall 3
💻8: Confusion Matrix ... Don't let it confuse you too!
🧠9: Recall 4
💻10: Precision & Recall ... Two common binary classification metrics.
💻11: F1-Score ... An all-in-one metric that combines precision and recall.
🧠12: Recall 5
💻13: Binary Crossentropy Function ... Classification algorithms optimize this.
💻14: Picking the Best Model ... Similar to regression, just with different metrics.
✍️15: Final Assessment
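Here is a minimal sketch of the Chapter 4 toolkit: Logistic Regression for binary classification, followed by the confusion matrix, precision, recall, and F1-score. The built-in breast-cancer dataset is just a convenient stand-in for whatever data the course actually uses.

```python
# Minimal Chapter 4 sketch: Logistic Regression plus the confusion matrix,
# precision, recall, and F1-score on a held-out test set.
# The breast-cancer dataset is a stand-in chosen purely for illustration.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, f1_score, precision_score, recall_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Logistic Regression predicts a probability, then thresholds it at 0.5
# to assign a class label; that threshold is where the decision boundary lives.
clf = LogisticRegression(max_iter=5000)
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)

print(confusion_matrix(y_test, y_pred))           # rows: actual, columns: predicted
print("precision:", precision_score(y_test, y_pred))
print("recall:   ", recall_score(y_test, y_pred))
print("F1-score: ", f1_score(y_test, y_pred))
```

Under the hood, scikit-learn fits LogisticRegression by minimizing the log loss, which is the Binary Crossentropy function from lesson 13.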
-
Chapter 5: Bonus!
Become a Machine Learning Master by learning Multi-Class Classification and Unsupervised methods such as Clustering and Dimensionality Reduction. A final code sketch after the lesson list ties these together.
💻1: Introduction to Multi-Class Classification ... The binary concepts extend to more than 2 classes.
💻2: Multi-Class Classification Metrics ... Generalized metrics.
🧠3: Recall 1
💻4: Hold-Out Set ... We actually need a third dataset split.
💻5: Train Set / Validation Set / Hold-Out Set ... We'll use that third dataset split.
🧠6: Recall 2
💻7: Cross Validation ... The ultimate training and evaluation procedure. But it has a tradeoff...
🧠8: Recall 3
💻9: Learning the Models ... A quick summary of popular ones.
💻10: Regularization ... A technique we use to help prevent overfitting.
🧠11: Recall 4
💻12: A Taste of Forecasting / Time Series ... Sorry, we can't do any more.
🧠13: Recall 5
💻14: Unsupervised Learning: Clustering ... A taste of unsupervised learning.
🧠15: Recall 6
💻16: Dimensionality Reduction (PCA) ... More unsupervised learning.
🧠17: Recall 7
✍️18: Final Assessment
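To close, here is a minimal sketch touching the Chapter 5 topics: cross validation on a multi-class problem with a regularized classifier, then a taste of unsupervised learning with clustering and PCA. The iris dataset and the specific estimators are illustrative assumptions, not the course's exact material.

```python
# Minimal Chapter 5 sketch: cross validation on a multi-class problem,
# regularization via C, then unsupervised clustering and PCA.
# The iris dataset and estimator choices are illustrative only.
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)  # 3 classes, so this is multi-class

# Cross validation: average the score over 5 train/validation splits.
# C controls regularization strength (smaller C = stronger regularization).
clf = LogisticRegression(C=1.0, max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print("mean CV accuracy:", scores.mean().round(3))

# Clustering: group the rows without ever looking at the labels.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Dimensionality reduction: compress 4 features down to 2 principal components.
X_2d = PCA(n_components=2).fit_transform(X)
print(X_2d.shape, clusters[:10])
```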