---

This course is beginner friendly, with some tough challenges for intermediate machine learning engineers. Don't worry if you don't know Python! Just give it a try 😄👍.

---

These are the machine learning notes I created on my journey of learning AI and machine learning, with the help of some top instructors from DeepLearning.AI. They will surely help beginners and intermediate learners understand machine learning. Feel free to clone and fork 🍴.
- **B**: Beginner
- **I**: Intermediate
- **A**: Advanced
- **O**: Optional

- **BN**: Easy Notebook
- **IN**: Intermediate Notebook
- **T**: Notebook Assignment to test what you have learned!
- **B** Supervised Machine Learning Part 1
- **B** Supervised Machine Learning Part 2
- **B** Unsupervised Machine Learning Part 1
- **B** Unsupervised Machine Learning Part 2
- Jupyter Notebook is a type of IDE commonly used in machine learning; it gives developers many options for creating and running code. Check the file below to get started!

Note: You should first clone the project and open it in your IDE or in Jupyter Notebook to understand it in depth.

- **N** Jupyter Notebook (clone it and open it in your IDE)

---

- **B** Linear Regression Part 1
- **B** Linear Regression Part 2
- **BN** Linear Regression Model Representation (open it in your IDE, such as VS Code or Jupyter Notebook)
- **B** Cost Function Formula
- **B** Cost Function Intuition
- **B** Visualizing the Cost Function
- **B** Visualization Examples
- **BN** Cost Function Model Representation (open it in your IDE and run the whole code with Shift+Enter)
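The cost-function topics above can be sketched in a few lines of NumPy. This is my own minimal illustration (not one of the course notebooks): the squared-error cost J(w, b) for the univariate linear model f(x) = w·x + b.

```python
import numpy as np

def compute_cost(x, y, w, b):
    """Squared-error cost J(w, b) for the linear model f(x) = w*x + b."""
    m = len(x)
    predictions = w * x + b
    return np.sum((predictions - y) ** 2) / (2 * m)

# Toy data that lies exactly on the line y = 2x + 1,
# so the cost at (w=2, b=1) should be exactly 0.
x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 5.0, 7.0])
print(compute_cost(x, y, 2.0, 1.0))  # 0.0
print(compute_cost(x, y, 0.0, 0.0))  # much larger cost for a bad fit
```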

---

- **B** Gradient Descent
- **B** Implement Gradient Descent
- **B** Gradient Descent Intuition
- **B** Learning Rate
- **B** Gradient Descent with Linear Regression
- **B** Running Gradient Descent
- **BN** Gradient Descent Representation (open it in your IDE and run the whole code with Shift+Enter)
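As a rough sketch of the gradient descent videos above (my own example, assuming the univariate squared-error cost, not the course notebook): repeatedly step w and b opposite the gradient until the line fits the data.

```python
import numpy as np

def gradient_descent(x, y, alpha=0.01, iterations=1000):
    """Batch gradient descent for f(x) = w*x + b with squared-error cost."""
    m = len(x)
    w, b = 0.0, 0.0
    for _ in range(iterations):
        predictions = w * x + b
        dw = np.sum((predictions - y) * x) / m  # dJ/dw
        db = np.sum(predictions - y) / m        # dJ/db
        w -= alpha * dw                         # step against the gradient
        b -= alpha * db
    return w, b

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0                  # true line: w = 2, b = 1
w, b = gradient_descent(x, y, alpha=0.05, iterations=5000)
print(w, b)  # approximately 2.0 and 1.0
```

Note how the learning rate `alpha` matters: too small and convergence is slow, too large and the updates diverge, which is exactly what the "Learning Rate" lecture covers.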

---

- **B** Multiple Features
- **B** Vectorization Part 1
- **B** Vectorization Part 2
- **BN** Python, NumPy and Vectorization (open it in your IDE and run the whole code with Shift+Enter)
- **B** Gradient Descent for Multiple Linear Regression
- **BN** Multiple Variable Linear Regression (open it in your IDE and run the whole code with Shift+Enter)
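The point of the vectorization lectures can be shown with a quick timing comparison (a minimal sketch of my own, not the course notebook): `np.dot` computes the same dot product as an explicit Python loop, but far faster.

```python
import time
import numpy as np

n = 1_000_000
rng = np.random.default_rng(0)
a = rng.random(n)
b = rng.random(n)

def loop_dot(a, b):
    """Dot product with an explicit Python loop, for comparison."""
    total = 0.0
    for i in range(len(a)):
        total += a[i] * b[i]
    return total

start = time.time()
vectorized = np.dot(a, b)       # single vectorized call
t_vec = time.time() - start

start = time.time()
looped = loop_dot(a, b)         # element-by-element Python loop
t_loop = time.time() - start

print(f"vectorized: {t_vec:.4f}s, loop: {t_loop:.4f}s")
# Both give the same result (up to floating-point rounding),
# but the vectorized version is typically orders of magnitude faster.
```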

---

- **B** Feature Scaling Part 1
- **B** Feature Scaling Part 2
- **B** Checking Gradient Descent for Convergence
- **B** Choosing the Learning Rate
- **BN** Feature Scaling and Learning Rate (open it in your IDE and run the whole code with Shift+Enter)
- **B** Feature Engineering
- **B** Polynomial Regression
- **BN** Feature Engineering and Polynomial Regression (open it in your IDE and run the whole code with Shift+Enter)
- **BN** Linear Regression with scikit-learn (open it in your IDE and run the whole code with Shift+Enter)
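The feature scaling lectures above use z-score normalization; here is a minimal sketch of the idea (my own toy example, not the course notebook): rescale each feature column to mean 0 and standard deviation 1 so gradient descent converges faster.

```python
import numpy as np

def zscore_normalize(X):
    """Z-score normalization: subtract each column's mean, divide by its std."""
    mu = np.mean(X, axis=0)
    sigma = np.std(X, axis=0)
    return (X - mu) / sigma, mu, sigma

# Features on very different scales, e.g. house size (sqft) vs. bedrooms.
X = np.array([[2104.0, 5.0],
              [1416.0, 3.0],
              [852.0,  2.0]])
X_norm, mu, sigma = zscore_normalize(X)
print(X_norm.mean(axis=0))  # ~0 for each column
print(X_norm.std(axis=0))   # ~1 for each column
```

Remember to save `mu` and `sigma`: new examples must be normalized with the training set's statistics before prediction.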

---

- **T** Linear Regression Test Notebook/Lab (clone it, open it in your IDE, follow all the instructions, and write the solution code). Don't feel any pressure; hints and solutions are given in the notebook as well.

---

- **B** Classification
- **BN** Classification (open it in your IDE and run the whole code with Shift+Enter)
- **B** Logistic Regression
- **BN** Sigmoid Function and Logistic Regression (open it in your IDE and run the whole code with Shift+Enter)
- **B** Decision Boundary
- **BN** Decision Boundary (open it in your IDE and run the whole code with Shift+Enter)
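The sigmoid and decision boundary items above can be sketched as follows (my own minimal example, not the course notebook): the sigmoid squashes w·x + b into a probability, and the model predicts class 1 whenever that probability is at least 0.5, i.e. whenever w·x + b ≥ 0.

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function, mapping any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(x, w, b):
    """Logistic-regression estimate of P(y = 1 | x)."""
    return sigmoid(np.dot(x, w) + b)

print(sigmoid(0.0))                       # 0.5, the decision threshold
print(sigmoid(np.array([-10.0, 10.0])))   # close to 0 and close to 1

# The decision boundary is the line w.x + b = 0.
x = np.array([1.0, 2.0])
w = np.array([1.0, -1.0])
p = predict_proba(x, w, 0.5)              # sigmoid(1 - 2 + 0.5) = sigmoid(-0.5)
print(p, "-> class", int(p >= 0.5))
```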

---

Don't feel any pressure; hints and solutions are given in the notebook as well.

---

Advanced Learning Algorithms (what you will learn in this part)

---

- **I** Neural Network Layer
- **A** More Complex Neural Networks
- **A** Inference: Making Predictions
- **IN** Neurons and Layers (open it in your IDE and run the whole code with Shift+Enter)

---

- **I** Inference in Code
- **I** Data in TensorFlow
- **I** Building a Neural Network
- **AN** Coffee Roasting in TensorFlow (open it in your IDE and run the whole code with Shift+Enter)

---

- **I** Forward Prop in a Single Layer
- **A** General Implementation of Forward Propagation
- **IN** Coffee Roasting NumPy (open it in your IDE and run the whole code with Shift+Enter)
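The general forward propagation idea from the items above can be sketched in plain NumPy (my own minimal example with random weights, not the Coffee Roasting notebook): a dense layer is just a matrix product followed by an activation, and the network chains layers together.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dense(a_in, W, b):
    """One fully connected layer: each column of W holds one unit's weights."""
    return sigmoid(a_in @ W + b)

def forward(x, W1, b1, W2, b2):
    """Forward propagation through a two-layer network."""
    a1 = dense(x, W1, b1)      # hidden layer activations
    return dense(a1, W2, b2)   # output layer

rng = np.random.default_rng(0)
x = rng.random((1, 2))                        # one example, two features
W1, b1 = rng.random((2, 3)), rng.random(3)    # layer 1: 3 units
W2, b2 = rng.random((3, 1)), rng.random(1)    # layer 2: 1 unit
out = forward(x, W1, b1, W2, b2)
print(out.shape)  # (1, 1): a single probability-like output
```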

---

Speculations on Artificial General Intelligence (AGI)

---

Don't feel any pressure; hints and solutions are given in the notebook as well.

Note: The other files that are still under development are commented out; you can see the code and file names by browsing the files!

---

- **B** Introduction to NumPy Arrays
- **B** Linear Systems as Matrices
- **B** Introduction to the numpy.linalg Sub-library
- **I** Gaussian Elimination
- **B** Vector Operations: Scalar Multiplication, Sum, and Dot Product of Vectors
- **B** Matrix Multiplication
- **B** Linear Transformations
- **I** Linear Transformations and Neural Networks
- **B** Interpreting Eigenvalues and Eigenvectors
- **I** Application of Eigenvalues and Eigenvectors: Webpage Navigation Model and PCA
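Two of the workhorses from the numpy.linalg items above, in a minimal sketch of my own (not one of the course labs): solving a linear system with `np.linalg.solve`, and checking the defining property A·v = λ·v of an eigenpair from `np.linalg.eig`.

```python
import numpy as np

# Solve the linear system A x = b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 4.0])
x = np.linalg.solve(A, b)
print(x)  # [1. 1.]  (2*1 + 1 = 3, 1 + 3*1 = 4)

# Eigenvalues and eigenvectors of the same matrix.
eigenvalues, eigenvectors = np.linalg.eig(A)
v = eigenvectors[:, 0]                      # first eigenvector (a column)
print(np.allclose(A @ v, eigenvalues[0] * v))  # True: A v = lambda v
```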

---

- **B** Differentiation in Python: Symbolic, Numerical and Automatic
- **B** Optimizing Functions of One Variable: Cost Minimization
- **B** Optimization Using Gradient Descent in One Variable
- **B** Optimization Using Gradient Descent in Two Variables
- **I** Optimization Using Gradient Descent: Linear Regression
- **B** Regression with Perceptron
- **B** Classification with Perceptron
- **B** Optimization Using Newton's Method
- **I** Neural Network with Two Layers
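Newton's method from the list above, sketched for one variable (my own toy function, not the course lab): instead of a fixed learning rate, each step divides the first derivative by the second, which converges very quickly near a minimum.

```python
def newtons_method(f_prime, f_double_prime, x0, iterations=20):
    """Newton's method for 1-D minimization: x <- x - f'(x) / f''(x)."""
    x = x0
    for _ in range(iterations):
        x = x - f_prime(x) / f_double_prime(x)
    return x

# Minimize f(x) = x**4 - 3*x**2 + 2, starting from x0 = 2.
f_prime = lambda x: 4 * x**3 - 6 * x        # f'(x)
f_double_prime = lambda x: 12 * x**2 - 6    # f''(x)
x_min = newtons_method(f_prime, f_double_prime, 2.0)
print(x_min)  # converges to sqrt(3/2) ~ 1.2247, a local minimum
```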

---

- **B** Four Birthday Problems
- **B** Monty Hall Problem
- **B** Exploratory Data Analysis: Intro to pandas
- **B** Exploratory Data Analysis: Exploring Your Data
- **B** Naive Bayes
- **B** Summary Statistics and Visualization of Data Sets
- **B** Exploratory Data Analysis: Data Visualization and Summary
- **B** Simulating Dice Rolls with NumPy
- **B** Loaded Dice
- **B** Sampling Data from Different Distributions and Studying the Distribution of the Sample Mean
- **B** Exploratory Data Analysis: Linear Regression
- **B** Exploratory Data Analysis: Confidence Intervals with Hypothesis Testing
- **B** A/B Testing
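The Monty Hall problem from the list above is easy to verify by simulation (a minimal sketch of my own, not the course notebook): switching doors wins about 2/3 of the time, staying only about 1/3.

```python
import random

def monty_hall(switch, trials=100_000, seed=42):
    """Simulate the Monty Hall game and return the contestant's win rate."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)      # door hiding the car
        pick = rng.randrange(3)     # contestant's first pick
        # Host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(monty_hall(switch=False))  # close to 1/3
print(monty_hall(switch=True))   # close to 2/3
```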
Most of the credit for these notes goes to the great teacher Andrew Ng and his educational platforms, DeepLearning.AI and Coursera.

Well done, champ!