Graduate school course in Machine Learning and Neural Networks - my weekly assignments.


Machine-Learning-and-Nerual-Networks

Semester-long graduate school class, taken at Johns Hopkins University - weekly assignment descriptions below.

Week 1 - Introduction to Machine Learning and Neural Networks (no corresponding files)

Received an overview of the structure of the course as well as an introduction to machine learning and neural networks.

Week 2 - Python + Pandas + Numpy Basics

Became comfortable with using Python, Jupyter Notebooks, Pandas, and Numpy.
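The notebooks themselves hold the real exercises; as a flavor of the basics covered this week, here is a minimal sketch using a toy dataset (the column names and values are illustrative, not from the assignments):

```python
import numpy as np
import pandas as pd

# Build a small DataFrame from a NumPy array (hypothetical toy data).
scores = np.array([[85.0, 90.0], [70.0, 60.0], [95.0, 88.0]])
df = pd.DataFrame(scores, columns=["midterm", "final"])

# Core Pandas operations: boolean filtering and column aggregation.
passed = df[df["final"] >= 70]   # rows with a passing final score
mean_final = df["final"].mean()  # NumPy-backed aggregation
```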

Week 3 - Calculus and Linear Algebra's Roles in Machine Learning

In this module I explored the role of calculus in machine learning, chiefly in optimization to find the best-fit model. I also looked at how linear algebra extends single-variable equations to multivariable ones, and how Linear Regression uses calculus to find the best-fit model through optimization in both one dimension and multiple dimensions.
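The calculus-meets-optimization idea can be sketched with gradient descent on simple linear regression: differentiate the mean squared error with respect to the slope and intercept, then step downhill. The data here is a made-up noiseless line, not course data:

```python
import numpy as np

# Fit y ≈ w * x + b by gradient descent. The update rules below come
# from differentiating the mean squared error with respect to w and b.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 3.0 * x + 2.0  # true slope 3, true intercept 2

w, b, lr = 0.0, 0.0, 0.01
for _ in range(2000):
    err = w * x + b - y             # residuals
    w -= lr * 2 * np.mean(err * x)  # d(MSE)/dw
    b -= lr * 2 * np.mean(err)      # d(MSE)/db
```

With multiple input features, the same derivation in matrix form (linear algebra) yields the multivariable version of these updates.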

Week 4 - Practical Linear Regression and Cost Functions

I focused on Linear Regression in a more rigorous and practical way. I explored different ways to apply Linear Regression to a real dataset, and looked at the cost function and how its choice can improve or hurt results.
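The effect of the cost-function choice can be seen on a contrived set of residuals with one outlier: squaring errors (MSE) lets the outlier dominate, while absolute errors (MAE) grow only linearly:

```python
import numpy as np

# Compare two cost functions on the same residuals.
residuals = np.array([1.0, -1.0, 0.5, 20.0])  # last value is an outlier

mse = np.mean(residuals ** 2)      # outlier contributes 400 of the sum
mae = np.mean(np.abs(residuals))   # outlier contributes only 20
```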

Week 5 - Validating Models

Introduced to train/validation/test splits in the development of robust models.
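A minimal sketch of such a split, done by shuffling indices (the 60/20/20 proportions are an assumption for illustration):

```python
import numpy as np

# Shuffle indices, then carve out 60% train / 20% validation / 20% test.
rng = np.random.default_rng(42)
n = 100
idx = rng.permutation(n)
train_idx = idx[:60]
val_idx = idx[60:80]
test_idx = idx[80:]
```

The model is fit on the training split, tuned against validation, and scored once on the held-out test split.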

Week 6 - From Regression to Classification: Logistic Regression

Introduced to the concept of classification, with emphasis on the logistic function.
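The key move from regression to classification is squashing a real-valued score into a probability with the logistic function, then thresholding it into a class label. A small sketch with arbitrary scores:

```python
import numpy as np

def logistic(z):
    # Maps any real number into (0, 1), read as a class probability.
    return 1.0 / (1.0 + np.exp(-z))

# A linear model's raw scores become probabilities; thresholding at 0.5
# turns the regression-style output into class labels.
z = np.array([-3.0, 0.0, 3.0])
p = logistic(z)
labels = (p >= 0.5).astype(int)
```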

Week 7 - Garbage In, Garbage Out (Data Munging): Extracting Relevant Features from the Data

Exposed to techniques for cleaning data so it can flow properly into an algorithm and generate the most effective model.
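Typical munging steps of this kind — dropping duplicates and imputing missing values — can be sketched in Pandas on a hypothetical messy table (the columns and values are invented for illustration):

```python
import numpy as np
import pandas as pd

# A hypothetical messy dataset: missing values and a duplicated row.
df = pd.DataFrame({
    "age": [25.0, np.nan, 40.0, 40.0],
    "income": [50000.0, 60000.0, np.nan, np.nan],
})

df = df.drop_duplicates()                              # drop the repeated row
df["age"] = df["age"].fillna(df["age"].mean())         # impute with the mean
df["income"] = df["income"].fillna(df["income"].median())  # or the median
```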

Week 9 - Unsupervised Learning (Parts 1 and 2)

Exposed to algorithms that do not require labeled data (supervision).
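One well-known unsupervised algorithm, k-means clustering, can be sketched in a few lines: alternate between assigning points to their nearest center and moving each center to the mean of its assigned points. The two obvious blobs below are toy data, not from the assignments:

```python
import numpy as np

# Minimal k-means on two well-separated blobs, so convergence is quick.
points = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.5],
                   [10.0, 10.0], [10.5, 10.0], [10.0, 10.5]])
centers = points[[0, 3]].copy()  # start from one point in each blob

for _ in range(10):
    # Assignment step: each point joins its nearest center.
    d = np.linalg.norm(points[:, None] - centers[None, :], axis=2)
    labels = d.argmin(axis=1)
    # Update step: each center moves to the mean of its cluster.
    centers = np.array([points[labels == k].mean(axis=0) for k in range(2)])
```

No labels were supplied — the grouping emerges from the geometry of the data alone.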

Week 10 - Natural Language Processing (NLP)

In this module I worked with text data and representations such as TF-IDF and Word2Vec. These techniques convert text into meaningful numerical data, allowing it to be modeled.
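TF-IDF is simple enough to compute by hand: a term's frequency within a document, scaled down by how common the term is across documents. A sketch on a tiny invented corpus (course work would more likely use a library such as scikit-learn, whose weighting differs slightly):

```python
import math

# Tiny toy corpus, tokenized by whitespace.
docs = [
    "the cat sat".split(),
    "the dog barked".split(),
    "the cat purred".split(),
]

def tf_idf(term, doc, docs):
    tf = doc.count(term) / len(doc)            # term frequency in this doc
    df = sum(1 for d in docs if term in d)     # documents containing term
    idf = math.log(len(docs) / df)             # inverse document frequency
    return tf * idf

# "the" appears in every document, so its idf (and tf-idf) is zero;
# "cat" is rarer and scores higher.
score_the = tf_idf("the", docs[0], docs)
score_cat = tf_idf("cat", docs[0], docs)
```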

Week 11 - Neural Networks: From Support Vector Machines to Logistic Regression

In this module I made the jump to neural networks. I was introduced to the diagrams and graphs that represent these networks and to some of the math behind them, and was shown how adding layers or neurons can improve performance over a Support Vector Machine.
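Why extra layers help can be seen on XOR, the classic function that no single linear decision boundary (and hence no linear classifier on the raw inputs) can represent. A one-hidden-layer network with hand-picked weights computes it; the weights below are chosen for illustration, not learned:

```python
import numpy as np

# XOR inputs: output should be 1 exactly when the two bits differ.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

def step(z):
    # Hard threshold activation: fires when the weighted sum is positive.
    return (z > 0).astype(float)

# Hidden layer: two neurons detecting "x1 OR x2" and "x1 AND x2".
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])
h = step(X @ W1 + b1)

# Output neuron combines them: OR AND (NOT AND) == XOR.
w2 = np.array([1.0, -2.0])
b2 = -0.5
y = step(h @ w2 + b2)
```

The hidden layer remaps the inputs into a space where the classes become linearly separable — the same effect a kernel provides for an SVM, but with the representation learned (here, hand-built) inside the network.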

Week 12 - Neural Networks (Deep Learning)

Introduced to different neural network architectures. I explored when to use each one and which models suit specific situations.