# Coursera - Programming Exercises - Machine Learning by Stanford University

The exercises are done in Octave 6.2.0, since Professor Andrew Ng found that students understood the material better when using Octave; I plan to redo them in Python.

- **Programming Exercise 1 - Linear Regression**

In this part of the exercise, you will implement linear regression with one variable to predict profits for a food truck. Suppose you are the CEO of a restaurant franchise and are considering different cities for opening a new outlet. The chain already has trucks in various cities and you have data for profits and populations from the cities. You would like to use this data to help you select which city to expand to next.

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 1 | Warm up exercise | warmUpExercise.m | |
| 2 | Compute cost for one variable | computeCost.m | |
| 3 | Gradient descent for one variable | gradientDescent.m | |

*Figures: Training Data · Training Linear Regression with One Variable result · Surface and contour plots with the minimum (X)*

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 4 | Feature normalization | featureNormalize.m | |
| 5 | Compute cost for multiple variables | computeCostMulti.m | |
| 6 | Gradient descent for multiple variables | gradientDescentMulti.m | |
| 7 | Normal equations | normalEqn.m | |
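
For reference, a minimal vectorized sketch of what computeCost.m computes (not the graded solution), assuming `X` already carries a leading column of ones:

```octave
% computeCost.m (sketch) - mean squared error cost for linear regression.
% X is m x (n+1) with a leading column of ones, y is m x 1, theta is (n+1) x 1.
function J = computeCost(X, y, theta)
  m = length(y);                           % number of training examples
  errors = X * theta - y;                  % h_theta(x) - y for every example
  J = (1 / (2 * m)) * (errors' * errors);  % J(theta) = 1/(2m) * sum of squared errors
end
```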



- **Programming Exercise 2 - Logistic Regression**

In this part of the exercise, you will build a logistic regression model to predict whether a student gets admitted into a university. Suppose that you are the administrator of a university department and you want to determine each applicant's chance of admission based on their results on two exams. You have historical data from previous applicants that you can use as a training set for logistic regression. For each training example, you have the applicant's scores on two exams and the admissions decision. Your task is to build a classification model that estimates an applicant's probability of admission based on the scores from those two exams.

*Figures: Training Data · Non-Regularized Logistic Regression Decision Boundary*

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 1 | Sigmoid function | sigmoid.m | |
| 2 | Compute cost for logistic regression | costFunction.m | |
| 3 | Gradient for logistic regression | costFunction.m | |
| 4 | Predict function | predict.m | |

*Figures: Training Data · Regularized Logistic Regression Decision Boundary (Underfitting vs. Overfitting)*

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 5 | Compute cost for regularized LR | costFunctionReg.m | |
| 6 | Gradient for regularized LR | costFunctionReg.m | |
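
A sketch of what costFunction.m returns, assuming `X` already includes the intercept column; the regularized version in costFunctionReg.m adds a penalty on `theta(2:end)` to these same expressions:

```octave
% costFunction.m (sketch) - cost and gradient for unregularized logistic regression.
% theta is (n+1) x 1, X is m x (n+1), y is m x 1 with values in {0, 1}.
function [J, grad] = costFunction(theta, X, y)
  m = length(y);
  h = 1 ./ (1 + exp(-X * theta));                           % sigmoid hypothesis
  J = (1 / m) * (-y' * log(h) - (1 - y)' * log(1 - h));     % cross-entropy cost
  grad = (1 / m) * (X' * (h - y));                          % same shape as theta
end
```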



- **Programming Exercise 3 - Multi-class Classification and Neural Networks**

In this exercise, you will implement one-vs-all logistic regression and neural networks to recognize hand-written digits.

*Figure: Training Data sample (100 of 5000 images)*

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 1 | Regularized logistic regression | lrCostFunction.m | |
| 2 | One-vs-all classifier training | oneVsAll.m | |
| 3 | One-vs-all classifier prediction | predictOneVsAll.m | |
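
A minimal sketch of the prediction step in predictOneVsAll.m: because the sigmoid is monotonic, the class whose classifier produces the largest linear score also has the largest predicted probability.

```octave
% predictOneVsAll.m (sketch) - pick the class whose classifier scores highest.
% all_theta is K x (n+1), one row of parameters per class; X is m x n.
function p = predictOneVsAll(all_theta, X)
  m = size(X, 1);
  X = [ones(m, 1) X];                    % add the bias column
  [~, p] = max(X * all_theta', [], 2);   % index of the largest score per row
end
```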

In this part of the exercise, you will implement a neural network to recognize handwritten digits using the same training set as before. The neural network will be able to represent complex models that form non-linear hypotheses. For this week, you will be using parameters from a neural network that we have already trained. Your goal is to implement the feedforward propagation algorithm to use our weights for prediction.

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 4 | Neural network prediction function | predict.m | |

Octave arrays are 1-indexed, so the digit 0 is mapped to label 10.
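
A sketch of the feedforward pass in predict.m, assuming the course's 3-layer architecture and its pre-trained weight matrices Theta1 and Theta2:

```octave
% predict.m (sketch) - feedforward pass through a 3-layer network.
function p = predict(Theta1, Theta2, X)
  m = size(X, 1);
  sigmoid = @(z) 1 ./ (1 + exp(-z));
  a1 = [ones(m, 1) X];                       % input layer plus bias unit
  a2 = [ones(m, 1) sigmoid(a1 * Theta1')];   % hidden layer plus bias unit
  a3 = sigmoid(a2 * Theta2');                % output layer, one column per class
  [~, p] = max(a3, [], 2);                   % label 10 stands for the digit 0
end
```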




- **Programming Exercise 4 - Neural Networks Learning**

In this exercise, you will implement the backpropagation algorithm for neural networks and apply it to the task of hand-written digit recognition.

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 1 | Feedforward and Cost Function | nnCostFunction.m | |
| 2 | Regularized Cost Function | nnCostFunction.m | |
| 3 | Sigmoid Gradient | sigmoidGradient.m | |
| 4 | Neural Net Gradient Function (Backpropagation) | nnCostFunction.m | |
| 5 | Regularized Gradient | nnCostFunction.m | |
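
The piece small enough to show whole is the sigmoid gradient, which backpropagation uses to compute the hidden-layer error terms; a sketch of sigmoidGradient.m:

```octave
% sigmoidGradient.m (sketch) - derivative of the sigmoid, used by backpropagation.
function g = sigmoidGradient(z)
  s = 1 ./ (1 + exp(-z));   % sigmoid(z), element-wise on vectors and matrices
  g = s .* (1 - s);         % g'(z) = sigmoid(z) .* (1 - sigmoid(z))
end
```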



- **Programming Exercise 5 - Regularized Linear Regression and Bias vs. Variance**

In this exercise, you will implement regularized linear regression and use it to study models with different bias-variance properties.

*Figure: Training Data*

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 1 | Regularized Linear Regression Cost Function | linearRegCostFunction.m | |
| 2 | Regularized Linear Regression Gradient | linearRegCostFunction.m | |
| 3 | Learning Curve | learningCurve.m | |

*Figures: Linear Fit (underfitting) · Linear Regression Learning Curve (high bias / underfit)*

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 4 | Polynomial Feature Mapping | polyFeatures.m | |

Polynomial feature mapping is used to make the model more expressive.

*Figures: Polynomial Fit, lambda = 0 (overfitting) · Polynomial Learning Curve (high variance / overfit)*
*Figures: Polynomial Fit, lambda = 3 (good fit) · Polynomial Learning Curve (low bias and variance)*

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 5 | Cross Validation Curve | validationCurve.m | |
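
A sketch of linearRegCostFunction.m with the regularization term; by convention the bias parameter `theta(1)` is left unpenalized:

```octave
% linearRegCostFunction.m (sketch) - regularized linear regression cost and gradient.
% X is m x (n+1) with a leading column of ones; lambda controls regularization.
function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
  m = length(y);
  errors = X * theta - y;
  reg = (lambda / (2 * m)) * sum(theta(2:end) .^ 2);   % penalty skips the bias term
  J = (1 / (2 * m)) * sum(errors .^ 2) + reg;
  grad = (1 / m) * (X' * errors);
  grad(2:end) += (lambda / m) * theta(2:end);          % regularize all but theta(1)
end
```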



- **Programming Exercise 6 - Support Vector Machines**

In this exercise, you will be using support vector machines (SVMs) to build a spam classifier.

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 1 | Gaussian Kernel | gaussianKernel.m | |
| 2 | Parameters (C, σ) for Dataset 3 | dataset3Params.m | |
| 3 | Email Preprocessing | processEmail.m | |
| 4 | Email Feature Extraction | emailFeatures.m | |
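
A sketch of gaussianKernel.m, the RBF similarity that lets the SVM learn non-linear decision boundaries:

```octave
% gaussianKernel.m (sketch) - RBF similarity between two feature vectors.
% Returns 1 when x1 == x2 and decays toward 0 as the vectors move apart.
function sim = gaussianKernel(x1, x2, sigma)
  diff = x1(:) - x2(:);                          % force both into column vectors
  sim = exp(-(diff' * diff) / (2 * sigma ^ 2));  % exp(-||x1 - x2||^2 / (2*sigma^2))
end
```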



- **Programming Exercise 7 - K-means Clustering and Principal Component Analysis**

In this exercise, you will implement the K-means clustering algorithm and apply it to compress an image. In the second part, you will use principal component analysis to find a low-dimensional representation of face images.

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 1 | Find Closest Centroids | findClosestCentroids.m | |
| 2 | Compute Centroid Means | computeCentroids.m | |

*Figures: K-means iterations (GIF) · Image compression with K-means*
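
A sketch of the assignment step in findClosestCentroids.m, relying on Octave's automatic broadcasting:

```octave
% findClosestCentroids.m (sketch) - cluster assignment step of K-means.
% X is m x n, centroids is K x n; idx(i) is the index of the centroid
% closest to example i.
function idx = findClosestCentroids(X, centroids)
  K = size(centroids, 1);
  m = size(X, 1);
  dists = zeros(m, K);
  for k = 1:K
    diffs = X - centroids(k, :);       % broadcast the centroid across all rows
    dists(:, k) = sum(diffs .^ 2, 2);  % squared distance to centroid k
  end
  [~, idx] = min(dists, [], 2);        % closest centroid per example
end
```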

In this exercise, you will use principal component analysis (PCA) to perform dimensionality reduction. You will first experiment with an example 2D dataset to get intuition on how PCA works, and then use it on a larger dataset of 5000 face images.

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 3 | PCA | pca.m | |

*Figure: Dataset with computed eigenvectors*

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 4 | Project Data | projectData.m | |
| 5 | Recover Data | recoverData.m | |

*Figures: Projected (red) and reconstructed (blue) data · PCA on the face dataset · Original and reconstructed images*
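
A sketch of pca.m, assuming the input has already been normalized (e.g., with featureNormalize.m); the principal components are recovered via SVD of the covariance matrix:

```octave
% pca.m (sketch) - principal components via SVD of the covariance matrix.
% X is m x n and should be mean-normalized before calling.
function [U, S] = pca(X)
  m = size(X, 1);
  Sigma = (1 / m) * (X' * X);   % covariance matrix of the normalized data
  [U, S, ~] = svd(Sigma);       % columns of U are the principal directions
end
```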



- **Programming Exercise 8 - Anomaly Detection and Recommender Systems**

In this exercise, you will implement an anomaly detection algorithm and apply it to detect failing servers on a network. In the second part, you will use collaborative filtering to build a recommender system for movies.

*Figures: Training Data · Training Data with Gaussian estimation contours*

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 1 | Estimate Gaussian Parameters | estimateGaussian.m | |
| 2 | Select Threshold | selectThreshold.m | |

*Figure: Detected anomalies*
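
A sketch of estimateGaussian.m; the maximum-likelihood estimate of the variance divides by m rather than m-1, hence `var(X, 1)`:

```octave
% estimateGaussian.m (sketch) - per-feature mean and variance of the data.
% X is m x n; mu and sigma2 come back as n x 1 column vectors.
function [mu, sigma2] = estimateGaussian(X)
  mu = mean(X)';        % mean of each feature
  sigma2 = var(X, 1)';  % ML variance, normalized by m instead of m-1
end
```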

In this part of the exercise, you will implement the collaborative filtering learning algorithm and apply it to a dataset of movie ratings.

| Exercise Part | Exercise | Submitted File | Done |
| --- | --- | --- | --- |
| 3 | Collaborative Filtering Cost | cofiCostFunc.m | |
| 4 | Collaborative Filtering Gradient | cofiCostFunc.m | |
| 5 | Regularized Cost | cofiCostFunc.m | |
| 6 | Gradient with regularization | cofiCostFunc.m | |
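
The core of the unregularized cost in cofiCostFunc.m fits in one line; a sketch using the course's variable names, wrapped in a hypothetical helper `cofiCost` (the real cofiCostFunc.m unrolls X and Theta from a single parameter vector):

```octave
% cofiCost (hypothetical helper, sketch) - unregularized collaborative filtering cost.
% Y is num_movies x num_users; R(i, j) = 1 when user j rated movie i;
% X holds movie feature rows, Theta holds user parameter rows.
function J = cofiCost(X, Theta, Y, R)
  errs = (X * Theta' - Y) .* R;        % only count entries that were actually rated
  J = (1 / 2) * sum(sum(errs .^ 2));
end
```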
