Commit

solves quiz on regularization with 100% accuracy
anishLearnsToCode committed Jun 14, 2020
1 parent c7574d6 commit 1dd729d
Showing 23 changed files with 195 additions and 0 deletions.
Empty file added cv.m
1 change: 1 addition & 0 deletions test.m
@@ -35,3 +35,4 @@


disp(computeCost(x, y, hypothesis));
gradientDescent()
23 changes: 23 additions & 0 deletions week2/linear-regression.m
@@ -0,0 +1,23 @@
clc;
clear;

function theta = normalizedLinearRegression(theta, X, y)
theta = inv(X' * X) * X' * y;
endfunction
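% Note: this is the closed-form normal equation, theta = (X' * X)^-1 * X' * y.
% When X' * X is singular or ill-conditioned (redundant features, or fewer
% examples than features), pinv(X' * X) * X' * y is a more robust alternative.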

function theta = normalizedLinearRegressionWithRegularization(theta, X, y)
regularizationParameter = 100;
features = size(X)(2) - 1;
regularizationMatrix = regularizationParameter * eye(features + 1);
regularizationMatrix(1, 1) = 0;
theta = inv(X' * X + regularizationMatrix) * X' * y;
endfunction
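% With regularization the normal equation becomes
%   theta = (X' * X + lambda * L)^-1 * X' * y,
% where L is the identity matrix with its (1, 1) entry zeroed so the bias term
% is not penalized. For lambda > 0 this matrix is invertible even when X' * X
% itself is singular, as long as the first column of X is the usual column of ones.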

hypothesis = [0 ; 0];
data = [1 1 ; 1 2 ; 1 3];
result = [1 ; 2 ; 3];
optimizedHypothesis = normalizedLinearRegression(hypothesis, data, result);
disp(round(optimizedHypothesis));

optimizedHypothesis = normalizedLinearRegressionWithRegularization(hypothesis, data, result);
disp(optimizedHypothesis);
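% Expected output: the unregularized normal equation recovers theta = [0; 1]
% exactly (the data satisfy y = x), while the heavy penalty lambda = 100
% shrinks the slope to roughly 0.02 and compensates with an intercept of
% roughly 1.96, pulling the fit toward the mean of y.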
Binary file added week3/assets/logistic-regresion-quiz-1-b.PNG
Binary file added week3/assets/logistic-regresion-quiz-1.PNG
Binary file added week3/assets/logistic-regresion-quiz-2-b.PNG
Binary file added week3/assets/logistic-regresion-quiz-2.PNG
Binary file added week3/assets/logistic-regresion-quiz-3-3.PNG
Binary file added week3/assets/logistic-regresion-quiz-3-b.PNG
Binary file added week3/assets/logistic-regresion-quiz-3-w.PNG
Binary file added week3/assets/logistic-regresion-quiz-4-2.PNG
Binary file added week3/assets/logistic-regresion-quiz-4.PNG
Binary file added week3/assets/logistic-regresion-quiz-5-w.PNG
Binary file added week3/assets/regularization-1.PNG
Binary file added week3/assets/regularization-2.PNG
Binary file added week3/assets/regularization-3.PNG
Binary file added week3/assets/regularization-4.PNG
Binary file added week3/assets/regularization-5.PNG
8 changes: 8 additions & 0 deletions week3/logistic-regression-quiz.md
@@ -0,0 +1,8 @@
## Logistic Regression Quiz

![Question 1](assets/logistic-regresion-quiz-1.PNG)
![Question 2](assets/logistic-regresion-quiz-2-b.PNG)
![Question 3]()
![Question 4](assets/logistic-regresion-quiz-4.PNG)
![Question 4](assets/logistic-regresion-quiz-4-2.PNG)
![Question 5]()
17 changes: 17 additions & 0 deletions week3/logistic_regression.m
@@ -0,0 +1,17 @@
clc;
clear;

function [value, gradient] = costFunction(theta)
value = (theta(1) - 5)^2 + (theta(2) - 10)^2;
gradient = zeros(2, 1);
gradient(1) = 2 * (theta(1) - 5);
gradient(2) = 2 * (theta(2) - 10);
endfunction
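% The cost (theta(1) - 5)^2 + (theta(2) - 10)^2 has its unique minimum at
% theta = [5; 10] with value 0, so fminunc should return approximately those
% values, a functionVal near 0, and an exit flag indicating convergence.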


options = optimset('GradObj', 'on', 'MaxIter', 100);
initialTheta = zeros(2, 1);
[theta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);
disp(theta);
disp(functionVal);
disp(exitFlag);
7 changes: 7 additions & 0 deletions week3/regularization-quiz.md
@@ -0,0 +1,7 @@
# Regularization Quiz

![Question 1](assets/regularization-1.PNG)
![Question 2](assets/regularization-2.PNG)
![Question 3](assets/regularization-3.PNG)
![Question 4](assets/regularization-4.PNG)
![Question 5](assets/regularization-5.PNG)
79 changes: 79 additions & 0 deletions week3/regularized-logistic-regression.m
@@ -0,0 +1,79 @@
clc;
clear;
close;

function lambda = regularizationFactor()
lambda = 10;
endfunction

function value = sigmoid(matrix)
value = 1 ./ (1 + exp(-matrix));
endfunction

function J = logisticRegressionCost(theta, X, y)
estimatedResults = sigmoid(X * theta);
trainingSamples = length(y);
J = -(1 / trainingSamples) * sum(
y .* log(estimatedResults)
+ (1 - y) .* log(1 - estimatedResults)
);
endfunction

function J = logisticRegressionRegularizedCost(theta, X, y)
estimatedResults = sigmoid(X * theta);
trainingSamples = length(y);

J = (- 1 / trainingSamples) * sum(
y .* log(estimatedResults)
+ (1 - y) .* log(1 - estimatedResults)
) + (regularizationFactor() / (2 * trainingSamples)) * (
sum(theta .^ 2) - theta(1) ^ 2
);
endfunction
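% This implements the regularized logistic-regression cost
%   J(theta) = -(1/m) * sum(y .* log(h) + (1 - y) .* log(1 - h))
%              + (lambda / (2 * m)) * sum(theta(2:end) .^ 2),
% where the bias term theta(1) is excluded from the penalty (hence the
% "- theta(1) ^ 2" above).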

function gradient = gradientVector(theta, X, y)
trainingExamples = length(y);
gradient = (1 / trainingExamples) * (X' * (sigmoid(X * theta) - y));
endfunction

function gradient = regularizedGradientVector(theta, X, y)
trainingExamples = length(y);
gradient = gradientVector(theta, X, y);
modifiedHypothesis = (regularizationFactor() / trainingExamples) * theta;
modifiedHypothesis(1) = 0;
gradient += modifiedHypothesis;
endfunction
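% Regularized gradient: (1/m) * X' * (h - y) + (lambda / m) * theta, with the
% first component of the penalty zeroed so the bias term is never shrunk.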

function [theta, costMemory, minCost] = gradientDescent(theta, X, y, iterations, learningRate)
costMemory = [logisticRegressionCost(theta, X, y)];
for i = 1:iterations
theta = theta - learningRate * gradientVector(theta, X, y);
costMemory = [costMemory logisticRegressionCost(theta, X, y)];
endfor
minCost = logisticRegressionCost(theta, X, y);
endfunction
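% Each iteration applies the batch update theta := theta - alpha * gradient;
% costMemory records the cost after every step so convergence can be checked
% with plot(costMemory).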

function [theta, costs, minCost] = regularizedGradientDescent(theta, X, y, iterations, learningRate)
costs = [logisticRegressionRegularizedCost(theta, X, y)];
for i = 1:iterations
theta = theta - learningRate * regularizedGradientVector(theta, X, y);
costs = [costs logisticRegressionRegularizedCost(theta, X, y)];
endfor
minCost = logisticRegressionRegularizedCost(theta, X, y);
endfunction

hypothesis = [0 ; 0];
data = [1 1 ; 1 2 ; 1 3];
result = [1 ; 1 ; 1];

[theta, costMemory, minCost] = gradientDescent(hypothesis, data, result, 3000, 0.05);
% disp(theta);
disp(minCost);
subplot(2, 2, 1); plot(costMemory);
subplot(2, 2, 2); imagesc(theta);

[theta, costMemory, minCost] = regularizedGradientDescent(hypothesis, data, result, 3000, 0.05);
% disp(theta);
disp(minCost);
subplot(2, 2, 3); plot(costMemory);
subplot(2, 2, 4); imagesc(theta);
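% With y = [1; 1; 1] the unregularized fit can keep driving theta upward (the
% cost only approaches 0), whereas the lambda = 10 penalty keeps theta bounded;
% the two subplot rows contrast the resulting cost curves and parameter values.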
60 changes: 60 additions & 0 deletions week3/test.m
@@ -0,0 +1,60 @@
clear;
clc;

function value = sigmoidFunction(matrix)
value = (1 ./ (1 + exp(-matrix)));
endfunction

function J = cost(theta, X, y)
trainingExamples = length(y);
J = (1 / (2 * trainingExamples)) * sum((sigmoidFunction(X * theta) - y) .^ 2);
endfunction
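% Note: this is a squared-error cost applied to the sigmoid hypothesis rather
% than the usual logistic log-loss; with a sigmoid hypothesis it is generally
% non-convex. The gradientVector below is the gradient of the log-loss, not of
% this squared-error cost.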

function J = regularizedCost(theta, X, y)
trainingExamples = length(y);
regularizationParameter = 100;

J = (1 / (2 * trainingExamples)) * (
sum((sigmoidFunction(X * theta) - y) .^ 2)
+ regularizationParameter * (sum(theta .^ 2) - theta(1) ^ 2)
);
endfunction

function gradient = gradientVector(theta, X, y)
trainingExamples = length(y);
gradient = (1 / trainingExamples) * (X' * (sigmoidFunction(X * theta) - y));
endfunction

function gradient = regularizedGradientVector(theta, X, y)
trainingExamples = length(y);
regularizationParameter = 100;
penalizedTheta = theta;
penalizedTheta(1) = 0; % the bias term is not regularized, matching regularizedCost
gradient = (1 / trainingExamples) * (
X' * (sigmoidFunction(X * theta) - y)
+ regularizationParameter * penalizedTheta
);
endfunction

function [value, gradient] = optimizationFunction(theta)
data = [1 1 ; 1 2 ; 1 3];
result = [1 ; 2 ; 3];
value = cost(theta, data, result);
gradient = gradientVector(theta, data, result);
endfunction

function [theta, costMemory, minCost] = gradientDescent(theta, X, y, iterations, learningRate)
costMemory = [cost(theta, X, y)];
for i = 1:iterations
theta = theta - learningRate * gradientVector(theta, X, y);
costMemory = [costMemory cost(theta, X, y)];
end
minCost = cost(theta, X, y);
endfunction

data = [1 1 ; 1 2 ; 1 3];
result = [1 ; 2 ; 3];
hypothesis = [10 ; 0];

[theta, costMemory, minCost] = gradientDescent(hypothesis, data, result, 100, 0.03);
disp(theta);
disp(minCost);
plot(costMemory);
