
Commit

Switch to images for displaying formulas
Sanzo00 committed Nov 9, 2020
1 parent a7db467 commit 637253a
Showing 18 changed files with 30 additions and 131 deletions.
30 changes: 6 additions & 24 deletions ex1-linear regression/ex1-multiple-variables.ipynb
Original file line number Diff line number Diff line change
@@ -24,18 +24,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Cost function"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"$$\\begin{aligned}\n",
"h_\\theta(x)&= \\theta^Tx=\\theta_0+\\theta_1x_1 \\\\\n",
"J(\\theta)&=\\frac{1}{2m}\\sum_{i=1}^{m}(h_\\theta(x^{(i)}) - y^{(i)})^2\n",
"\\end{aligned}\n",
"$$"
"## Cost function\n",
"\n",
"![](img/cost.png)"
]
},
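The cost function this image replaces, h_theta(x) = theta^T x with J(theta) = (1/2m) * sum((h_theta(x_i) - y_i)^2), can be sketched in NumPy roughly as below (a minimal sketch; the name `compute_cost` is illustrative, not taken from the notebook):

```python
import numpy as np

def compute_cost(X, y, theta):
    """Squared-error cost J(theta) = (1 / (2m)) * sum((X @ theta - y)^2)."""
    m = len(y)
    errors = X @ theta - y  # h_theta(x^(i)) - y^(i) for every example at once
    return (errors @ errors) / (2 * m)
```

Here `X` is assumed to already carry the bias column of ones, as the notebooks typically set up.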
{
@@ -54,18 +45,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Gradient descent"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"$$\\begin{aligned}\n",
"\\theta_j:&=\\theta_j-\\alpha\\frac{\\partial}{\\partial\\theta_j}J(\\theta) \\\\\n",
"\\theta_j:&=\\theta_j-\\alpha\\frac{1}{m}\\sum_{i=1}^{m}(h_\\theta(x^{(i)})-y^{(i)})x_j^{(i)}\n",
"\\end{aligned}\n",
"$$"
"## Gradient descent\n",
"\n",
"![](img/gradient.png)"
]
},
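The update rule in the image, theta_j := theta_j - alpha * (1/m) * sum((h_theta(x_i) - y_i) * x_ij), vectorizes to theta := theta - (alpha/m) * X^T (X theta - y). A minimal sketch (function name illustrative):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha=0.01, num_iters=1000):
    """Batch gradient descent: theta := theta - (alpha/m) * X^T (X theta - y)."""
    m = len(y)
    for _ in range(num_iters):
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
    return theta
```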
{
15 changes: 3 additions & 12 deletions ex1-linear regression/ex1-normal-equation.ipynb
@@ -338,18 +338,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Normal equation"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"$$\n",
"\\begin{aligned}\n",
"\\theta=(X^TX)^{-1}X^TY\n",
"\\end{aligned}\n",
"$$"
"## Normal equation\n",
"\n",
"![](img/regularization.png)"
]
},
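The closed-form solution in the image, theta = (X^T X)^(-1) X^T y, is a one-liner in NumPy; `pinv` is used here instead of a plain inverse so a singular X^T X does not raise (a sketch, not necessarily what the notebook does):

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form least squares: theta = (X^T X)^(-1) X^T y, via pseudoinverse."""
    return np.linalg.pinv(X.T @ X) @ X.T @ y
```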
{
30 changes: 6 additions & 24 deletions ex1-linear regression/ex1-one-variable.ipynb
@@ -232,18 +232,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Cost function"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"$$\\begin{aligned}\n",
"h_\\theta(x)&= \\theta^Tx=\\theta_0+\\theta_1x_1 \\\\\n",
"J(\\theta)&=\\frac{1}{2m}\\sum_{i=1}^{m}(h_\\theta(x^{(i)}) - y^{(i)})^2\n",
"\\end{aligned}\n",
"$$"
"## Cost function\n",
"\n",
"![](img/cost.png)"
]
},
{
@@ -560,18 +551,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Gradient descent"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"$$\\begin{aligned}\n",
"\\theta_j:&=\\theta_j-\\alpha\\frac{\\partial}{\\partial\\theta_j}J(\\theta) \\\\\n",
"\\theta_j:&=\\theta_j-\\alpha\\frac{1}{m}\\sum_{i=1}^{m}(h_\\theta(x^{(i)})-y^{(i)})x_j^{(i)}\n",
"\\end{aligned}\n",
"$$"
"## Gradient descent\n",
"\n",
"![](img/gradient.png)"
]
},
{
Binary file added ex1-linear regression/img/cost.png
Binary file added ex1-linear regression/img/gradient.png
Binary file added ex1-linear regression/img/regularization.png
28 changes: 8 additions & 20 deletions ex2-logistic regression/ex2-logistic regression.ipynb
@@ -260,10 +260,8 @@
"metadata": {},
"source": [
"## Sigmoid function\n",
"$$\n",
"h_\\theta(x)=g(\\theta^Tx) \\\\\n",
"g(z)=\\frac{1}{1+e^{-z}}\n",
"$$"
"\n",
"![](img/sigmoid.png)"
]
},
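The sigmoid the image shows, g(z) = 1 / (1 + e^(-z)) with h_theta(x) = g(theta^T x), is a minimal sketch in NumPy:

```python
import numpy as np

def sigmoid(z):
    """g(z) = 1 / (1 + e^(-z)); works elementwise on scalars and arrays."""
    return 1.0 / (1.0 + np.exp(-z))
```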
{
@@ -307,9 +305,8 @@
"metadata": {},
"source": [
"## Cost function\n",
"$$\n",
"J(\\theta) = -\\frac{1}{m}\\sum_{i=1}^{m}[y^{(i)}log(h_\\theta(x^{(i)})) + (1-y^{(i)})log(1-h_\\theta(x^{(i)}))]\n",
"$$"
"\n",
"![](img/cost.png)"
]
},
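The cross-entropy cost in the image, J(theta) = -(1/m) * sum(y_i * log(h_theta(x_i)) + (1 - y_i) * log(1 - h_theta(x_i))), can be sketched as follows (names illustrative; `sigmoid` is repeated so the snippet is self-contained):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    """Logistic-regression cross-entropy cost, averaged over m examples."""
    m = len(y)
    h = sigmoid(X @ theta)
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
```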
{
@@ -432,11 +429,9 @@
"metadata": {},
"source": [
"## Gradient descent\n",
"$$\n",
"\\text{Gradient descent: }\\frac{\\partial J(\\theta)}{\\partial\\theta_j}=\\frac{1}{m}\\sum_{i=1}^{m}(h_\\theta(x^{(i)})-y^{(i)})x_j^{(i)} \\\\\n",
"\\text{Vectorized: }\\frac{1}{m}X^T(sigmoid(X\\theta) - y)\\\\\n",
"\\theta_j = \\theta_j - \\alpha\\frac{\\partial}{\\partial\\theta_j}J(\\theta)\n",
"$$\n",
"\n",
"![](img/gradient.png)\n",
"\n",
"[Derivation of the partial derivative](https://sanzo.top/#/post/学习笔记/机器学习?id=梯度下降)"
]
},
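The vectorized gradient in the image, (1/m) * X^T (sigmoid(X theta) - y), can be sketched directly (a minimal, self-contained sketch; the function name is illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient(theta, X, y):
    """Vectorized logistic-regression gradient: (1/m) * X^T (sigmoid(X theta) - y)."""
    m = len(y)
    return X.T @ (sigmoid(X @ theta) - y) / m
```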
@@ -710,14 +705,7 @@
"source": [
"## Decision boundary\n",
"\n",
"\n",
"$$\n",
"\\begin{aligned}\n",
"0&=X\\theta \\\\\n",
"0&=\\theta_0X_0 + \\theta_1X_1 + \\theta_2X_2\\\\\n",
"x2 &= -(\\frac{\\theta_0}{\\theta_2}x_0+\\frac{\\theta_1}{\\theta_2}x_1) \\\\\n",
"\\end{aligned}\n",
"$$"
"![](img/decision_boundary.png)"
]
},
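The boundary in the image comes from setting theta^T x = 0 and solving for x2: x2 = -(theta0/theta2 * x0 + theta1/theta2 * x1) with x0 = 1. A sketch (name illustrative):

```python
import numpy as np

def boundary_x2(theta, x1):
    """Solve theta0 * 1 + theta1 * x1 + theta2 * x2 = 0 for x2 (the plotted line)."""
    return -(theta[0] + theta[1] * x1) / theta[2]
```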
{
42 changes: 4 additions & 38 deletions ex2-logistic regression/ex2-regularized logistic regression.ipynb
@@ -254,32 +254,7 @@
"source": [
"## Feature mapping\n",
"\n",
"$$\n",
"mapFeature(x)=\n",
"\\begin{bmatrix}\n",
"x_1^0x_2^0 \\\\\n",
"x_1^1x_2^0 \\\\\n",
"x_1^0x_2^1 \\\\\n",
"x_1^2x_2^0 \\\\\n",
"x_1^1x_2^1 \\\\\n",
"x_1^0x_2^2 \\\\\n",
"\\vdots \\\\\n",
"x_1^1x_2^5 \\\\\n",
"x_1^0x_2^6 \\\\\n",
"\\end{bmatrix}\n",
"=\n",
"\\begin{bmatrix}\n",
"1 \\\\\n",
"x_1 \\\\\n",
"x_2 \\\\\n",
"x_1^2 \\\\\n",
"x_1^1x_2^1 \\\\\n",
"x_2^2 \\\\\n",
"\\vdots \\\\\n",
"x_1^1x_2^5 \\\\\n",
"x_2^6 \\\\\n",
"\\end{bmatrix}\n",
"$$"
"![](img/feature_map.png)"
]
},
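The mapping in the image expands (x1, x2) into every monomial x1^i * x2^j with i + j up to 6 (28 terms, bias first). A sketch matching that ordering (name illustrative):

```python
import numpy as np

def map_feature(x1, x2, degree=6):
    """All monomials x1^(total-j) * x2^j for total = 0..degree, bias term first."""
    terms = []
    for total in range(degree + 1):
        for j in range(total + 1):
            terms.append((x1 ** (total - j)) * (x2 ** j))
    return np.array(terms)
```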
{
@@ -826,9 +801,8 @@
"metadata": {},
"source": [
"## Regularized cost function\n",
"$$\n",
"J(\\theta) = -\\frac{1}{m}\\sum_{i=1}^{m}[y^{(i)}log(h_\\theta(x^{(i)})) + (1-y^{(i)})log(1-h_\\theta(x^{(i)}))] + \\frac{\\lambda}{2m}\\sum_{j=1}^{n}\\theta_j^2\n",
"$$"
"\n",
"![](img/regularized_cost.png)"
]
},
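The regularized cost in the image adds (lambda / 2m) * sum(theta_j^2) for j >= 1 to the cross-entropy term; theta_0 is conventionally not penalized. A self-contained sketch (names illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_reg(theta, X, y, lam=1.0):
    """Cross-entropy cost plus (lambda / 2m) * sum(theta_j^2), skipping theta_0."""
    m = len(y)
    h = sigmoid(X @ theta)
    unreg = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    return unreg + (lam / (2 * m)) * (theta[1:] @ theta[1:])
```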
{
@@ -933,15 +907,7 @@
"source": [
"## Regularized gradient\n",
"\n",
"$$\n",
"\\frac{\\partial J(\\theta)}{\\partial\\theta_j}=\\frac{1}{m}\\sum_{i=1}^{m}(h_\\theta(x^{(i)})-y^{(i)})x_j^{(i)} + \\frac{\\lambda}{m}\\theta_j \\\\\n",
"\\begin{aligned}\n",
" & Repeat\\ until\\ convergence \\{ \\\\ \n",
" & \\theta_0 := \\theta_0 - \\alpha \\frac{1}{m} \\sum_{i=1}^{m}\\left(h_\\theta(x^{(i)}) - y^{(i)}\\right) x_0^{(i)} \\\\\n",
" & \\theta_j := \\theta_j - \\alpha\\left[\\frac{1}{m} \\sum_{i=1}^{m}\\left(h_\\theta(x^{(i)}) - y^{(i)}\\right) x_j^{(i)} + \\frac{\\lambda}{m}\\theta_j\\right] \\\\\n",
"\\}\n",
"\\end{aligned}\n",
"$$"
"![](img/regularized_gradient.png)"
]
},
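The regularized gradient in the image is the unregularized gradient plus (lambda/m) * theta_j for every j except the bias index 0. A self-contained sketch (names illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_reg(theta, X, y, lam=1.0):
    """(1/m) * X^T (sigmoid(X theta) - y), plus (lambda/m) * theta_j for j >= 1."""
    m = len(y)
    grad = X.T @ (sigmoid(X @ theta) - y) / m
    grad[1:] += (lam / m) * theta[1:]  # theta_0 is not regularized
    return grad
```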
{
Binary file added ex2-logistic regression/img/cost.png
Binary file added ex2-logistic regression/img/decision_boundary.png
Binary file added ex2-logistic regression/img/feature_map.png
Binary file added ex2-logistic regression/img/gradient.png
Binary file added ex2-logistic regression/img/regularized_cost.png
Binary file added ex2-logistic regression/img/regularized_gradient.png
Binary file added ex2-logistic regression/img/sigmoid.png
16 changes: 3 additions & 13 deletions ex3-neural network/ex3-neural network.ipynb
@@ -215,9 +215,7 @@
"metadata": {},
"source": [
"## Cost function\n",
"$$\n",
"J(\\theta) = -\\frac{1}{m}\\sum_{i=1}^{m}[y^{(i)}log(h_\\theta(x^{(i)})) + (1-y^{(i)})log(1-h_\\theta(x^{(i)}))] + \\frac{\\lambda}{2m}\\sum_{j=1}^{n}\\theta_j^2\n",
"$$"
"![](img/cost.png)"
]
},
{
@@ -248,15 +246,7 @@
"metadata": {},
"source": [
"## Gradient function\n",
"$$\n",
"\\frac{\\partial J(\\theta)}{\\partial\\theta_j}=\\frac{1}{m}\\sum_{i=1}^{m}(h_\\theta(x^{(i)})-y^{(i)})x_j^{(i)} + \\frac{\\lambda}{m}\\theta_j \\\\\n",
"\\begin{aligned}\n",
" & Repeat\\ until\\ convergence \\{ \\\\ \n",
" & \\theta_0 := \\theta_0 - \\alpha \\frac{1}{m} \\sum_{i=1}^{m}\\left(h_\\theta(x^{(i)}) - y^{(i)}\\right) x_0^{(i)} \\\\\n",
" & \\theta_j := \\theta_j - \\alpha\\left[\\frac{1}{m} \\sum_{i=1}^{m}\\left(h_\\theta(x^{(i)}) - y^{(i)}\\right) x_j^{(i)} + \\frac{\\lambda}{m}\\theta_j\\right] \\\\\n",
"\\}\n",
"\\end{aligned}\n",
"$$"
"![](img/gradient.png)\n"
]
},
{
@@ -775,7 +765,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.7"
"version": "3.7.5"
}
},
"nbformat": 4,
Binary file added ex3-neural network/img/cost.png
Binary file added ex3-neural network/img/gradient.png