Replace handson-ml2 with handson-ml3, and fix figure chapter numbers
@@ -19,10 +19,10 @@
 "source": [
 "<table align=\"left\">\n",
 "  <td>\n",
-"    <a href=\"https://colab.research.google.com/github/ageron/handson-ml2/blob/master/math_differential_calculus.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+"    <a href=\"https://colab.research.google.com/github/ageron/handson-ml3/blob/main/math_differential_calculus.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
 "  </td>\n",
 "  <td>\n",
-"    <a target=\"_blank\" href=\"https://kaggle.com/kernels/welcome?src=https://github.com/ageron/handson-ml2/blob/master/math_differential_calculus.ipynb\"><img src=\"https://kaggle.com/static/images/open-in-kaggle.svg\" /></a>\n",
+"    <a target=\"_blank\" href=\"https://kaggle.com/kernels/welcome?src=https://github.com/ageron/handson-ml3/blob/main/math_differential_calculus.ipynb\"><img src=\"https://kaggle.com/static/images/open-in-kaggle.svg\" /></a>\n",
 "  </td>\n",
 "</table>"
 ]
@@ -476,7 +476,7 @@
 "id": "ebb31wJp72Zn"
 },
 "source": [
-"**Important note:** in Deep Learning, differentiation is almost always performed automatically by the framework you are using (such as TensorFlow or PyTorch). This is called auto-diff, and I did [another notebook](https://github.com/ageron/handson-ml2/blob/master/extra_autodiff.ipynb) on that topic. However, you should still make sure you have a good understanding of derivatives, or else they will come and bite you one day, for example when you use a square root in your cost function without realizing that its derivative approaches infinity when $x$ approaches 0 (tip: you should use $\\sqrt{x+\\epsilon}$ instead, where $\\epsilon$ is some small constant, such as $10^{-4}$)."
+"**Important note:** in Deep Learning, differentiation is almost always performed automatically by the framework you are using (such as TensorFlow or PyTorch). This is called auto-diff, and I did [another notebook](https://github.com/ageron/handson-ml3/blob/main/extra_autodiff.ipynb) on that topic. However, you should still make sure you have a good understanding of derivatives, or else they will come and bite you one day, for example when you use a square root in your cost function without realizing that its derivative approaches infinity when $x$ approaches 0 (tip: you should use $\\sqrt{x+\\epsilon}$ instead, where $\\epsilon$ is some small constant, such as $10^{-4}$)."
 ]
 },
 {
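
Note on the cell above: the $\sqrt{x+\epsilon}$ tip is easy to verify numerically. The derivative of $\sqrt{x}$ is $\frac{1}{2\sqrt{x}}$, which diverges as $x$ approaches 0, while the derivative of $\sqrt{x+\epsilon}$ never exceeds $\frac{1}{2\sqrt{\epsilon}}$. A minimal sketch (illustrative code, not part of this commit or the notebook):

    # Numerical illustration of the note above -- not part of this commit.
    import numpy as np

    eps = 1e-4  # the small constant suggested in the tip

    def d_sqrt(x):
        # d/dx sqrt(x) = 1 / (2 * sqrt(x)) -> infinity as x -> 0
        return 1.0 / (2.0 * np.sqrt(x))

    def d_sqrt_eps(x):
        # d/dx sqrt(x + eps) = 1 / (2 * sqrt(x + eps)), at most 1 / (2 * sqrt(eps))
        return 1.0 / (2.0 * np.sqrt(x + eps))

    print(d_sqrt(1e-12))      # ~5e5: an exploding gradient
    print(d_sqrt_eps(1e-12))  # ~50: bounded, as the tip promises
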
@@ -1064,7 +1064,7 @@
 "    zs = f(xs, ys)\n",
 "\n",
 "    surface = ax.plot_surface(xs, ys, zs,\n",
-"                              cmap=mpl.cm.coolwarm,\n",
+"                              cmap=\"coolwarm\",\n",
 "                              linewidth=0.3, edgecolor='k')\n",
 "\n",
 "    ax.set_xlabel(\"$x$\", fontsize=14)\n",
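
Context for the change above: matplotlib's cmap argument accepts either a Colormap object or the name of a registered colormap, so passing the string "coolwarm" behaves the same as mpl.cm.coolwarm while dropping the dependence on the mpl alias. A minimal standalone sketch of the call (the grid and surface here are made up for illustration; the notebook builds its own xs, ys, and f):

    # Minimal standalone sketch of the changed call -- an assumed setup,
    # not the notebook's actual plotting helper.
    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical grid and surface, just to have something to plot.
    xs, ys = np.meshgrid(np.linspace(-2, 2, 50), np.linspace(-2, 2, 50))
    zs = xs**2 + ys**2

    ax = plt.figure().add_subplot(projection="3d")
    # The string "coolwarm" is resolved to the registered colormap, so the
    # mpl.cm.coolwarm object is no longer needed.
    ax.plot_surface(xs, ys, zs,
                    cmap="coolwarm",
                    linewidth=0.3, edgecolor='k')
    plt.show()
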