diff --git a/ml1/2_6_Model_Tuning.ipynb b/ml1/2_6_Model_Tuning.ipynb
index c9c9fd7..49d6333 100644
--- a/ml1/2_6_Model_Tuning.ipynb
+++ b/ml1/2_6_Model_Tuning.ipynb
@@ -314,7 +314,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "Changing manually the hyperparameters to find their optimal values is not practical. Instead, we can consider to find the optimal value of the parameters as an *optimization problem*. \n",
+    "Manually changing the hyperparameters to find their optimal values is not practical. Instead, we can treat finding the optimal hyperparameter values as an *optimization problem*. \n",
     "\n",
     "The sklearn comes with several optimization techniques for this purpose, such as **grid search** and **randomized search**. In this notebook we are going to introduce the former one."
    ]
   },
   {
@@ -323,7 +323,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "The sklearn provides an object that, given data, computes the score during the fit of an estimator on a parameter grid and chooses the parameters to maximize the cross-validation score. "
+    "sklearn provides an object that, given data, computes the score during the fit of an estimator on a hyperparameter grid and chooses the hyperparameters that maximize the cross-validation score. "
    ]
   },
   {
@@ -371,7 +371,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "We can now evaluate the KFold with this optimized parameter as follows."
+    "We can now evaluate the model with KFold cross-validation using this optimized hyperparameter, as follows."
    ]
   },
   {
@@ -431,7 +431,7 @@
     "scores = ['precision', 'recall']\n",
     "\n",
     "for score in scores:\n",
-    "    print(\"# Tuning hyper-hyperparameters for %s\" % score)\n",
+    "    print(\"# Tuning hyperparameters for %s\" % score)\n",
     "    print()\n",
     "\n",
     "    if score == 'precision':\n",
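
The hunks at 314 and 323 describe scikit-learn's grid search; the object the second cell refers to is `GridSearchCV`. A minimal sketch of that workflow follows — the `SVC` estimator, the grid values, and the iris data are illustrative assumptions, not necessarily the notebook's actual choices:

```python
# Illustrative GridSearchCV sketch; estimator, grid, and data are assumptions.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Candidate hyperparameter values to try exhaustively (illustrative grid).
param_grid = {'C': [0.1, 1, 10, 100], 'gamma': [0.001, 0.01, 0.1]}

# For every grid point, fit the estimator and compute the cross-validation
# score; the best-scoring combination is retained.
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # best-scoring hyperparameter combination
print(search.best_score_)   # its mean cross-validation score
```

Since every combination on the grid is fitted and scored, the cost grows multiplicatively with the number of values per hyperparameter, which is why the cell also mentions randomized search as an alternative.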
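
The hunk at 371 concerns evaluating the tuned hyperparameter with K-fold cross-validation. A sketch of one way to do that with `KFold` and `cross_val_score`; the tuned value `best_c` and the estimator are assumptions carried over from the sketch above:

```python
# Evaluate an estimator built with the tuned hyperparameter via K-fold CV.
from sklearn.datasets import load_iris
from sklearn.model_selection import KFold, cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Assumed tuned value, standing in for search.best_params_['C'] above.
best_c = 10

# Score the tuned estimator on each of the 5 folds.
cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(SVC(C=best_c), X, y, cv=cv)
print(scores.mean(), scores.std())
```

Note that scoring on the same data used for tuning is optimistic; a held-out set or nested cross-validation gives a cleaner estimate.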
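
The last hunk's code cell tunes once per metric by passing the current metric name as the search's `scoring` argument. A sketch of that loop, under the assumption of a binary classification task (so the `'precision'` and `'recall'` scorers apply directly) and an illustrative grid:

```python
# Sketch of tuning once per metric, mirroring the loop in the last hunk;
# the dataset and grid are assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

# Assumed binary task; make_classification defaults to two classes.
X, y = make_classification(n_samples=300, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

param_grid = {'C': [0.1, 1, 10], 'gamma': [0.01, 0.1]}

for score in ['precision', 'recall']:
    print("# Tuning hyperparameters for %s" % score)
    print()
    # Optimize the grid search for the current metric.
    search = GridSearchCV(SVC(), param_grid, cv=5, scoring=score)
    search.fit(X_train, y_train)
    print(search.best_params_)
```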