Fixed grammar.

--HG--
extra : convert_revision : svn%3Afdd8eb12-d10e-0410-9acb-85c331704f74/trunk%403542
Davis King 2010-03-06 15:08:03 +00:00
parent 7d85e87d25
commit b22f88c329
3 changed files with 3 additions and 3 deletions


@@ -141,7 +141,7 @@ int main()
 // Now that we have some data we want to train on it. However, there are two parameters to the
 // training. These are the nu and gamma parameters. Our choice for these parameters will
 // influence how good the resulting decision function is. To test how good a particular choice
-// of these parameters are we can use the cross_validate_trainer() function to perform n-fold cross
+// of these parameters is we can use the cross_validate_trainer() function to perform n-fold cross
 // validation on our training data. However, there is a problem with the way we have sampled
 // our distribution above. The problem is that there is a definite ordering to the samples.
 // That is, the first half of the samples look like they are from a different distribution


@@ -87,7 +87,7 @@ int main()
 // Now that we have some data we want to train on it. However, there is a parameter to the
 // training. This is the gamma parameter of the RBF kernel. Our choice for this parameter will
 // influence how good the resulting decision function is. To test how good a particular choice of
-// kernel parameters are we can use the cross_validate_trainer() function to perform n-fold cross
+// kernel parameters is we can use the cross_validate_trainer() function to perform n-fold cross
 // validation on our training data. However, there is a problem with the way we have sampled
 // our distribution. The problem is that there is a definite ordering to the samples.
 // That is, the first half of the samples look like they are from a different distribution


@@ -85,7 +85,7 @@ int main()
 // Now that we have some data we want to train on it. However, there are two parameters to the
 // training. These are the nu and gamma parameters. Our choice for these parameters will
 // influence how good the resulting decision function is. To test how good a particular choice
-// of these parameters are we can use the cross_validate_trainer() function to perform n-fold cross
+// of these parameters is we can use the cross_validate_trainer() function to perform n-fold cross
 // validation on our training data. However, there is a problem with the way we have sampled
 // our distribution above. The problem is that there is a definite ordering to the samples.
 // That is, the first half of the samples look like they are from a different distribution
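The comment corrected across all three files describes the same pattern: shuffle the ordered samples with randomize_samples() before calling cross_validate_trainer(). A minimal sketch of that pattern, assuming dlib is installed and that the samples and labels vectors have been filled in beforehand (the nu and gamma values here are illustrative, not taken from the examples):

```cpp
#include <iostream>
#include <vector>
#include <dlib/svm.h>

using namespace dlib;

int main()
{
    typedef matrix<double, 2, 1> sample_type;
    typedef radial_basis_kernel<sample_type> kernel_type;

    std::vector<sample_type> samples;   // assumed filled elsewhere
    std::vector<double> labels;         // +1/-1 class labels, same length

    svm_nu_trainer<kernel_type> trainer;
    trainer.set_kernel(kernel_type(0.1));  // illustrative gamma
    trainer.set_nu(0.1);                   // illustrative nu

    // The samples were generated in order, so shuffle them first;
    // otherwise each cross-validation fold would see a biased slice
    // of the data and the accuracy estimate would be misleading.
    randomize_samples(samples, labels);

    // 3-fold cross validation; the returned matrix holds the
    // fraction of each class classified correctly.
    std::cout << cross_validate_trainer(trainer, samples, labels, 3);
}
```

In practice one would loop over candidate nu and gamma values and keep the pair with the best cross-validation accuracy, which is exactly what the surrounding example code goes on to do.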