Mirror of https://github.com/davisking/dlib.git (synced 2024-11-01 10:14:53 +08:00)
Fixed grammar.
--HG-- extra : convert_revision : svn%3Afdd8eb12-d10e-0410-9acb-85c331704f74/trunk%403542
parent 7d85e87d25
commit b22f88c329
@@ -141,7 +141,7 @@ int main()
 // Now that we have some data we want to train on it. However, there are two parameters to the
 // training. These are the nu and gamma parameters. Our choice for these parameters will
 // influence how good the resulting decision function is. To test how good a particular choice
-// of these parameters are we can use the cross_validate_trainer() function to perform n-fold cross
+// of these parameters is we can use the cross_validate_trainer() function to perform n-fold cross
 // validation on our training data. However, there is a problem with the way we have sampled
 // our distribution above. The problem is that there is a definite ordering to the samples.
 // That is, the first half of the samples look like they are from a different distribution
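The hunk above is from one of dlib's SVM example programs (the file name was not captured on this page). As a rough, self-contained sketch of what the comment describes, here is how one particular nu/gamma choice can be scored with cross_validate_trainer(); the toy data and the nu and gamma values are made up for illustration and are not taken from the commit.

#include <dlib/svm.h>
#include <iostream>
#include <vector>

int main()
{
    using namespace dlib;

    // 2-D column vector samples and an RBF kernel, as in dlib's SVM examples.
    typedef matrix<double, 2, 1> sample_type;
    typedef radial_basis_kernel<sample_type> kernel_type;

    std::vector<sample_type> samples;
    std::vector<double> labels;

    // Toy data standing in for whatever distribution the real example samples:
    // points near the origin are labeled +1, points farther out are -1.
    for (int i = 0; i < 20; ++i)
    {
        sample_type samp;
        samp = i/100.0, i/100.0;
        samples.push_back(samp);
        labels.push_back(+1);

        samp = 2 + i/100.0, 2 + i/100.0;
        samples.push_back(samp);
        labels.push_back(-1);
    }

    // One particular (illustrative) choice of the nu and gamma parameters.
    const double nu    = 0.1;
    const double gamma = 0.01;

    svm_nu_trainer<kernel_type> trainer;
    trainer.set_kernel(kernel_type(gamma));
    trainer.set_nu(nu);

    // 3-fold cross validation of this parameter choice.
    std::cout << "cross validation accuracy: "
              << cross_validate_trainer(trainer, samples, labels, 3);

    return 0;
}

cross_validate_trainer() returns a 1x2 matrix holding the fraction of +1 and of -1 test samples classified correctly, so different nu/gamma choices can be compared directly.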
@@ -87,7 +87,7 @@ int main()
 // Now that we have some data we want to train on it. However, there is a parameter to the
 // training. This is the gamma parameter of the RBF kernel. Our choice for this parameter will
 // influence how good the resulting decision function is. To test how good a particular choice of
-// kernel parameters are we can use the cross_validate_trainer() function to perform n-fold cross
+// kernel parameters is we can use the cross_validate_trainer() function to perform n-fold cross
 // validation on our training data. However, there is a problem with the way we have sampled
 // our distribution. The problem is that there is a definite ordering to the samples.
 // That is, the first half of the samples look like they are from a different distribution
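This hunk comes from an example whose trainer exposes only the RBF kernel's gamma; the page does not show which file it is, so the sketch below uses dlib's krr_trainer as a stand-in for whatever trainer that example actually uses. It also shows randomize_samples(), which is how the dlib examples deal with the sample-ordering problem the comment goes on to describe.

#include <dlib/svm.h>
#include <iostream>
#include <vector>

int main()
{
    using namespace dlib;

    typedef matrix<double, 2, 1> sample_type;
    typedef radial_basis_kernel<sample_type> kernel_type;

    std::vector<sample_type> samples;
    std::vector<double> labels;

    // Toy data ordered by class on purpose: all +1 samples first, then all
    // -1 samples, mimicking the ordering problem the comment describes.
    for (int i = 0; i < 20; ++i)
    {
        sample_type samp;
        samp = i/100.0, i/100.0;
        samples.push_back(samp);
        labels.push_back(+1);
    }
    for (int i = 0; i < 20; ++i)
    {
        sample_type samp;
        samp = 2 + i/100.0, 2 + i/100.0;
        samples.push_back(samp);
        labels.push_back(-1);
    }

    // Shuffle samples and labels together so each cross-validation fold sees
    // both halves of the data rather than a single class.
    randomize_samples(samples, labels);

    // krr_trainer stands in for the example's trainer here; only the kernel's
    // gamma needs to be chosen (0.1 is an illustrative value).
    krr_trainer<kernel_type> trainer;
    trainer.set_kernel(kernel_type(0.1));

    std::cout << "4-fold cross validation: "
              << cross_validate_trainer(trainer, samples, labels, 4);

    return 0;
}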
@@ -85,7 +85,7 @@ int main()
 // Now that we have some data we want to train on it. However, there are two parameters to the
 // training. These are the nu and gamma parameters. Our choice for these parameters will
 // influence how good the resulting decision function is. To test how good a particular choice
-// of these parameters are we can use the cross_validate_trainer() function to perform n-fold cross
+// of these parameters is we can use the cross_validate_trainer() function to perform n-fold cross
 // validation on our training data. However, there is a problem with the way we have sampled
 // our distribution above. The problem is that there is a definite ordering to the samples.
 // That is, the first half of the samples look like they are from a different distribution
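This last hunk repeats the same nu/gamma comment in yet another example file (again unnamed on this page). Since the point of the comment is picking good parameter values, here is a hedged sketch of the kind of grid search over cross-validation scores that dlib's SVM example performs; the toy data and the grid values are illustrative, not taken from the commit.

#include <dlib/svm.h>
#include <iostream>
#include <vector>

int main()
{
    using namespace dlib;

    typedef matrix<double, 2, 1> sample_type;
    typedef radial_basis_kernel<sample_type> kernel_type;

    std::vector<sample_type> samples;
    std::vector<double> labels;

    // Toy data so the sketch runs on its own; the real examples generate
    // their own distributions.
    for (int i = 0; i < 30; ++i)
    {
        sample_type samp;
        samp = i/100.0, i/100.0;
        samples.push_back(samp);
        labels.push_back(+1);

        samp = 2 + i/100.0, 2 + i/100.0;
        samples.push_back(samp);
        labels.push_back(-1);
    }

    // Shuffle so every fold contains both classes.
    randomize_samples(samples, labels);

    // nu must stay below maximum_nu(labels) for svm_nu_trainer to accept it.
    const double max_nu = maximum_nu(labels);

    // Try a small grid of gamma and nu values and report the 3-fold
    // cross-validation accuracy for each pair.
    for (double gamma = 0.00001; gamma <= 1; gamma *= 5)
    {
        for (double nu = 0.00001; nu < max_nu; nu *= 5)
        {
            svm_nu_trainer<kernel_type> trainer;
            trainer.set_kernel(kernel_type(gamma));
            trainer.set_nu(nu);

            std::cout << "gamma: " << gamma << "    nu: " << nu << "\n"
                      << "    cross validation accuracy: "
                      << cross_validate_trainer(trainer, samples, labels, 3);
        }
    }

    return 0;
}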