Made some comments more clear

--HG--
extra : convert_revision : svn%3Afdd8eb12-d10e-0410-9acb-85c331704f74/trunk%403309
Davis King 15 years ago
parent afc622bf62
commit ea67271ac2

@@ -220,15 +220,15 @@ int main()
 // Lastly, note that the decision functions we trained above involved well over 100
-// support vectors. Support vector machines in general tend to find decision functions
-// that involve a lot of support vectors. This is significant because the more
-// support vectors in a decision function, the longer it takes to classify new examples.
+// basis vectors. Support vector machines in general tend to find decision functions
+// that involve a lot of basis vectors. This is significant because the more
+// basis vectors in a decision function, the longer it takes to classify new examples.
 // So dlib provides the ability to find an approximation to the normal output of a
-// support vector machine using fewer support vectors.
+// trainer using fewer basis vectors.
 // Here we determine the cross validation accuracy when we approximate the output
-// using only 10 support vectors. To do this we use the reduced2() function. It
-// takes a trainer object and the number of support vectors to use and returns
+// using only 10 basis vectors. To do this we use the reduced2() function. It
+// takes a trainer object and the number of basis vectors to use and returns
 // a new trainer object that applies the necessary post processing during the creation
 // of decision function objects.
 cout << "\ncross validation accuracy with only 10 support vectors: "
@@ -239,7 +239,7 @@ int main()
 << cross_validate_trainer(trainer, samples, labels, 3);
 // When you run this program you should see that, for this problem, you can reduce
-// the number of support vectors down to 10 without hurting the cross validation
+// the number of basis vectors down to 10 without hurting the cross validation
 // accuracy.
