updated the docs

--HG--
extra : convert_revision : svn%3Afdd8eb12-d10e-0410-9acb-85c331704f74/trunk%402231
This commit is contained in:
Davis King 2008-05-13 02:05:00 +00:00
parent 07140ec98d
commit 555310d8e4
4 changed files with 35 additions and 8 deletions


@ -12,14 +12,8 @@
<p>
This page documents library components that are all basically just implementations of
mathematical functions or algorithms without any really significant data structures
associated with them. So this includes things like checksums, cryptographic hashes, sorting, etc...
</p>
<p>
Everything in this section basically follows the same conventions as
the rest of the library. So to get a bigint for example you would need to write something
like <tt>typedef dlib::bigint::kernel_2a bint;</tt> and from then on make your big ints like
<tt>bint my_bigint;</tt>.
associated with them. So this includes things like checksums, cryptographic hashes,
machine learning algorithms, sorting, etc...
</p>
</body>
@ -61,6 +55,7 @@
<item>linear_kernel</item>
<item>decision_function</item>
<item>probabilistic_decision_function</item>
<item>krls</item>
</sub>
</item>
</section>
@ -664,6 +659,25 @@
</description>
</component>
<!-- ************************************************************************* -->
<component>
<name>krls</name>
<file>dlib/svm.h</file>
<spec_file link="true">dlib/svm/krls_abstract.h</spec_file>
<description>
This is an implementation of the kernel recursive least squares algorithm
described in the paper "The Kernel Recursive Least Squares Algorithm" by Yaakov Engel.
<p>
In short, it is an online kernel-based regression algorithm. You give it
samples (x,y) and it learns the function f(x) == y. For a detailed
description of the algorithm, read the above paper.
</p>
</description>
</component>
<!-- ************************************************************************* -->
<component>


@ -115,6 +115,7 @@
<ul>
<li><a href="algorithms.html#mlp">multi layer perceptrons</a> </li>
<li><a href="algorithms.html#svm_nu_train">nu support vector machines</a></li>
<li><a href="algorithms.html#krls">kernel RLS regression</a></li>
<li>Bayesian Network inference algorithms such as the
<a href="algorithms.html#bayesian_network_join_tree">join tree</a> algorithm and
<a href="algorithms.html#bayesian_network_gibbs_sampler">Gibbs sampler</a> Markov Chain Monte Carlo algorithm</li>


@ -12,12 +12,23 @@
<current>
New Stuff:
- Added an implementation of the kernel recursive least squares algorithm
Non-Backwards Compatible Changes:
- Broke backwards compatibility in the directed_graph_drawer's serialization
format when I fixed the bug below.
Bug fixes:
- Fixed two bugs in the directed_graph_drawer widget. First, it sometimes
threw a dlib::fatal_error due to a race condition. Second, the color of
the nodes wasn't being serialized when save_graph() was called.
- Made vector_to_matrix() work for std::vector objects that have non-default
allocators.
Other:
- Added checks so that users get an obvious error message when they
set up the include path incorrectly.
</current>
<!-- ******************************************************************************* -->


@ -374,6 +374,7 @@
<term link="algorithms.html#svm_nu_train" name="support vector machine"/>
<term link="algorithms.html#vector" name="vector"/>
<term link="algorithms.html#point" name="point"/>
<term link="algorithms.html#krls" name="krls"/>
<term link="dlib/svm/svm_abstract.h.html#maximum_nu" name="maximum_nu"/>