
[computer-go] A New Razor

Robin,
I don't know anything about you, but it's clear you've 
done an impressive job of reading the computational 
learning theory literature and forming your own 
opinions. It's also clear, however, that you've missed 
many things in so doing. Neural nets do not have an
infinite VC-dimension. My paper with Haussler in Neural
Computation 1 (1989) proved that the VC-dimension of
neural nets is finite; if I recall correctly, we bounded
it between W and W log N, where W is the number of
weights and N the number of nodes. But the effective
VC-dimension of large neural nets trained by backprop
will be much lower (although nobody knows how to
estimate it theoretically), because backprop doesn't
exercise anywhere near all the degrees of freedom.
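
To make the scale of that bound concrete, here is a
minimal back-of-the-envelope sketch in Python, assuming
the 2*W*log2(e*N) form of the upper bound for
linear-threshold units; the precise constants matter
less than the order, so treat them as illustrative
rather than definitive:

    import math

    def vc_upper_bound(num_weights, num_nodes):
        # Upper bound 2*W*log2(e*N) on the VC-dimension
        # of a feedforward net of N linear-threshold
        # units with W weights (one form of the
        # Baum-Haussler bound).
        return 2.0 * num_weights * math.log2(math.e * num_nodes)

    # Example: 10 inputs, 20 hidden units, 1 output.
    # W counts weights plus thresholds; N counts the
    # computing (non-input) units.
    W = 10 * 20 + 20 * 1 + 21   # 241 adjustable parameters
    N = 20 + 1                  # hidden units plus output
    print("W =", W, "N =", N)
    print("VC-dim <=", round(vc_upper_bound(W, N)))

The point is simply that the bound is finite and grows
only a logarithmic factor faster than the weight count.
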
Likewise, Bayesian methods have their own Occam's razor:
an alternative, perhaps more intuitive, and equally
useful version that arises naturally within the Bayesian
framework.
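
A standard illustration of the Bayesian razor (a minimal
sketch, not taken from the book): compare a
zero-parameter model, a fair coin, against a
one-parameter model with a uniform prior on the bias p.
The free-parameter model must spread its prior over many
possible datasets, so on unremarkable data the simpler
model is favored automatically, with no explicit
complexity penalty:

    from math import lgamma, exp, log

    def log_evidence_fair(heads, n):
        # M0: p fixed at 1/2, no free parameters.
        return n * log(0.5)

    def log_evidence_uniform(heads, n):
        # M1: p ~ Uniform(0,1). The marginal likelihood
        # integrates p^h (1-p)^(n-h) over p, giving
        # B(h+1, n-h+1) = h! (n-h)! / (n+1)!.
        return (lgamma(heads + 1) + lgamma(n - heads + 1)
                - lgamma(n + 2))

    for heads, n in [(52, 100), (70, 100)]:
        bf = exp(log_evidence_fair(heads, n)
                 - log_evidence_uniform(heads, n))
        print("%d/%d heads: Bayes factor (fair vs free-p)"
              " = %.4f" % (heads, n, bf))

With 52 heads in 100 flips the fair coin is favored by a
factor of about 7; with 70 heads the extra parameter
earns its keep and the Bayes factor drops to about 0.002.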

Chapter 4 of *What is Thought?* gives a pedagogical
treatment of the VC-dimension, MDL, and Bayesian routes
to Occam's razor. Chapter 6 discusses other topics, such
as effective dimensions. The treatment is unified, and I
believe it is quite readable even though it conveys many
of the mathematical intuitions; friends of mine from
various walks of life read it in draft.
*What is Thought?* is also organized around the
goal of understanding cognition, which I believe 
makes all of its discussion and data unified and
relatively easy to follow.
With all due respect, I recommend you look at it.
It doesn't make a lot of sense for me to spend
six years writing, much of that effort going into
pedagogical clarity, and then rehash it in dribs and
drabs less clearly over the next six years.
http://www.whatisthought.com