
Re: [computer-go] Weights are very important



On Mon, 12 Jan 2004, Vincent Diepeveen wrote:

> Yet a NN when trained will pick some random value from between -36000 and
> +36000 and start tuning with that.
> 
> As a human we know very well that it should be something between 1 and 40.

If you know a priori that the value should be between 1 and 40, why
on earth would you not initialize your neural net within that range
and let learning refine it from there?  When I seek optima in a design
space with prior knowledge, it's a very interactive process of mutual
education between myself and a learning machine: I set initial conditions
to where I think a good solution might lie, then let the optimizer tell
me what it makes of my choice.  This repeats until I'm satisfied with
the quantity and quality of optima found.  Now, are these hand-tuned
or machine-tuned?
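The seeding-then-refining loop above can be sketched in a few lines. This is a minimal, hypothetical illustration (the target value 17 and the squared-error objective are invented for the example, not anything from a real go evaluator): the weight is initialized inside the prior range [1, 40] rather than an enormous arbitrary one, and gradient descent refines it from there.

```python
import random

def tune(initial, target=17.0, lr=0.1, steps=200):
    """Refine a single weight by gradient descent on a toy
    squared-error objective (w - target)^2, starting from `initial`."""
    w = initial
    for _ in range(steps):
        grad = 2.0 * (w - target)  # derivative of the squared error
        w -= lr * grad
    return w

# Seed within the prior range [1, 40] -- not in [-36000, +36000].
random.seed(0)
w0 = random.uniform(1, 40)
w = tune(w0)
```

Starting inside the believed-good range means the optimizer spends its effort refining a plausible solution instead of first crawling out of a vast, mostly-hopeless region.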

Anyone who expects a present-day machine learning approach to produce
expert-level go play from a blank slate is bound to fail.  How well would
you play go if you were raised by wolves?  Machine learning, like extra
plies of search, is a tool for distilling a lot of computing power into
a bit of additional knowledge, best used when you have gone as far as
you can hand-coding your prior knowledge.  Nothing more, nothing less.

The main practical problem today is that the statistical framework of most
machine learning and the procedural framework of most go programs are not
a good match; hence the importance of work that tries to bridge this gap
one way or the other.

Best,

- nic

-- 
    Dr. Nicol N. Schraudolph                 http://n.schraudolph.org/
    Steinwiesstr. 32                         mobile:  +41-76-585-3877
    CH-8032 Zurich, Switzerland                 tel:      -1-251-3661

_______________________________________________
computer-go mailing list
computer-go@xxxxxxxxxxxxxxxxx
http://computer-go.org/mailman/listinfo/computer-go