
Re: computer-go: Authenticating the identity of a remote go-playing computer program



   From: William Harold Newman <william.newman@xxxxxxxxxxxxxxxxx>

   > You seem to accept that it's ok to use deterministic random numbers
   > (pseudo-random), why can't you use a deterministic measure of time?
   > Because it makes you unhappy?

   Because you haven't convinced me that it's practical to do so. Because
   I'm not aware of any programs which successfully do so, and I think
   that's not a coincidence: trying to do so seems to me to be a severe
   limitation on a program.

It's very obvious that you already understand the basic mechanism to
do this; otherwise you would not have spent so much effort, in the
continuation of your email, forbidding me to use this mechanism.  This
seems to be a pattern with you, because in the first email you did the
same thing by trying to define for me what I was allowed and not
allowed to do.  You argue dirty!

Don't you think it would be more productive to just concede that it's
possible to build a reasonably accurate clock in software?  It doesn't
make any sense for a program like Crafty to do this, because it is far
simpler to just use the real-time clock that comes with every
computer.  That's why you don't see it in general usage.

Since you already understand how it's done (I'll bet you could even
suggest improvements), I'll get right down to the nitty-gritty and
tell you my experiences with implementing such a "clock."

A few years ago, I needed a way to measure the performance of my chess
program.  The trouble was that I would benchmark the program on
several different platforms, and it was a pain because I was
constantly translating run times from one system to the next.  My
first approximation was to simply count the number of moves actually
applied (node counts) and use node counts instead of "clock ticks" or
seconds.

Now this had the very nice advantage that my "times" were
deterministic and reproducible across platforms.  However, it had one
flaw that you already alluded to: in some positions the correlation
between time and node counts was weaker than in others.  I needed
something a little better, because in endgames the program generated
considerably more positions than in the middlegame.  I wanted a
measurement that looked more like clock times.

A simple analysis and profile of the program revealed (obvious in
retrospect) that in endgames the program spends a different proportion
of its time generating moves versus making them.  This is because of
beta cutoffs, I assume.  I changed my calculation accordingly and
discovered that I had an impressively accurate "clock."
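
Roughly, the refinement just charges a generated move a different cost
than a made move.  A sketch of the idea (the names and weights here
are hypothetical, the kind of thing you'd pick by profiling on one
reference machine, not the numbers from my program):

    #include <stdint.h>

    static uint64_t moves_made      = 0;  /* bumped in make_move()      */
    static uint64_t moves_generated = 0;  /* bumped in generate_moves() */

    /* Hypothetical per-event costs, calibrated once on a reference
     * machine (microseconds on that machine). */
    #define COST_MAKE_US  1.00
    #define COST_GEN_US   0.15

    /* Deterministic estimate of elapsed time.  Endgame positions,
     * which generate many more moves per node, are now charged for
     * that extra generation work instead of being under-counted. */
    static double virtual_elapsed_us(void)
    {
        return (double)moves_made      * COST_MAKE_US
             + (double)moves_generated * COST_GEN_US;
    }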

Even though I never actually used this method for time control in
real games, I could have without substantially weakening my program.
True, it is not as accurate as the computer's clock, but it was more
than accurate enough to create a reliable time control algorithm.
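
Hooking it into time control would look something like this: set a
budget in virtual units before each move, and have the search poll the
virtual clock instead of the real one.  Again, only an illustrative
sketch, not code from my program:

    #include <stdbool.h>

    extern double virtual_elapsed_us(void);  /* from the sketch above */

    /* Budget for the current move, set before each search from the
     * remaining (virtual) game time. */
    static double move_budget_us;

    /* Polled every few thousand nodes inside the search. */
    static bool out_of_time(void)
    {
        return virtual_elapsed_us() >= move_budget_us;
    }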

It turned out that I got more enthused about the idea and found some
more ways to improve the accuracy of this clock, but most of the
accuracy was already obtained by taking just two measurements: node
counts and moves generated.
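
One reasonable way to get those two weights, sketched below and not
necessarily the calibration I actually did: time a small suite of
benchmark positions once on a reference machine, record the wall time
and both counters for each position, and do an ordinary least-squares
fit of time against the two counters.

    /* One calibration run per benchmark position. */
    struct sample { double secs, made, generated; };

    /* Least-squares fit of  secs ~= w_make*made + w_gen*generated
     * by solving the 2x2 normal equations.  Assumes the suite is not
     * degenerate (det != 0). */
    static void fit_weights(const struct sample *s, int n,
                            double *w_make, double *w_gen)
    {
        double mm = 0, mg = 0, gg = 0, ms = 0, gs = 0;
        for (int i = 0; i < n; i++) {
            mm += s[i].made      * s[i].made;
            mg += s[i].made      * s[i].generated;
            gg += s[i].generated * s[i].generated;
            ms += s[i].made      * s[i].secs;
            gs += s[i].generated * s[i].secs;
        }
        double det = mm * gg - mg * mg;
        *w_make = (ms * gg - mg * gs) / det;
        *w_gen  = (mm * gs - mg * ms) / det;
    }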

Of course a different program, such as a go program, might require
different kinds of instrumentation to obtain an accurate clock in
software, but the principles are the same.

Yes, I read what you said about unpredictable memory access patterns
and can only say that I think you can engineer around this.  I think
you can always end up with a reasonably accurate clock, which is all I
really want to do.  We could argue this one indefinitely, because now
we are talking about greyer areas.  I concede that a clock in hardware
will always be more accurate, but I've never claimed otherwise.  I
will also concede that it takes a lot more effort to do things this
way, and I've also never claimed otherwise.  I'll concede that it's
not very practical, but I've never claimed otherwise.  I'll also
concede that someone is going to send me another email asking me why I
think everyone should have to rewrite their program, but I never said
this!  Really.


   If you can't show a successful example of your design philosophy, I'll
   be predisposed to be skeptical. However, you could still start
   convincing me by explaining, in somewhat more detail than you have so
   far, how you propose to make your application code estimate the
   elapsed runtime in the presence of nontrivial, input-data-dependent
   memory access patterns.

Your strongest argument against deterministic Go programs has to do
with multiple threads of execution, especially in parallel programs.
I have been waiting for someone to bring this one up and was prepared
to "resign" as soon as it was mentioned.  I don't know of any way to
make a parallel alpha/beta search deterministic on a modern parallel
computer, and I wouldn't want to see this kind of program banned from
competition.  So there!!


Don