RE: computer-go: reducing the number of search strings
On Mon, 27 Aug 2001, Nicol Schraudolph wrote:
> On Mon, 27 Aug 2001, Petersen Kjeld-WFKP1396 wrote:
>
> > The thing that made the program so fast was the
> > method used for the calculations. The fractals were
> > calculated in integer arithmetic instead of reals.
> > Has anybody made an integer version of a NN to
> > speed up a Go program?
>
> A good way to speed up neural net calculations is
> to use the multimedia extensions (MMX, AltiVec, etc.)
> of modern CPUs. Going to integer arithmetic (as
> e.g. MMX requires) is problematic, though, because
> the gradient calculations must be quite precise
> for learning subtle, complex problems. IMO this
> is opening a can of worms, and not worth it given
> that new CPUs nowadays supply floating-point
> multimedia extensions.
I agree completely with Nici. In particular, if you
are using fully connected neural networks and write
your NN code as matrix operations, then you can use
a highly optimized matrix library such as ATLAS
(http://netlib.org/atlas), which can provide nearly
optimal performance on these new processors. I
seem to recall seeing figures of over 1.2 GFLOP/s
for double precision on modern machines and over
4 GFLOP/s for single precision; see:
http://www.netlib.org/atlas/atlas-comm/msg00194.html
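
To make that concrete, here is a minimal C sketch of a
fully connected layer's forward pass over a batch of
inputs, written against the standard CBLAS interface
(which ATLAS implements). The function name, row-major
layout, and sigmoid activation are illustrative choices
on my part, not code from any particular Go program:

#include <math.h>
#include <cblas.h>

/* Forward pass for one fully connected layer over a batch.
 *   in  : batch x n_in   input activations  (row-major)
 *   w   : n_in  x n_out  weight matrix      (row-major)
 *   out : batch x n_out  output activations (overwritten)
 * Doing the whole batch as one SGEMM lets ATLAS apply its
 * cache blocking and the CPU's floating-point SIMD units.
 */
void layer_forward(int batch, int n_in, int n_out,
                   const float *in, const float *w, float *out)
{
    /* out = in * w : all batch*n_in*n_out multiply-adds
     * happen inside a single optimized library call */
    cblas_sgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                batch, n_out, n_in,
                1.0f, in, n_in, w, n_out,
                0.0f, out, n_out);

    /* elementwise sigmoid activation */
    for (int i = 0; i < batch * n_out; i++)
        out[i] = 1.0f / (1.0f + expf(-out[i]));
}

Batching the inputs is what turns many per-example
matrix-vector products into one big matrix-matrix
multiply, which is exactly the case these libraries
are tuned for.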
There are (at least) two good papers on implementing
neural networks using matrix operations:
- D. Anguita, G. Parodi, and R. Zunino. An efficient
implementation of BP on RISC-based workstations.
Neurocomputing, 6:57-65, 1994.
- J. Bilmes, K. Asanovic, C. Chin, and J. Demmel.
Using PHiPAC to speed error back-propagation
learning. In Proc. ICASSP'97, 1997.
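
The common idea in both papers is that, once you batch
the training examples, the backward pass also reduces
to matrix products. A rough sketch of the matching
gradient computation for the layer above (again my own
naming and layout, not code from either paper):

/* Backward pass for the same layer (gradients over the batch).
 *   d_out : batch x n_out  gradient w.r.t. the layer output,
 *           assumed to already include the activation derivative
 *   d_w   : n_in  x n_out  weight gradient (overwritten)
 *   d_in  : batch x n_in   gradient passed to the previous layer
 */
void layer_backward(int batch, int n_in, int n_out,
                    const float *in, const float *w,
                    const float *d_out, float *d_w, float *d_in)
{
    /* d_w = in^T * d_out : weight gradients summed over the batch */
    cblas_sgemm(CblasRowMajor, CblasTrans, CblasNoTrans,
                n_in, n_out, batch,
                1.0f, in, n_in, d_out, n_out,
                0.0f, d_w, n_out);

    /* d_in = d_out * w^T : propagate the error backwards */
    cblas_sgemm(CblasRowMajor, CblasNoTrans, CblasTrans,
                batch, n_in, n_out,
                1.0f, d_out, n_out, w, n_out,
                0.0f, d_in, n_in);
}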
Cheers,
Carl
_________________________________________________
Carl Staelin
Senior Research Scientist
Hewlett-Packard Laboratories
Technion City
Haifa, 32000
ISRAEL
+972(4)823-1237x221 +972(4)822-0407 fax
staelin@xxxxxxxxxxxxxxxxx
_______http://www.hpl.hp.com/personal/Carl_Staelin_______