This is the Vowpal Wabbit fast online learning code. It is Copyright
(c) 2009 Yahoo! Inc., and released for use under the BSD (revised) open
source license. Contributing authors are John Langford (primary), Lihong Li,
Alex Strehl, Shubham Chopra, and Gordon Rios. This is the second VW
release, and our intention is to create an open source project.
The code is checked into GitHub.
The nonlinear (preconditioned) conjugate gradient implementation now
works on clusters (both Hadoop-based and otherwise).
To build the code, run make.
At a high level, the code operates by repeatedly executing something
equivalent to the MPI AllReduce function: adding up floats from all
nodes, then broadcasting the sum back to each individual node. To do
this, a spanning tree is constructed over the nodes; each node
communicates only with its parent and its children in the tree.
Test suite for vw:
You may add arbitrary (train/test/varying-options) tests. Each test
consists of:
1) Data (train, test, predict) files
2) The vw command to run using the above data files
3) Expected (reference) STDOUT, STDERR, and predictions files
   for the command.
Additional tests can be added below the __DATA__ section in RunTests.
See the comment above __DATA__ in 'RunTests' for details.
Software Copyright License Agreement (BSD License)
The copyrights to the software code for Vowpal Wabbit are covered by
the BSD (revised) license noted above.