Java
This doc describes the basic usage of the Java API for Vowpal Wabbit. The Java API uses the Java Native Interface (JNI) to communicate between Java and the native Vowpal Wabbit code. The philosophy of this layer is to expose a simple, type-safe, thread-safe interface so that Java developers can use Vowpal Wabbit for learning and inference.
(Tested on 64-bit Ubuntu 20.04)
In addition to the VW dependencies, the Java Development Kit (JDK) and Apache Maven need to be installed. For example, on Ubuntu 20.04, they can be installed with apt:
sudo apt install default-jdk
sudo apt install maven
Although the Java API is supported, its build is disabled by default. To enable the Java build, turn on the CMake option BUILD_JAVA as follows:
git clone --recursive https://github.com/VowpalWabbit/vowpal_wabbit.git
cd vowpal_wabbit/
mkdir build
cd build
cmake -DBUILD_JAVA=ON ..
make -j$(nproc)
It is also possible to turn on STATIC_LINK_VW_JAVA to link the vw-jni library statically.
After executing the commands above, JAR files will be created under the directory vowpal_wabbit/java/target/. For example, vw-jni-<VERSION>.jar can then be used to add VW to other projects.
Note that the native dependencies are included in the JAR file. In case this does not work (because of this issue), the compiled JNI library can be found at java/target/bin/natives/*vw_jni*, where the * parts are platform-specific. For example, on 64-bit Ubuntu, the shared library is linux_64/libvw_jni.so, which then needs to be made available on the java.library.path.
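As a quick sanity check, the small program below (an illustrative sketch, not part of the VW API) prints the directories the JVM searches for native libraries; the directory containing libvw_jni.so should appear there, or be passed via -Djava.library.path=<dir> when running.
class LibraryPathCheck {
    public static void main(String[] args) {
        // Print the JVM's native library search path (java.library.path).
        System.out.println(System.getProperty("java.library.path"));
    }
}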
Finally, a CI build pipeline can be found here.
Instead of building from source, published JAR files (built for linux_64) can be downloaded from the Maven repository.
To verify that the JAR works, execute the following command (after replacing <VERSION>) to run a test VW program:
java -cp vw-jni-<VERSION>.jar vowpalWabbit.VW
The example program below demonstrates the same regression problem on a house dataset as in this VW tutorial.
import vowpalWabbit.learner.*;

class Example {
    public static void main(String[] args) {
        // Create a model.
        VWScalarLearner learner = VWLearners.create("-f model.vw");
        // Learn some data.
        learner.learn("0 | price:.23 sqft:.25 age:.05 2006");
        learner.learn("1 2 'second_house | price:.18 sqft:.15 age:.35 1976");
        learner.learn("0 1 0.5 'third_house | price:.53 sqft:.32 age:.87 1924");
        // Closing finalizes the model and frees up the native memory.
        try {
            learner.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
        // Use the model for inference.
        learner = VWLearners.create("-i model.vw -t --quiet");
        // Compute a prediction.
        float prediction = learner.predict("| price:0.23 sqft:0.25 age:0.05 2006");
        // Output the prediction result.
        System.out.println(prediction);
        // Close and cleanup.
        try {
            learner.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Assuming the JAR file (vw-jni-<VERSION>.jar) is in the same directory as this example Java program, compile and run it as follows:
javac -cp vw-jni-<VERSION>.jar Example.java
java -cp .:vw-jni-<VERSION>.jar Example
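The close() calls in the Example above release the native memory held by the learner. Because the learner exposes close(), it can also be managed with try-with-resources; the following is a minimal sketch of the inference step, assuming the learner types implement AutoCloseable:
import vowpalWabbit.learner.VWLearners;
import vowpalWabbit.learner.VWScalarLearner;

class InferenceExample {
    public static void main(String[] args) throws Exception {
        // The try-with-resources block closes the learner and frees the native
        // memory automatically, even if predict() throws.
        try (VWScalarLearner learner = VWLearners.create("-i model.vw -t --quiet")) {
            System.out.println(learner.predict("| price:0.23 sqft:0.25 age:0.05 2006"));
        }
    }
}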
As mentioned above, the JNI library (e.g., libvw_jni.so) might need to be specified manually on certain platforms (-Djava.library.path=<PATH_TO_VW_JNI>). More examples of using other learners can be found in the tests.
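As one illustration, a multiclass learner can be used in much the same way as the scalar learner above. The sketch below assumes a VWMulticlassLearner type whose predict returns the predicted class as an int; see the tests for the exact learner classes and prediction types.
import vowpalWabbit.learner.*;

class MulticlassExample {
    public static void main(String[] args) throws Exception {
        // One-against-all with 3 classes; --quiet suppresses diagnostic output.
        VWMulticlassLearner learner = VWLearners.create("--oaa 3 --quiet");
        learner.learn("1 | a b c");
        learner.learn("2 | d e f");
        learner.learn("3 | g h i");
        // Assumption: predict returns the predicted class as an int.
        int predictedClass = learner.predict("| a b c");
        System.out.println(predictedClass);
        learner.close();
    }
}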
To improve performance when hosting VW in Spark, an additional optimized layer is available in org.vowpalwabbit.spark.*. An example of its usage can be found in the test here.
In the future, we would like to converge on a single unified Java API.
There are also other open-source Java wrappers for VW, such as vowpal-wabbit-java, which you may wish to use.
We always welcome community contributions to the Java interface of VW. Thank you to all community contributors.