Using active_interactor.py
Start vw in active learning mode, listening on a port, together with any other options you need:
vw --active --port 6075
active_interactor.py talks to vw in plain text over a TCP socket. It needs to know where vw is listening and needs a set of unlabeled examples. These should be in the text VW format, except that the label should be missing. Suppose vw is running on the same machine, invoked as above, and the unlabeled data is in the file unlabeled.dat. Then
python active_interactor.py localhost 6075 unlabeled.dat
will connect to vw and start sending the unlabeled examples in order. For each example that vw wants labeled, the user can specify the label (0,1) or 'skip', in which case no label is provided and it is (almost) as if the unlabeled dataset did not contain the example.
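To make the expected input concrete, here is a sketch of what unlabeled.dat might contain for a small text-classification task. The tags and feature names are invented for illustration; the only requirement is that each line omits the label and otherwise follows the usual VW text format (optional 'tag, then namespaces and features):

```
'doc_1 |words excellent:2 acting:1 plot:1
'doc_2 |words dull:1 overlong:1
'doc_3 |words excellent:1 dull:1
```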
The following options are currently supported by active_interactor.py:
- -s labeled_set. Uses the labeled examples in labeled_set to seed vw's model prior to sending any unlabeled examples.
- -v. Verbose mode. Prints the unlabeled example before asking for the label (otherwise only a line number and a tag are shown). This makes sense for text classification with data in close-to-raw form.
- -m. Interprets a label of 0 as -1.
- -o output. Writes the user-provided labels (and corresponding examples) to a (line-buffered) file. The absence of full buffering allows one to hit Ctrl-C at any time and still have all labels saved in the output file.
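Putting these together, an invocation that seeds the model from previously labeled data, maps 0 labels to -1, and saves the collected labels might look like the following. Here seed.dat, unlabeled.dat, and labels.out are placeholder file names, and it is assumed that the flags precede the positional host, port, and unlabeled-data arguments:

```
python active_interactor.py -m -s seed.dat -o labels.out localhost 6075 unlabeled.dat
```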
Choice of hyperparameters is tricky. If you have a labeled subsample, use it together with --active --simulation to discover a good value for --mellowness and the learning rate.
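For example, assuming the labeled subsample is in a file labeled.dat, one could try a few settings and compare the reported progressive losses; the particular values below are illustrative, not recommendations:

```
for m in 0.000001 0.0001 0.01; do
  vw --active --simulation --mellowness $m -l 0.5 -d labeled.dat
done
```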