NLP model API tests broken #552
The simple labels and hello classification tests for the sensor_knn.json config fail: 52.98% vs. 66.2% and 60% vs. 80%, respectively. @subutai, would you please confirm you get the same?

Comments
If I run: I get 90%. Do you not get the same? I'm running simple_labels.py now (going slowly because I need to rebuild my cache).
No, I get 70%. I ran again without using my cache and got the same results. I notice there's a difference between the hello_classification.py and simple_labels.py implementations: doc 2 uses "lettuce" in the former but "kale" in the latter. I think it should be "kale" in both, since they both use "kale" in doc 6. (A small consistency-check sketch follows.)
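As a minimal sketch (not from the repo), this is roughly how one could spot mismatched sample documents between the two scripts; the document lists here are hypothetical placeholders standing in for whatever hello_classification.py and simple_labels.py actually use:

```python
# Hypothetical helper for comparing the sample documents used by two test scripts.
def diff_documents(docs_a, docs_b):
  """Return (index, doc_a, doc_b) tuples for positions where the lists disagree."""
  return [(i, a, b) for i, (a, b) in enumerate(zip(docs_a, docs_b)) if a != b]


if __name__ == "__main__":
  # Placeholder document lists; substitute the real ones from each script.
  helloDocs = ["doc 0 ...", "doc 1 ...", "I like lettuce in my salad."]
  simpleDocs = ["doc 0 ...", "doc 1 ...", "I like kale in my salad."]
  for i, a, b in diff_documents(helloDocs, simpleDocs):
    print("doc %d differs: %r vs %r" % (i, a, b))
```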
I just ran:
and got 66.2%. With regards to kale vs. lettuce, I don't think it matters too much; results should be the same either way.
@BoltzmannBrain, what is the exact command you used?
This config file is something I've used locally and isn't included in the repo / API tests (yet). It's very similar to imbu_sensor_tm_simple_tp.json, and the error isn't reproduced with that config. I've removed that part of this issue; sorry for the confusion.
Okay, it looks like I have something locally that is to blame. Just to be sure, which nupic and nupic.core are you using? I'm on nupic 0.5.3dev0 and nupic.core 0.4.3dev0.
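A minimal sketch (not part of the API tests) for printing installed package versions, so both sides can confirm they're comparing results on the same builds. The distribution names "nupic" and "nupic.bindings" (the nupic.core Python bindings) are assumptions; adjust to whatever is installed locally:

```python
import pkg_resources

# Report the installed version of each distribution, or note if it's missing.
for name in ("nupic", "nupic.bindings"):
  try:
    print("%s %s" % (name, pkg_resources.get_distribution(name).version))
  except pkg_resources.DistributionNotFound:
    print("%s not installed" % name)
```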