multiple selections on parcoords axes #2415
Conversation
- replaced d3.svg.brush with our own implementation supporting multiple selections
- eliminated non-brush related logic from axisbrush.js for eventual reusability
- general code improvements in areas that received attention
src/traces/parcoords/attributes.js
Outdated
@@ -80,8 +80,22 @@ module.exports = {
    valType: 'info_array',
    role: 'info',
    items: [
        {valType: 'number', editType: 'calc'},
        {valType: 'number', editType: 'calc'}
    {
This just shows the current structure; there won't be a limit of two specified intervals.
(for my own notes, @monfera don't worry about this unless it's blocking you)
If we merge in master, particularly #2399, we can use 2D and/or unbounded free-length info_array.
I'm not sure of the best way to handle it though - since the vast majority of cases will still use a single range, I'd like to keep constraintrange 1D rather than forcing people to wrap it into 2D. So the two options that occur to me are (sketched just below this list):
- make it a length-2N array, ie [min1, max1, min2, max2, min3, max3...] - this would be fairly simple for us to work with, and seems easy to describe to users; not sure how intuitive it will be for them to use though.
- make a separate 2D attribute constraintranges that we switch to when there are multiple ranges. That would be more plumbing on our side, and we'd have to decide which one wins in a conflict, but the structure may be more intuitive.
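A rough sketch of what those two shapes might look like to a user (hypothetical values; the constraintranges attribute in option 2 doesn't exist yet):

```js
// Hypothetical illustration of the two proposals above - not a final API.
var someValues = [0.5, 2.3, 4.4, 6.1, 8.2, 9.9]; // arbitrary example data

// option 1: flat length-2N constraintrange
var dimFlat = {
    label: 'A',
    values: someValues,
    constraintrange: [0, 1, 4, 5, 8, 9]  // [min1, max1, min2, max2, min3, max3]
};

// option 2: a separate 2D constraintranges attribute for the multi-range case
var dimNested = {
    label: 'B',
    values: someValues,
    constraintranges: [[0, 1], [4, 5], [8, 9]]  // hypothetical attribute name
};
```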
@etpinard thoughts?
Users should not have to make constraintrange 2D to set a single constraint.
I'd like to add a third option: make constraintrange 1D OR 2D. This would be a first for info arrays, but note that some data arrays already use this pattern (e.g. heatmap x and y).
Another aspect is that the functionality was initially discussed specifically for ordinal dimensions, where it'd be natural to simply list values, e.g. [1, 5, 8, 12], and even to mix singular values and ranges, like [1, [5, 8], 12] meaning 1, 5 to 8, and 12. It would be possible to support this, but it wouldn't blend well with using a 1D array for a single range, because then a 1D array with two elements could be interpreted as either a single range or as two discrete values.
src/traces/parcoords/axisbrush.js
Outdated
var intData = getInterval(b, d.unitScaleInOrder, y);
var unitRange = intData.interval;
var pixelRange = unitRange.map(d.unitScaleInOrder);
var s = b.svgBrush2;
An upcoming commit will remove the 2 postfix here and there (we used to have both the D3 brush and our own brush for manual regression testing, but not anymore).
src/traces/parcoords/parcoords.js
Outdated
@@ -218,13 +233,36 @@ function viewModel(model) {
    xScale: xScale,
    x: xScale(i),
    canvasX: xScale(i) * canvasPixelRatio,
    unitScale: unitScale(height, c.verticalPadding),
    // fixme remove the old unitScale
unitScale is still being used for its inverse mapping due to how y values arrive from the underlying d3 interaction callbacks, but it could in theory be unified, simplifying the current two-step map, e.g. var y = d.unitScaleInOrder(d.unitScale.invert(d3.mouse(this)[1]));
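A rough sketch of what that unification could look like, assuming d3 v3-style scales as used elsewhere in parcoords (pixelToUnitInOrder and the padding/reversal handling are hypothetical):

```js
// Hypothetical sketch only: fold the invert + re-map into one scale so the
// pixel y coming from the d3 callback maps straight to ordered unit coordinates.
function pixelToUnitInOrder(height, verticalPadding, reversed) {
    return d3.scale.linear()
        .domain([height - verticalPadding, verticalPadding])  // pixel y, top-down
        .range(reversed ? [1, 0] : [0, 1]);                   // ordered unit coords
}

// which would turn the two-step map into something like:
// var y = d.pixelToUnitInOrder(d3.mouse(this)[1]);
```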
- added bitmask tests while retaining the original 1D 'bounding box' test
- removed SPLOM work seed
- due to lightening up some of the shaders, do less work on the JS side too
  -> baseline case (1 range): 1/3 times the work in the vertex shader
  -> new use (>1 ranges): still rather fast
temporarily, so they'll work until we sort out the constraintrange attribute and clean up remaining brush/brush2 cruft
- more precise mapping of constraints to data - which enables decreased filterEpsilon
- only make the constraint mask texture once per render - and update it on subsequent renders, so we're not leaking THAT
- let its height be independent of canvasHeight, and set at 2K
- speed up mask generation by only unsetting, not unsetting and resetting
- always overshoot ordinal values for clarity
- when picking ordinal values, map closer to what was already shown instead of often jumping to cover an extra value
- use new constraintrange info_array 1-2 dimension format
- add multiselect attribute
- some further cleanup of conversions
- image test to cover most of multiselect logic & precision
- interaction tests of ordinal and continuous multiselect
- put constraint ranges back into existing mocks
@@ -62,6 +62,8 @@
    },
    {
        "label": "Min height width",
        "range": [200000, -30000],
        "constraintrange": [0, 100000],
This is covered by the gl2d_parcoords_constraints mock too, but I added a reversed-axis constraint to this one during my tests.
"constraintrange": [[-10, -5], [-1, 1], [5, 10]], | ||
"values": [ | ||
-9, -9, -7, -7, | ||
-10.001, 10.001, -4.98, 4.98, -10, 10, -5, 5, |
Nearly every sample on this plot is testing something... the values arrays are broken into lines based on which dimension is being tested. This dimension is testing that precision on the outer bounds of the constraint range is limited only by floating point (OK, I could have added another zero or two, 10.00001 or so...), and that precision in the gaps between selection ranges is not as good, but this test (4.98) shows at least one thousandth of the range. The number of vertical pixels is 2048, and as expected, if I move to 4.99 one of these values gets incorrectly marked valid. Notice that for this test to pass, all the lines going to -0.7 in the preceding and following dimensions should be grey, and all the lines going to 0.7 should be colored.
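A quick back-of-the-envelope check of that quantization claim, assuming this axis spans roughly [-10, 10] and the mask is 2048 rows tall (the 2K height mentioned in the commit notes above):

```js
// Rough check of the quantization argument; the axis span is an assumption.
var span = 10 - (-10);      // ~20 data units across the axis
var rowSize = span / 2048;  // ~0.0098 data units per mask row
// 4.99 sits within one row (~0.01) of the constraint edge at 5, so it can be
// quantized into the [5, 10] bucket; 4.98 is about two rows away and stays
// correctly filtered out.
```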
I didn't notice any performance issue with making the filter mask a fixed 2048 height, since it only has a width of 8. In principle we should be able to double that if more precision is required.
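As an illustration of the idea only (not the actual plotly.js code; the layout, names, and bit packing below are assumptions), a tall, narrow RGBA mask can encode multiple constraint ranges per dimension like this:

```js
// Illustrative sketch - not the plotly.js implementation.
// One row per quantized unit-scale position, one bit per dimension.
var MASK_HEIGHT = 2048;  // quantization of the [0, 1] unit scale
var MASK_WIDTH = 8;      // 8 RGBA texels per row -> 8 * 4 * 8 = 256 bits per row

function makeMask(constraintRangesPerDim) {
    // start fully selected, then only unset rows that fall outside every range
    var mask = new Uint8Array(MASK_HEIGHT * MASK_WIDTH * 4).fill(255);
    constraintRangesPerDim.forEach(function(ranges, dim) {
        var byteIndex = dim >> 3;  // which byte of the row holds this dimension's bit
        var bit = 1 << (dim & 7);  // which bit within that byte
        for(var row = 0; row < MASK_HEIGHT; row++) {
            var u = (row + 0.5) / MASK_HEIGHT;  // unit-scale value represented by this row
            var inside = ranges.some(function(r) { return u >= r[0] && u <= r[1]; });
            if(!inside) mask[row * MASK_WIDTH * 4 + byteIndex] &= ~bit;
        }
    });
    // upload as a MASK_WIDTH x MASK_HEIGHT RGBA texture and sample it per line in the shader
    return mask;
}

// e.g. dimension 0 constrained to [0, 0.25] and [0.6, 0.7] on the unit scale:
var mask = makeMask([[[0, 0.25], [0.6, 0.7]]]);
```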
@etpinard I need to clean up some syntax and other test issues, but calling this reviewable as tests pass locally. Have a look!
…ontexts turns out my test case had unrealistic unique sorted data...
hopefully to avoid hangup due to webgl context destroy/create overload
@@ -8,7 +8,7 @@ var attributes = require('@src/traces/parcoords/attributes');
var createGraphDiv = require('../assets/create_graph_div');
var delay = require('../assets/delay');
var destroyGraphDiv = require('../assets/destroy_graph_div');
var fail = require('../assets/fail_test');
FYI fail is a Jasmine global; we were overloading it here, so among other things, if you don't bring in assets/fail_test but thought you had, linters won't catch it. We should probably change that in the rest of our test suites too...
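A minimal sketch of the rename pattern being discussed (the local name failTest and the promise under test are placeholders, not the actual diff):

```js
// Hypothetical sketch: alias the helper so it no longer shadows Jasmine's global fail().
var failTest = require('../assets/fail_test');

it('rejects cleanly', function(done) {
    somePromiseReturningCall()  // placeholder for the code under test
        .catch(failTest)        // explicit helper; without the require, this would
                                // silently fall through to Jasmine's global fail()
        .then(done);
});
```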
I can’t figure out what’s going on with these tests. Seems like since I
naive solution, cleaner one coming next, just want to see if this lets our CI tests run
and try reinstating my new tests from noCI-land
@etpinard check it out, I added click-to-select for ordinal ranges. Should be ready to go.
All right. Well, this is great! We should probably rename this PR Parcoords 2.0. That click-to-select functionality is pretty sweet: I won't pretend to understand more than 10% of the parcoords code (I would appreciate a walkthrough at some point 😉 ), but after some thorough testing, this thing deserves a 💃
The branch so far contains the SVG part but not yet the WebGL part. Upcoming commits will have constraintrange diverted to a new meaning.
-> note: the current attributes.js specification for constraintrange is just a hint at the current structure but not what it'll end up being, e.g. it now assumes two intervals

Changes:
Due to the need to support very numerous intervals (up to 1000 per axis) and the possibility of up to 60 axes, the SVG solution has to be economical, otherwise thousands of DOM elements would be added. I found that SVG slowness has a nontrivial component that scales roughly with the count of concurrently existing DOM elements; other experiments show the technique of using a single DOM element for showing a lot of glyphs. The current solution uses two layers of stroke-dasharray to represent filter coverage, but it's fairly easy to switch to e.g. path (which would need points, i.e. a longer d than the current stroke-dasharray).

Subselections are not fused during drag interaction, but they do get fused on mouse release.
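As a rough illustration of the stroke-dasharray technique (a minimal sketch with made-up pixel values, not the parcoords source):

```js
// Minimal sketch, not plotly.js source: render several selected intervals on one
// axis as a single SVG line whose stroke-dasharray alternates gaps and dashes.
function dashArray(ranges, totalLength) {
    var parts = [0];  // zero-length leading dash so the pattern starts with a gap
    var cursor = 0;
    ranges.forEach(function(r) {
        parts.push(r[0] - cursor);  // gap up to the start of this interval
        parts.push(r[1] - r[0]);    // drawn segment covering the interval
        cursor = r[1];
    });
    parts.push(totalLength - cursor);  // trailing gap
    return parts.join(' ');
}

// example: three selected pixel intervals on a 400px tall axis
var selections = [[40, 90], [150, 210], [300, 360]];
d3.select('svg').append('line')
    .attr({x1: 100, x2: 100, y1: 0, y2: 400})
    .style({
        stroke: 'steelblue',
        'stroke-width': 8,
        'stroke-dasharray': dashArray(selections, 400)
    });
```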
More description to come. The code is liberally commented and I'm glad to add more.