The space (and construction-time) complexity of a `Db` for a type whose `#[venndb(filter)]` fields take many (but not all) unique values appears to be O(n²), which makes constructing (and holding) such a `Db` quickly infeasible for collections with more than ~100,000 unique values.
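For concreteness, here is a minimal sketch (my own illustration, not venndb's actual code) of why a dense per-value bitmap index grows quadratically: with n rows that each carry a distinct filter value, the index ends up holding n bitmaps of n bits each, i.e. n² bits.

```rust
use std::collections::HashMap;

// Hypothetical dense per-value bitmap index (illustration only).
// Each distinct filter value gets its own bitmap with one bit per row.
fn build_dense_index(values: &[String]) -> HashMap<String, Vec<bool>> {
    let n = values.len();
    let mut index: HashMap<String, Vec<bool>> = HashMap::new();
    for (row, value) in values.iter().enumerate() {
        // Every previously unseen value allocates a fresh n-bit bitmap,
        // so |unique values| * n bits total: quadratic when most values
        // are unique.
        index
            .entry(value.clone())
            .or_insert_with(|| vec![false; n])[row] = true;
    }
    index
}
```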
Among other solutions, I wonder if we could use a compressed bitset implementation, such as `roaring` or `hi_sparse_bitset`, to better handle this scenario with little to no performance loss in the "large number of total entries, small number of unique filter field values" case.
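As a rough sketch of what this could look like with the `roaring` crate (the `HashMap`-based index here is purely for illustration; I haven't matched it to venndb's internals):

```rust
use roaring::RoaringBitmap;
use std::collections::HashMap;

// Same index shape as above, but with compressed bitmaps: long runs of
// absent rows compress well, so a mostly-empty per-value bitmap stays
// small instead of costing n bits.
fn build_roaring_index(values: &[String]) -> HashMap<String, RoaringBitmap> {
    let mut index: HashMap<String, RoaringBitmap> = HashMap::new();
    for (row, value) in values.iter().enumerate() {
        index
            .entry(value.clone())
            .or_insert_with(RoaringBitmap::new)
            .insert(row as u32);
    }
    index
}

// Filter queries should stay cheap: combining two filters is a bitmap AND.
fn rows_matching(a: &RoaringBitmap, b: &RoaringBitmap) -> RoaringBitmap {
    a & b
}
```

In the "few unique values" case the per-value bitmaps are dense, which roaring also handles well, so intersection speed should remain comparable.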
If adjusting the current implementation isn't desirable, this could instead be gated behind a feature flag or opted into via a derive attribute.
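For example, an opt-in derive attribute might look something like this (the `compressed` argument is purely hypothetical and not something venndb defines today):

```rust
use venndb::VennDB;

#[derive(Debug, VennDB)]
pub struct Employee {
    #[venndb(key)]
    id: u32,
    #[venndb(filter)]             // current dense bitmap behavior
    department: String,
    #[venndb(filter, compressed)] // hypothetical: roaring-backed bitmap
    city: String,
}
```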
I'd be happy to take a look at feasibility if it's something that would be considered 😃