Context:
Designing a B-tree database.
I have a table with, say, 12 columns (A, B, C, ...). These columns could be varchar or int, in any order (in the case of multiple tables).
When running a query, any of these columns could be constrained to a value, but not necessarily all of them.
And I can't statistically predict which of them will be set. The values in the columns are random/unpredictable and cause fragmentation.
If the dataset is large enough (several gigabytes and up) that I want to avoid linear scans, what would be the best indexing strategy here?
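To make the problem concrete, here is a rough Python sketch (the three-column data and names are just an illustration, not my schema): a composite B-tree index only helps when the constrained columns form a leading prefix of its key, so no single index covers arbitrary combinations.

```python
import bisect

# Hypothetical rows (A, B, C); names and data are made up for illustration.
rows = [(1, "x", 10), (1, "y", 5), (2, "x", 7), (3, "z", 1)]

# A composite index on (A, B, C) is essentially the rows kept in sorted order.
index_abc = sorted(rows)

def prefix_range(index, prefix):
    """Range-scan for rows whose leading key columns equal `prefix`.
    Only works when the constrained columns form a prefix of the index key."""
    n = len(prefix)
    keys = [row[:n] for row in index]   # leading columns only, still sorted
    lo = bisect.bisect_left(keys, prefix)
    hi = bisect.bisect_right(keys, prefix)
    return index[lo:hi]

# Constraint on A (a leading prefix of the key): cheap range scan.
print(prefix_range(index_abc, (1,)))              # [(1, 'x', 10), (1, 'y', 5)]

# Constraint on C alone: the (A, B, C) order doesn't help; only a full scan remains.
print([row for row in index_abc if row[2] == 7])  # [(2, 'x', 7)]
```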
Methods:
1. Should I create a table for every possibility (in effect creating a composite index per table) and update it on every alteration? (carefully updating the other related tables as well)
2. Should I keep one table but several composite indexes, with a NULL option in the tree to jump to the next tree? (this has my preference; see the sketch after this list)
3.?
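For option 2, this is roughly the lookup side I have in mind (a hedged Python sketch; the three index key orderings are placeholders, not a real proposal, and they obviously don't cover every subset of 12 columns):

```python
# Index key orderings are made up; a real set would be chosen per workload.
INDEX_KEYS = [("A", "B", "C"), ("B", "C", "A"), ("C", "A", "B")]

def covered_prefix(key, constrained):
    """Length of the leading run of key columns that the query constrains."""
    n = 0
    for col in key:
        if col not in constrained:
            break
        n += 1
    return n

def pick_index(constrained):
    """Pick the composite index whose key prefix is best covered by the
    constrained columns; None means a full scan is unavoidable."""
    best = max(INDEX_KEYS, key=lambda k: covered_prefix(k, constrained))
    return best if covered_prefix(best, constrained) > 0 else None

print(pick_index({"B", "C"}))  # ('B', 'C', 'A')
print(pick_index({"A"}))       # ('A', 'B', 'C')
print(pick_index({"D"}))       # None -> full scan
```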
Clarifications:
Insert/update is not time constrained. Search needs to be as fast as possible.
Again, I'm not interested in SQL queries, just your opinion/experience.
Thanks