Le_Forgeron wrote:
> It might be building a cross reference/multi dimensional access: great
> for quick read access later... terrible in write mode, especially if the
> algorithm used is silly (like O(x^N) or O(exp(N))...)
The only way an index algorithm could be exponential-time is if it's
trying to "sort" the list of entries by exhaustively trying every
possible ordering until it hits the sorted one (which is O(N!), even
worse than exponential). Even the very worst sorting algorithms anyone
actually uses are only O(N^2).
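Just to put rough numbers on that (my own toy sketch in Python, nothing
to do with whatever the product actually does): a "sort" that tries
every possible ordering, next to a plain bubble sort. With only 8
entries the first may grind through up to 40,320 orderings, while the
second needs at most 28 comparisons.

from itertools import permutations

def permutation_sort(items):
    """'Sort' by trying every possible ordering until one is in order: O(N!)."""
    for candidate in permutations(items):
        if all(candidate[i] <= candidate[i + 1]
               for i in range(len(candidate) - 1)):
            return list(candidate)
    return list(items)  # unreachable: a sorted ordering always exists

def bubble_sort(items):
    """About as bad as a real-world sort gets, yet still only O(N^2)."""
    data = list(items)
    n = len(data)
    for i in range(n):
        for j in range(n - 1 - i):
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
    return data

print(permutation_sort([5, 3, 8, 1, 2, 7, 4, 6]))
print(bubble_sort([5, 3, 8, 1, 2, 7, 4, 6]))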
> Basic tests with a small number of entries displayed no issue.... it
> just does not scale to production the size of your company!
6,000 entries is hardly "large". Indeed, with far fewer than 6,000
entries to manage, you barely need special-purpose software at all.
Then again, given that this software appears to just store names and
addresses, you'd think somebody could knock up something in MS Access
in about 20 seconds flat that would do the same job. (Which raises the
question... WHY HAVEN'T THEY?!?!)
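Purely for the sake of argument, here's roughly what that knocked-up
version looks like in SQLite (standing in for Access so it runs
anywhere); the table and column names are just my guesses at what such
a thing would need:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, firstname TEXT, address TEXT)")
conn.execute("CREATE INDEX idx_people_name ON people (name, firstname)")

# 6,000 rows of dummy data -- inserted and indexed in a fraction of a second.
rows = [("Surname%04d" % i, "First%04d" % i, "%d Some Street" % i)
        for i in range(6000)]
conn.executemany("INSERT INTO people VALUES (?, ?, ?)", rows)
conn.commit()

# Indexed lookup by name.
print(conn.execute("SELECT * FROM people WHERE name = ?",
                   ("Surname1234",)).fetchall())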
> (indexing by name, firstname, addresses, any silly idea... using a
> bubble sort on files)
>
> It might also be a very sophisticated paging system, with some entries
> per page: when a page get filled, you move all the other pages and
> redistribute the entries (using some patricia tree with extended key
> length..) of the current page between the old and a new one.... one
> entry at a time, with initial packet unsorted... or worse: in
> pathological order.
It's entirely possible that sorted order might be the worst possible
case. ;-)
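To illustrate (a toy example of my own, with no claim about what the
real software does internally): feed already-sorted keys into a naive,
unbalanced binary search tree and it degenerates into a linked list, so
N inserts cost roughly N^2/2 comparisons instead of roughly N log N.

import random

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Plain BST insert, no rebalancing; returns (root, comparisons made)."""
    if root is None:
        return Node(key), 0
    node, steps = root, 0
    while True:
        steps += 1
        if key < node.key:
            if node.left is None:
                node.left = Node(key)
                return root, steps
            node = node.left
        else:
            if node.right is None:
                node.right = Node(key)
                return root, steps
            node = node.right

def total_steps(keys):
    root, total = None, 0
    for k in keys:
        root, steps = insert(root, k)
        total += steps
    return total

keys = list(range(6000))
shuffled = keys[:]
random.shuffle(shuffled)

print("random order:", total_steps(shuffled), "comparisons")  # ~100,000
print("sorted order:", total_steps(keys), "comparisons")      # ~18,000,000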