Hi everybody. (This is my first post on the forum btw, just registered)
Anyway, I've been developing a very fast sorting algorithm. Due to the nature of the algorithm, the array that the newly sorted elements are assigned to ends up with some extra empty elements. To optimize the amount of memory allocated for this sort array, I've derived a formula that guarantees an array size minimizing the number of extra empty elements, but the problem is that the formula assumes I already know the least difference among all elements of the array being sorted.
For example: if the set is 68, 7, 1, 4, 100, then the least difference is 3, because 4 - 1 = 3 and no other pair is closer.
In its most obvious implementation, finding the least difference among a set of numbers seems to be an O(n^2) operation, since every element in the array would be subtracted from every other element to find the distance, keeping track of the current smallest as you go (kind of like finding a min).
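Just so we're all on the same page, here's a minimal sketch of that obvious O(n^2) approach (Python purely for illustration; the function name is mine):

```python
def least_difference_naive(values):
    """Smallest absolute difference between any two elements.

    O(n^2): compares every pair and keeps the minimum distance seen.
    """
    best = None
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            d = abs(values[i] - values[j])
            if best is None or d < best:
                best = d
    return best

print(least_difference_naive([68, 7, 1, 4, 100]))  # -> 3
```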
However, for my sort algorithm to be optimally effective, I need this calculation to happen in O(n) time or better, rather than O(n^2).
Does anyone know a clever way to calculate this value in O(n) or better time?
Alternatively, it would suffice almost as well if the value returned were only approximately equal to the least difference, provided the approximation is less than or equal to the actual least difference (i.e., a lower bound on it).
As another alternative, perhaps being able to make a very good guess at the value would suffice. Does anyone have ideas for a good way to estimate it?
If this task proves impossible, then it seems I will simply have to make this value a parameter the user sets in advance, kind of like a granularity specification. That would be okay, but it would be better if I could determine the optimal granularity automatically for the user.
Any thoughts?
Thanks for your time and for reading.