Question

Situation

Suppose we are given n points on the unit square [0, 1]x[0, 1] and a positive real number r. We define the graph G(point 1, point 2, ..., point n, r) as the graph on vertices {1, 2, ..., n} such that there is an edge connecting two given vertices if and only if the distance between the corresponding points is less than or equal to r. (You can think of the points as transmitters, which can communicate with each other as long as they are within range r.)

Given n points on the unit square [0, 1]x[0, 1], we define the connectivity distance as the smallest possible r for which G(point 1, point 2, ..., point n, r) is connected.

Problem 1) Find an algorithm that determines whether G(point 1, point 2, ..., point n, r) is connected.

Problem 2) Find an algorithm that computes the connectivity distance for any n given points.

My partial solution

I have an algorithm (Algorithm 1) in mind for problem 1. I haven't implemented it yet, but I'm convinced it works. (Roughly, the idea is to start from vertex 1 and try to reach all other vertices through the edges, along the lines of a standard graph traversal such as breadth-first search.)
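For what it's worth, a connectivity test of this kind is usually done with a breadth-first (or depth-first) search. Below is a minimal C sketch under some assumed conventions: the points are passed as parallel x/y coordinate arrays, and the function name is_connected is my own choice, not anything fixed by the question. It compares squared distances so no square root is needed:

```c
#include <stdbool.h>
#include <stdlib.h>

/* Breadth-first search from point 0: two points are adjacent when their
 * distance is at most r.  Returns true when every point is reachable.
 * O(n^2) time, O(n) extra space.  (Allocation error checks omitted.) */
bool is_connected(const double x[], const double y[], size_t n, double r)
{
    if (n == 0)
        return true;
    bool *visited = calloc(n, sizeof *visited);
    size_t *queue = malloc(n * sizeof *queue);
    size_t head = 0, tail = 0, reached = 1;

    visited[0] = true;
    queue[tail++] = 0;
    while (head < tail) {
        size_t u = queue[head++];
        for (size_t v = 0; v < n; v++) {
            double dx = x[u] - x[v], dy = y[u] - y[v];
            /* compare squared distances to avoid sqrt in the inner loop */
            if (!visited[v] && dx * dx + dy * dy <= r * r) {
                visited[v] = true;
                queue[tail++] = v;
                reached++;
            }
        }
    }
    free(visited);
    free(queue);
    return reached == n;
}
```

This runs in O(n^2) per call, since each dequeued point is compared against all n points.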

All that remains is problem 2. I also have an algorithm in mind for this one. However, I think it is not efficient time-wise. I'll try to explain how it works:

You must first convince yourself that the connectivity distance rmin is necessarily the distance between two of the given points, say p and q. Hence, there are at most n(n-1)/2 possible values for rmin.

So, first, my algorithm would measure all n(n-1)/2 distances and store them (in an array in C, for instance) in increasing order. Then it would use Algorithm 1 to test each stored value (in increasing order) to see if the graph is connected with that range. The first value that does the job is the answer, rmin.

My question is: is there a better (time wise) algorithm for problem 2?

Remarks: the points will be randomly generated (something like 10000 of them), so that's the type of thing the algorithm is supposed to solve "quickly". Furthermore, I'll implement this in C. (If that makes any difference.)


Solution

Here is an algorithm which requires O(n²) time and O(n) space.

It's based on the observation that if you partition the points into two sets, then the connectivity distance cannot be less than the distance between the closest pair of points with one point taken from each set of the partition. In other words, if we build up the connected graph by always adding the closest remaining point, then the largest distance we add will be the connectivity distance.

  • Create two sets, A and B. Put a random point into A and all the remaining points into B.

  • Initialize r (the connectivity distance) to 0.

  • Initialize a map M with the distance from the point in A to every point in B.

  • While there are still points in B:

    • Select the point b in B whose distance M[b] is the smallest.

    • If M[b] is greater than r, set r to M[b].

    • Remove b from B and add it to A.

    • For each point p in M:

      • If p is b, remove it from M.

      • Otherwise, if the distance from b to p is less than M[p], set M[p] to that distance.

When all the points are in A, r will be the connectivity distance.

Each iteration of the while loop takes O(|B|) time: first to find the minimum value in M (whose size equals the size of B), and second to update the values in M. Since a point is moved from B to A in each iteration, there are exactly n − 1 iterations, and thus the total execution time is O(n²).


The algorithm presented above is an improvement on a previous version, which used an (unspecified) solution to the bichromatic closest pair (BCP) problem to recompute the closest neighbour of A in every cycle. Since there is an O(n log n) solution to BCP, that implied a solution to the original problem in O(n² log n). However, maintaining and updating the list of closest points is actually much simpler, and only requires O(n) time per iteration. Thanks to @LajosArpad for a question which triggered this line of thought.

OTHER TIPS

I think your ideas are reasonably good; however, I have an improvement for you.

In effect, you build up an array of all pairwise distances and sort it. Very nice, at least when there are not too many points.

The count of n(n-1)/2 distances is a logical consequence of your pairing requirement, so for 10000 points you would have 49,995,000 of them. You will need to increase the speed significantly, and storing that many elements would also eat a lot of memory.

How can you achieve greater speed?

First of all, don't build a new array; you already have an array of points. Secondly, you can solve your problem in a single traversal. Suppose you have a function which determines whether a given distance is enough to connect all the nodes; let's call this function "valid". That alone is not enough, because you need the minimal such value. So, if you have no further information about the nodes prior to running the algorithm, my suggestion is this:

lowerBound <- 0
upperBound <- infinite
i <- 0
while i < numberOfElements do
    j <- i + 1
    while j < numberOfElements do
        distance <- d(elements[i], elements[j])
        if distance < upperBound and distance > lowerBound then
            if valid(distance) then
                upperBound <- distance
            else
                lowerBound <- distance
            end if
        end if
        j <- j + 1
    end while
    i <- i + 1
end while

After traversing all the pairs, the value of upperBound holds the smallest distance which still connects the network. You did not have to store all the distances (there were far too many), and the problem is solved in a single pass. I hope you find my answer helpful.

If some distance makes the graph connected, any larger distance makes it connected too. To find the minimal connecting distance, just sort all the pairwise distances and use binary search.

Time complexity is O(n² log n), space complexity is O(n²).

You can start with some small distance d and then check for connectivity. If the graph is connected, you're done; if not, increase d by a small step and check again.

You will also need a clever algorithm to avoid O(n²) work in case n is big.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow