Transitive Reduction - Computational Complexity


As Aho et al. show, when the time complexity of graph algorithms is measured only as a function of the number n of vertices in the graph, and not as a function of the number of edges, transitive closure and transitive reduction have the same complexity. It had already been shown that transitive closure and multiplication of Boolean matrices of size n × n had the same complexity as each other, so this result put transitive reduction into the same class. The fastest known algorithms for matrix multiplication, as of 2013, take time O(n^2.3727), and this same time bound applies to transitive reduction as well.

To prove that transitive reduction is as hard as transitive closure, Aho et al. construct from a given directed acyclic graph G another graph H, in which each vertex of G is replaced by a path of three vertices, and each edge of G corresponds to an edge in H connecting the corresponding middle vertices of these paths. In addition, in the graph H, Aho et al. add an edge from every path start to every path end. In the transitive reduction of H, there is an edge from the path start for u to the path end for v if and only if edge uv does not belong to the transitive closure of G. Therefore, if the transitive reduction of H can be computed efficiently, the transitive closure of G can be read off directly from it.
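The construction can be sketched as follows, assuming the DAG G is given as a Python dict mapping each vertex to a set of its direct successors; the labels ('start', v), ('mid', v) and ('end', v) for the three copies of each vertex are illustrative choices, not notation taken from Aho et al.:

```python
def build_h(g):
    """Build the graph H of the hardness reduction from a DAG g."""
    vertices = set(g) | {w for succs in g.values() for w in succs}
    h = {}

    # Replace each vertex v of G by the three-vertex path start -> mid -> end.
    for v in vertices:
        h[('start', v)] = {('mid', v)}
        h[('mid', v)] = {('end', v)}
        h[('end', v)] = set()

    # Each edge u -> v of G becomes an edge between the middle vertices.
    for u, succs in g.items():
        for v in succs:
            h[('mid', u)].add(('mid', v))

    # Add an edge from every path start to every path end.  Exactly those
    # start -> end edges that survive in the transitive reduction of H
    # mark pairs u, v that are NOT joined by a path in G.
    for u in vertices:
        for v in vertices:
            h[('start', u)].add(('end', v))

    return h
```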

To prove that transitive reduction is as easy as transitive closure, Aho et al. rely on the already-known equivalence with Boolean matrix multiplication. They let A be the adjacency matrix of the given graph, and B be the adjacency matrix of its transitive closure (computed using any standard transitive closure algorithm). Then an edge uv belongs to the transitive reduction if and only if there is a nonzero entry in row u and column v of matrix A, and there is not a nonzero entry in the same position of the matrix product AB. In this construction, the nonzero elements of the matrix AB represent pairs of vertices connected by paths of length two or more.
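The matrix argument can be written in a few lines of NumPy; this is only a sketch under the assumption that A and B are boolean adjacency matrices and that B, the closure, has already been produced by some standard closure algorithm:

```python
import numpy as np

def reduction_from_closure(a, b):
    """Transitive reduction of a DAG from its boolean adjacency matrix a
    and the boolean adjacency matrix b of its transitive closure."""
    # Nonzero entries of the product a @ b mark vertex pairs joined by a
    # path of length two or more.
    long_paths = (a.astype(int) @ b.astype(int)) > 0
    # Keep edge u -> v exactly when it is in a and not bypassed by a longer path.
    return a & ~long_paths

# Example: the chain 0 -> 1 -> 2 with the shortcut 0 -> 2; the shortcut is removed.
a = np.array([[0, 1, 1],
              [0, 0, 1],
              [0, 0, 0]], dtype=bool)
b = a.copy()  # this small graph happens to equal its own transitive closure
print(reduction_from_closure(a, b).astype(int))
```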

When measured both in terms of the number n of vertices and the number m of edges in a directed acyclic graph, transitive reductions can also be found in time O(nm), a bound that may be faster than the matrix multiplication methods for sparse graphs. To do so, collect the edges (u,v) for which the longest-path distance from u to v is exactly one, computing those distances by a linear-time search from each possible starting vertex u. This O(nm) time bound matches the complexity of constructing transitive closures by using depth-first search or breadth-first search to find the vertices reachable from every choice of starting vertex, so under this measure transitive closures and transitive reductions can again be found in the same amount of time.
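A sketch of this longest-path approach, again assuming the DAG is given as a dict of successor sets; topological_order is a small helper written here with Kahn's algorithm, not a library routine:

```python
from collections import deque

def topological_order(g, vertices):
    """Kahn's algorithm: one linear-time topological ordering of the DAG."""
    indeg = {v: 0 for v in vertices}
    for succs in g.values():
        for w in succs:
            indeg[w] += 1
    queue = deque(v for v in vertices if indeg[v] == 0)
    order = []
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in g.get(v, ()):
            indeg[w] -= 1
            if indeg[w] == 0:
                queue.append(w)
    return order

def transitive_reduction_sparse(g):
    """Keep an edge u -> v only if the longest path from u to v has length 1."""
    vertices = set(g) | {w for succs in g.values() for w in succs}
    order = topological_order(g, vertices)
    reduction = {v: set() for v in vertices}
    for u in vertices:
        # One linear-time pass: longest-path distance from u to every vertex.
        dist = {v: float('-inf') for v in vertices}
        dist[u] = 0
        for v in order:
            if dist[v] == float('-inf'):
                continue
            for w in g.get(v, ()):
                dist[w] = max(dist[w], dist[v] + 1)
        reduction[u] = {v for v in g.get(u, ()) if dist[v] == 1}
    return reduction
```

Each pass over the topological order costs O(n + m), and one pass is made per starting vertex, giving the O(nm) total mentioned above.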
