@tinco: The trick with strongly connected components is not a heuristic (or perhaps we are using different definitions of what a heuristic is), but a heuristic is hidden a little bit deeper. Normally, you consider an algorithm's complexity parameterized just by the size of the input. You can look at my algorithm as additionally parameterized by the number of different quorums in the smallest strongly connected component. It can be easily proved that all minimal quorums are contained within some strongly connected component, and if all quorums intersect, then all of the minimal quorums are contained in the same strongly connected component. Thanks to this property, you can filter out all other strongly connected components.

Based on this experimental code, it appears that all minimal quorums are contained within a very small strongly connected component of size 4 (you can check it yourself using the -v switch). This is currently the main reason why it is so fast; in general, the problem of finding disjoint quorums is NP-hard. I also tested it on network configurations with a bigger main strongly connected component (but with a small number of minimal quorums) and its performance was acceptable. This small size also means that the current network is very "centralized" and has very low resilience: those 4 nodes have full control of the network, and if just 2 of them are down, the whole network can be blocked.
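To make that filtering step concrete, here is a minimal Python sketch (illustrative only, not the actual code of my tool). It assumes flat quorum sets represented as qset[v] = (threshold, validators); real Stellar quorum sets can be nested, which this ignores. The trust graph has an edge v → u whenever u appears among v's validators, and only SCCs that themselves contain a quorum can contain minimal quorums:

```python
def greatest_quorum(nodes, qset):
    """Greatest quorum contained in `nodes` (possibly empty): repeatedly
    drop nodes whose threshold cannot be met inside the set; the fixed
    point is the largest quorum within the set, or the empty set."""
    q = set(nodes)
    changed = True
    while changed:
        changed = False
        for v in list(q):
            threshold, validators = qset[v]
            if sum(1 for u in validators if u in q) < threshold:
                q.discard(v)
                changed = True
    return q

def tarjan_sccs(graph):
    """Tarjan's SCC algorithm over graph = {node: set of successors}.
    Recursive for brevity; fine for a sketch."""
    index, low, on_stack = {}, {}, set()
    stack, sccs, counter = [], [], [0]

    def visit(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:
                visit(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:
            scc = set()
            while True:
                w = stack.pop()
                on_stack.discard(w)
                scc.add(w)
                if w == v:
                    break
            sccs.append(scc)

    for v in graph:
        if v not in index:
            visit(v)
    return sccs

def candidate_sccs(qset):
    """SCCs of the trust graph that can contain minimal quorums,
    i.e. those whose greatest contained quorum is non-empty."""
    graph = {v: {u for u in validators if u in qset}  # skip unknown nodes
             for v, (_, validators) in qset.items()}
    return [scc for scc in tarjan_sccs(graph)
            if greatest_quorum(scc, qset)]
```

greatest_quorum is the standard fixed-point computation: repeatedly dropping nodes whose slices cannot be satisfied leaves exactly the largest quorum inside the given set; the enumeration sketch below relies on the same helper.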
Regarding your second question, this is the place where I used a heuristic. There are several ways of finding all quorums / all minimal quorums, and each can depend on a different parameter of the structure of the network. You can follow quorum slices using a strategy similar to DFS, reporting a result every time you find that the current set is a quorum / minimal quorum. If each node has a small number of quorum slices (small quorum sets, big/small thresholds), this method can be very efficient. Unfortunately, it can also visit the same quorum several times.

Another way, which I'm using, is a standard enumeration method: you consider all nodes in some order and recursively solve two subproblems, one where you decide to remove the node and another where you decide to leave it in your set. You maintain two sets, one containing the nodes you are going to consider next ("available") and the other containing all nodes selected so far ("selected"). Then you keep track of whether all selected nodes are part of the biggest quorum included in "selected" plus "available"; if some selected node is not, the whole branch can be pruned. My heuristic selects which node should be considered next based on how many other nodes are pointing at it (plus some randomization); the intuition is that a "popular" node is likely (though not guaranteed) to be in some minimal quorum. This way I'm trying to process only minimal quorums. The time complexity of this method is something like (number of quorums in the main strongly connected component) * n^O(1), which in the worst case is still exponential. The space complexity is linear.
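The following Python sketch mirrors that enumeration (again illustrative: find_minimal_quorums, indegree, and the random tie-breaking are my naming and simplifications, and it reuses greatest_quorum from the previous sketch). The pruning test is exactly the condition described above: every selected node must still be part of the biggest quorum within "selected" plus "available".

```python
import random

def find_minimal_quorums(qset):
    """Enumerate minimal quorums by branching on include/exclude decisions.
    Each subset of nodes corresponds to at most one path in the search
    tree, so no quorum is visited twice; space is linear in the depth."""
    nodes = set(qset)
    # "popularity": how many other nodes list u among their validators
    indegree = {u: 0 for u in nodes}
    for v, (_, validators) in qset.items():
        for u in validators:
            if u in indegree and u != v:
                indegree[u] += 1
    found = []

    def is_minimal(q):
        # q is minimal iff removing any member leaves no quorum inside
        return all(not greatest_quorum(q - {v}, qset) for v in q)

    def step(selected, available):
        # prune: some selected node cannot belong to any quorum that
        # fits inside selected | available
        if not selected <= greatest_quorum(selected | available, qset):
            return
        if selected and greatest_quorum(selected, qset) == selected:
            # selected is a quorum; its supersets cannot be minimal
            if is_minimal(selected):
                found.append(frozenset(selected))
            return
        if not available:
            return
        # heuristic: branch on the most "popular" node, random tie-break
        v = max(available, key=lambda u: (indegree[u], random.random()))
        step(selected | {v}, available - {v})
        step(selected, available - {v})

    step(set(), nodes)
    return found
```

Note how the pruning is what ties the running time to the number of quorums in the main strongly connected component: only branches that can still be completed to a quorum are ever explored.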
In my previous post I already pointed to https://github.com/stellar/stellar-core/blob/master/src/history/InferredQuorum.cpp. This probably has exponential time complexity regardless of the actual number of quorums. It also looks like it has exponential worst-case space complexity, since it keeps a list of all discovered quorums. Additionally, it processes that list twice, instead of processing it once and checking the complement of each set. Its complexity looks to me like: time = 2^n * n^O(1) (always), worst-case space = O(2^n).
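To make the complement remark concrete (a sketch reusing the helpers above, not the actual InferredQuorum.cpp logic): two disjoint quorums exist iff the complement of some minimal quorum still contains a quorum, so a single pass with one complement check per quorum suffices, and no pairwise comparison of the discovered list is needed.

```python
def has_disjoint_quorums(qset):
    """Quorum intersection is violated iff the complement of some
    minimal quorum still contains a quorum: if Q1 and Q2 are disjoint
    quorums, Q2 lies in the complement of Q1, and vice versa."""
    nodes = set(qset)
    return any(greatest_quorum(nodes - q, qset)
               for q in find_minimal_quorums(qset))

# Example: two independent 2-node cliques -> intersection fails
qset = {
    'A': (2, ['A', 'B']), 'B': (2, ['A', 'B']),
    'C': (2, ['C', 'D']), 'D': (2, ['C', 'D']),
}
print(has_disjoint_quorums(qset))  # True
```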