Backstory:
I have been searching for a highly performant way to find all cliques in a network up to a given size (e.g., all k-cliques with k <= 3, i.e., all nodes, edges, and triangles). Since this low-dimensional case (k <= 3 or k <= 4) is what I usually need, I have resorted to simply looking for highly performant triangle-finding methods.
NetworkX is incredibly slow; networkit, however, has a much more performant solution with a Cython backend.
Unfortunately, networkit does not have an algorithm for listing all cliques up to a given size. It has a MaximalCliques algorithm, which is a different problem, and which (from what I can tell) simply returns maximal cliques of all possible sizes in no particular order. Its triangle routine also only counts triangles; it does not list the nodes that make up each triangle. So I am writing my own function implementing a reasonably efficient method, shown below.
Problem:
I have the function nk_triangles below; however, it is resisting an easy port into numba or Cython. So I wanted to see if anyone with more expertise in these areas might be able to push this toward faster speeds.
I have made a simple, yet fully workable snippet of code with the function of interest here:
import networkit as nk
import numba
from itertools import combinations
from urllib.request import urlopen
import tempfile
graph_url="https://raw.githubusercontent.com/networkit/networkit/master/input/tiny_02.graph"
big_graph_url="https://raw.githubusercontent.com/networkit/networkit/master/input/caidaRouterLevel.graph"
with tempfile.NamedTemporaryFile() as f:
    with urlopen(graph_url) as r:
        f.write(r.read())
    f.flush()  # ensure the downloaded bytes are on disk before networkit reads the file
    G = nk.readGraph(f.name, nk.Format.METIS)
#@numba.jit
def nk_triangles(g):
    # Algorithm source:
    # https://cs.stanford.edu/~rishig/courses/ref/l1.pdf
    triangles = set()
    for node in g.iterNodes():
        ndeg = g.degree(node)
        # Keep only neighbors with strictly higher degree (ties broken by
        # node id), so each triangle is enumerated exactly once, from its
        # lowest-(degree, id) vertex.
        neighbors = [neigh for neigh in g.iterNeighbors(node)
                     if (ndeg < g.degree(neigh))
                     or ((ndeg == g.degree(neigh)) and node < neigh)]
        # Note: set() over a dict comprehension keeps only the keys, i.e. the
        # (node, u, v) triples; the max edge weights computed as the dict's
        # values are discarded here.
        node_triangles = set({(node, *c): max(g.weight(u, v)
                                              for u, v in combinations([node, *c], 2))
                              for c in combinations(neighbors, 2)
                              if g.hasEdge(*c)})
        triangles = triangles.union(node_triangles)
    return triangles
tris = nk_triangles(G)
tris
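If numba is the goal, one approach that often works is to first flatten the graph into CSR arrays (indptr/indices), since numba handles plain integer arrays far better than networkit's graph objects. Below is a minimal sketch of that idea using the same degree-ordering trick as nk_triangles. Note the assumptions: to_csr is a stand-in helper that builds the arrays from an edge list (in practice you would fill them from g.iterNodes()/g.iterNeighbors()), and the njit decoration is optional, with a no-op fallback if numba is absent.

```python
import numpy as np

try:  # numba is optional; the kernel is deliberately plain loops over arrays
    from numba import njit
except ImportError:  # fall back to pure Python if numba is not installed
    def njit(f):
        return f

def to_csr(n, edges):
    """Build a CSR adjacency (indptr, indices) from an undirected edge list.

    Stand-in for exporting the networkit graph; neighbor lists are sorted so
    the kernel can binary-search them.
    """
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    indptr = np.zeros(n + 1, dtype=np.int64)
    for i, nbrs in enumerate(adj):
        indptr[i + 1] = indptr[i] + len(nbrs)
    indices = np.empty(indptr[-1], dtype=np.int64)
    for i, nbrs in enumerate(adj):
        indices[indptr[i]:indptr[i + 1]] = sorted(nbrs)
    return indptr, indices

@njit
def csr_triangles(indptr, indices):
    # Same degree-ordering trick as nk_triangles: from each node u, only pair
    # up neighbors of higher (degree, id), so each triangle is emitted once.
    out = []
    n = indptr.shape[0] - 1
    for u in range(n):
        du = indptr[u + 1] - indptr[u]
        for i in range(indptr[u], indptr[u + 1]):
            v = indices[i]
            dv = indptr[v + 1] - indptr[v]
            if dv < du or (dv == du and v <= u):
                continue
            for j in range(i + 1, indptr[u + 1]):
                w = indices[j]
                dw = indptr[w + 1] - indptr[w]
                if dw < du or (dw == du and w <= u):
                    continue
                # Is (v, w) an edge? Binary search v's sorted neighbor list.
                lo, hi = indptr[v], indptr[v + 1]
                while lo < hi:
                    mid = (lo + hi) // 2
                    if indices[mid] < w:
                        lo = mid + 1
                    else:
                        hi = mid
                if lo < indptr[v + 1] and indices[lo] == w:
                    out.append((u, v, w))
    return out
```

This drops the per-triangle max-weight computation (which nk_triangles discards anyway) and avoids Python objects inside the hot loop, which is the part numba cares about.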
The big_graph_url can be switched in to see whether the algorithm actually performs reasonably well. (My graphs are still orders of magnitude larger than this.)
As it stands, this takes ~40 minutes to compute on my machine (single-threaded Python loops calling C backend code in networkit and itertools). The number of triangles in the big network is 455,062.
Here is a NumPy version of your code, taking ~1 min for your big graph.
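(The NumPy code referred to here is not reproduced above. As a rough illustration of what a vectorized version can look like — not the answer's actual code — the following dense-matrix sketch lists each triangle once as an ordered triple; for a graph the size of caidaRouterLevel you would want a sparse adjacency structure instead.)

```python
import numpy as np

def np_triangles(adj):
    """List triangles of an undirected graph given a boolean adjacency matrix.

    Each triangle appears exactly once, as a triple (i, j, k) with i < j < k.
    """
    tris = []
    # Each undirected edge once: strict upper triangle, so i < j.
    iu, ju = np.nonzero(np.triu(adj, k=1))
    for i, j in zip(iu, ju):
        # Common neighbors of i and j with index > j close a triangle i<j<k.
        common = np.nonzero(adj[i] & adj[j])[0]
        for k in common[common > j]:
            tris.append((i, j, k))
    return tris
```

The per-edge set intersection here is the same idea as checking g.hasEdge over neighbor pairs, but done as a vectorized boolean AND per edge rather than per candidate pair.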