Calculating closeness centrality using Jung


I am developing a semantic web application and using the Jung library to apply some graph calculations, such as closeness, betweenness, etc. I was able to find the betweenness value for each node in my RDF graph and normalize it too. However, this is not the case with ClosenessCentrality, as I get a NaN (not a number) score for some nodes. Below is my code:

int n = graph.getVertexCount(); // number of vertices
double d = (double) (n - 1) * (n - 2) / 2.0d; // used to normalize each node's value

System.out.println("Applying ClosenessCentrality");
ClosenessCentrality<RDFNode, Statement> closeness = new ClosenessCentrality<RDFNode, Statement>(graph);

double[] closenessValues = new double[n];
Collection<RDFNode> closenessVertices = graph.getVertices();
int i = 0;
for (RDFNode vertex : closenessVertices) {
    closenessValues[i++] = closeness.getVertexScore(vertex) / d; // get the normalized score for each node
}

for (double score : closenessValues) {
    System.out.println(score); // print all values
}

So, as I mentioned before, for some reason I get a NaN score for some nodes. I feel there is a bug in the ClosenessCentrality implementation, since I get NaN. Any explanation, guys? Am I doing something wrong?

Thanks for the help


2 Answers


I'd have to recheck the code, but I'll bet that the closeness centrality calculation does something weird if the vertex in question appears on no shortest paths (because it's a disconnected vertex, or has no incoming edges). I'd check that first.
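For illustration, here's a minimal sketch (not from the original question; the graph and vertex names are made up) that shows the same effect on a tiny DirectedSparseGraph with an isolated vertex. On the JUNG 2.x versions I've used, vertices that can't reach any other vertex come back as NaN:

import edu.uci.ics.jung.algorithms.scoring.ClosenessCentrality;
import edu.uci.ics.jung.graph.DirectedSparseGraph;

public class ClosenessNaNDemo {
    public static void main(String[] args) {
        DirectedSparseGraph<String, String> graph = new DirectedSparseGraph<String, String>();
        graph.addVertex("A");
        graph.addVertex("B");
        graph.addVertex("C"); // isolated vertex: no incoming or outgoing edges
        graph.addEdge("A->B", "A", "B");

        ClosenessCentrality<String, String> closeness =
                new ClosenessCentrality<String, String>(graph);

        for (String v : graph.getVertices()) {
            // "C" (and "B", which has no outgoing edges) reaches no other vertex,
            // so its distance sum is empty and the score typically comes out as NaN
            System.out.println(v + " -> " + closeness.getVertexScore(v));
        }
    }
}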


If a vertex has no edge to any other node, the closeness centrality computation for that vertex ends up dividing by 0, and NaN is the result. That's why you get NaN for some vertices.
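As a practical workaround (a sketch, not part of the original answer; the class and method names are hypothetical), you can check the returned score and map NaN (or null) to 0.0 before normalizing, or simply skip such vertices:

import java.util.Collection;
import edu.uci.ics.jung.algorithms.scoring.ClosenessCentrality;
import edu.uci.ics.jung.graph.Graph;

public final class SafeCloseness {
    // Hypothetical helper: computes the closeness score of every vertex,
    // normalizes it by d, and maps NaN/null scores (isolated or dead-end
    // vertices) to 0.0 instead of letting NaN propagate into the results.
    public static <V, E> double[] normalizedCloseness(Graph<V, E> graph, double d) {
        ClosenessCentrality<V, E> closeness = new ClosenessCentrality<V, E>(graph);
        Collection<V> vertices = graph.getVertices();
        double[] values = new double[vertices.size()];
        int i = 0;
        for (V vertex : vertices) {
            Double raw = closeness.getVertexScore(vertex);
            values[i++] = (raw == null || raw.isNaN()) ? 0.0 : raw.doubleValue() / d;
        }
        return values;
    }
}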