edge_betweenness_centrality
edge_betweenness_centrality(G, k=None, normalized=True, weight=None, seed=None)
Compute betweenness centrality for edges.
Betweenness centrality of an edge \(e\) is the sum of the fraction of all-pairs shortest paths that pass through \(e\).
\[c_B(e) = \sum_{s, t \in V} \frac{\sigma(s, t | e)}{\sigma(s, t)}\]where \(V\) is the set of nodes, \(\sigma(s, t)\) is the number of shortest \((s, t)\)-paths, and \(\sigma(s, t | e)\) is the number of those paths passing through edge \(e\) [1]. The denominator \(\sigma(s, t)\) is a normalization factor that can be turned off to get the raw path counts.
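The definition can be checked directly on a small graph by brute-force enumeration of shortest paths. This is only a didactic sketch of the formula above, not how the library computes it (NetworkX uses Brandes' algorithm, see Notes):

```python
from itertools import combinations
import networkx as nx

G = nx.path_graph(3)

# Accumulate sigma(s, t | e) / sigma(s, t) over unordered node pairs,
# which matches the unnormalized counts for an undirected graph.
counts = {tuple(sorted(e)): 0.0 for e in G.edges}
for s, t in combinations(G.nodes, 2):
    paths = list(nx.all_shortest_paths(G, s, t))
    for path in paths:
        for u, v in zip(path, path[1:]):
            counts[tuple(sorted((u, v)))] += 1 / len(paths)
```

The resulting counts agree with nx.edge_betweenness_centrality(G, normalized=False).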
- Parameters:
  - G : graph
    A NetworkX graph.
  - k : int, optional (default=None)
    If k is not None, use k sampled nodes as sources for the considered paths. The resulting sampled counts are then inflated to approximate betweenness. Higher values of k give a better approximation. Must have k <= len(G).
  - normalized : bool, optional (default=True)
    If True, the betweenness values are rescaled by dividing by the number of possible \((s, t)\)-pairs in the graph.
  - weight : None or string, optional (default=None)
    If None, all edge weights are 1. Otherwise holds the name of the edge attribute used as weight. Weights are used to calculate weighted shortest paths, so they are interpreted as distances.
  - seed : integer, random_state, or None (default=None)
    Indicator of random number generation state. See Randomness. Note that this is only used if k is not None.
- Returns:
  - edges : dict
    Dictionary of edges with betweenness centrality as the value.
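Because the return value is a plain dict keyed by edge tuples, post-processing is ordinary dict work. For example, ranking edges from most to least central (the karate club graph is just an illustration):

```python
import networkx as nx

G = nx.karate_club_graph()
ebc = nx.edge_betweenness_centrality(G)

# Edges sorted by descending centrality; the top edge is a likely "bridge".
ranked = sorted(ebc, key=ebc.get, reverse=True)
top_edge = ranked[0]
```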
Notes
The algorithm is from Ulrik Brandes [1].
For weighted graphs the edge weights must be greater than zero. Zero edge weights can produce an infinite number of equal length paths between pairs of nodes.
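To illustrate the weight-as-distance interpretation, here is a sketch on a 4-cycle where one edge is made expensive, so all shortest paths route around it (the graph and weights are illustrative assumptions, not from the original text):

```python
import networkx as nx

# 4-cycle with unit weights, except one expensive edge.
G = nx.cycle_graph(4)
nx.set_edge_attributes(G, 1, "weight")
G.edges[0, 3]["weight"] = 10

ebc = nx.edge_betweenness_centrality(G, weight="weight", normalized=False)
# No weighted shortest path uses the heavy edge (direct cost 10 vs. detour
# cost 3), so it carries zero betweenness; the middle edge (1, 2) carries
# the most, since it lies on the detours.
```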
References
[1] Ulrik Brandes: On Variants of Shortest-Path Betweenness Centrality and their Generic Computation. Social Networks 30(2):136–145, 2008. https://doi.org/10.1016/j.socnet.2007.11.001
Examples
Consider an undirected 3-path. Each pair of nodes has exactly one shortest path between them. Since the graph is undirected, only unordered pairs of nodes are counted. Each edge has two shortest paths passing through it, so the raw counts should be {(0, 1): 2, (1, 2): 2}.

>>> G = nx.path_graph(3)
>>> nx.edge_betweenness_centrality(G, normalized=False)
{(0, 1): 2.0, (1, 2): 2.0}
With normalization, the values are divided by the number of unordered \((s, t)\)-pairs, which is \(n(n-1)/2\). For the 3-path, this is \(3(3-1)/2 = 3\).
>>> nx.edge_betweenness_centrality(G, normalized=True) {(0, 1): 0.6666666666666666, (1, 2): 0.6666666666666666}
For a directed graph, all ordered \((s, t)\)-pairs are considered. The normalization factor is \(n(n-1)\) to reflect this.

>>> DG = nx.path_graph(3, create_using=nx.DiGraph)
>>> nx.edge_betweenness_centrality(DG, normalized=False)
{(0, 1): 2.0, (1, 2): 2.0}
>>> nx.edge_betweenness_centrality(DG, normalized=True)
{(0, 1): 0.3333333333333333, (1, 2): 0.3333333333333333}
Computing the full edge betweenness centrality can be costly. This function can also be used to compute approximate edge betweenness centrality by setting k, which determines the number of source nodes to sample. Since the partial sums only include k terms instead of n, we multiply them by n / k to approximate the full sum. As the sets of sources and targets are no longer the same, paths have to be counted in a directed way; we thus count each as half a path. This ensures that the results approximate the standard betweenness for k == n.

For instance, in the undirected 3-path graph, setting k = 2 (with seed=42) selects nodes 0 and 2 as sources. This means only shortest paths starting at these nodes are considered. The raw counts are {(0, 1): 3, (1, 2): 3}. Accounting for the partial sum and applying the undirectedness half-path correction, we get

>>> nx.edge_betweenness_centrality(G, k=2, normalized=False, seed=42)
{(0, 1): 2.25, (1, 2): 2.25}
When normalizing, we instead want to divide by the total number of \((s, t)\)-pairs. This is \(k(n-1)\), which is \(4\) in our case.
>>> nx.edge_betweenness_centrality(G, k=2, normalized=True, seed=42)
{(0, 1): 0.75, (1, 2): 0.75}
Additional backends implement this function
- cugraph : GPU-accelerated backend.
  The weight parameter is not yet supported, and RNG with seed may be different.
- parallel : A networkx backend that uses joblib to run graph algorithms in parallel. See nx-parallel's configuration guide.
  The parallel computation is implemented by dividing the nodes into chunks and computing edge betweenness centrality for each chunk concurrently.
  - Additional parameters:
    - get_chunks : str, function (default = "chunks")
      A function that takes in a list of all the nodes as input and returns an iterable node_chunks. The default chunking is done by slicing the nodes into n_jobs number of chunks.
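As a sketch of the get_chunks contract (the chunk size of 2 is an arbitrary choice for this demo; invoking the parallel backend requires nx-parallel to be installed, so that call is only shown in a comment):

```python
import itertools
import networkx as nx

def get_chunks(nodes):
    """Split the node iterable into chunks of at most 2 nodes each."""
    it = iter(list(nodes))
    while chunk := list(itertools.islice(it, 2)):
        yield chunk

G = nx.path_graph(5)
chunks = list(get_chunks(G.nodes))

# With nx-parallel installed, this could be passed as:
# nx.edge_betweenness_centrality(G, backend="parallel", get_chunks=get_chunks)
```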