square_clustering
- square_clustering(G, nodes=None)
Compute the squares clustering coefficient for nodes.
For each node return the fraction of possible squares that exist at the node [1],

    C_4(v) = \frac{\sum_{u=1}^{k_v} \sum_{w=u+1}^{k_v} q_v(u,w)}{\sum_{u=1}^{k_v} \sum_{w=u+1}^{k_v} [a_v(u,w) + q_v(u,w)]},

where q_v(u,w) are the number of common neighbors of u and w other than v (i.e. squares), and a_v(u,w) = (k_u - (1 + q_v(u,w) + \theta_{uw})) + (k_w - (1 + q_v(u,w) + \theta_{uw})), where \theta_{uw} = 1 if u and w are connected and 0 otherwise. [2]
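A minimal brute-force sketch of this definition, for illustration only (it is not the library's implementation, and the helper name square_clustering_bruteforce is made up here):

    import itertools

    import networkx as nx


    def square_clustering_bruteforce(G):
        # For each node v, enumerate pairs (u, w) of its neighbors, count
        # their common neighbors other than v (q_v) and the remaining
        # neighbors that could have closed more squares but did not (a_v).
        result = {}
        for v in G:
            numerator = 0
            denominator = 0
            for u, w in itertools.combinations(G[v], 2):
                q = len((set(G[u]) & set(G[w])) - {v})  # q_v(u, w)
                theta = 1 if G.has_edge(u, w) else 0    # theta_{uw}
                a = (G.degree(u) - (1 + q + theta)) + (G.degree(w) - (1 + q + theta))
                numerator += q
                denominator += a + q
            result[v] = numerator / denominator if denominator else 0
        return result


    G = nx.complete_graph(5)
    assert square_clustering_bruteforce(G) == nx.square_clustering(G)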
- Parameters:
- G : graph
- nodes : container of nodes, optional (default=all nodes in G)
Compute clustering for nodes in this container.
- Returns:
- c4 : dictionary
A dictionary keyed by node with the square clustering coefficient value.
Notes
While C_3(v) (triangle clustering) gives the probability that two neighbors of node v are connected with each other, C_4(v) is the probability that two neighbors of node v share a common neighbor different from v. This algorithm can be applied to both bipartite and unipartite networks.
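As a hedged illustration (not part of the original text): in a 4-cycle there are no triangles, yet every pair of neighbors of a node shares a second common neighbor, so the two coefficients differ sharply.

    >>> C4 = nx.cycle_graph(4)
    >>> nx.clustering(C4)  # triangle clustering: no triangles exist
    {0: 0, 1: 0, 2: 0, 3: 0}
    >>> nx.square_clustering(C4)  # every possible square is closed
    {0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0}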
References
[1] Pedro G. Lind, Marta C. González, and Hans J. Herrmann. Cycles and clustering in bipartite networks. Physical Review E 72, 056127 (2005).
[2] Zhang, Peng et al. Clustering Coefficient and Community Structure of Bipartite Networks. Physica A: Statistical Mechanics and its Applications 387.27 (2008): 6869–6875. https://arxiv.org/abs/0710.0117v1
Examples
>>> G = nx.complete_graph(5)
>>> print(nx.square_clustering(G, 0))
1.0
>>> print(nx.square_clustering(G))
{0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0, 4: 1.0}
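The example above uses a unipartite graph; as a hedged additional illustration of the bipartite case (not part of the original docstring), every possible square exists in a complete bipartite graph:

>>> B = nx.complete_bipartite_graph(2, 3)
>>> nx.square_clustering(B)
{0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0, 4: 1.0}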
Additional backends implement this function
- graphblas : OpenMP-enabled sparse linear algebra backend.
- Additional parameters:
- chunksize : int or str, optional
Split the computation into chunks; may specify size as string or number of rows. Default “256 MiB”
- parallel : A networkx backend that uses joblib to run graph algorithms in parallel. Find nx-parallel’s configuration guide here.
The nodes are chunked into node_chunks and then the square clustering coefficient for all node_chunks is computed in parallel over n_jobs number of CPU cores.
- Additional parameters:
- get_chunks : str, function (default = “chunks”)
  A function that takes in a list of all the nodes (or nbunch) as input and returns an iterable node_chunks. The default chunking is done by slicing the nodes into n_jobs number of chunks.
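If one of these backends is installed, it can typically be selected through the standard backend keyword of the NetworkX dispatch mechanism, with the backend-specific parameters above passed as extra keyword arguments. A hedged sketch, assuming the graphblas-algorithms and nx-parallel packages are available (the even_chunks helper is made up for illustration):

    import networkx as nx

    G = nx.gnp_random_graph(1000, 0.01, seed=42)

    # Default NetworkX implementation.
    c4 = nx.square_clustering(G)

    # graphblas backend, overriding the default chunk size.
    c4_gb = nx.square_clustering(G, backend="graphblas", chunksize="64 MiB")

    # nx-parallel backend with a custom node chunker.
    def even_chunks(nodes, n_chunks=4):
        # Split the node list into n_chunks roughly equal slices.
        nodes = list(nodes)
        step = -(-len(nodes) // n_chunks)  # ceiling division
        return [nodes[i : i + step] for i in range(0, len(nodes), step)]

    c4_par = nx.square_clustering(G, backend="parallel", get_chunks=even_chunks)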