Eigenvector Centrality

Overview

Eigenvector centrality is one of several node metrics that characterize the "global" (as opposed to "local") prominence of a vertex in a graph. Others include Katz centrality, Bonacich's power centrality, and PageRank.

The gist of eigenvector centrality is to compute the centrality of a node as a function of the centralities of its neighbors.

What is an Eigenvector?

Consider the graph below and its 5x5 adjacency matrix, A.

eigenvector-01.gif

And then consider x, a 5x1 vector of values, one for each vertex in the graph. In this case, we've used the degree centrality of each vertex.

eigenvector-02.gif
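
Since the figures aren't reproduced here, the Python sketch below builds a stand-in 5-vertex graph: vertex 1 joined to vertices 2, 3, and 4, as the text below describes, plus two assumed edges among the remaining vertices. With NumPy, the adjacency matrix A and the degree-centrality vector x are then:

```python
import numpy as np

# Stand-in for the graph in the figure: vertex 1 is joined to vertices 2, 3, and 4
# (as described below); the remaining edges (2-3 and 4-5) are assumed for illustration.
edges = [(1, 2), (1, 3), (1, 4), (2, 3), (4, 5)]

# Build the symmetric 5x5 adjacency matrix A; row/column i corresponds to vertex i+1.
A = np.zeros((5, 5), dtype=int)
for i, j in edges:
    A[i - 1, j - 1] = 1
    A[j - 1, i - 1] = 1

# x holds one value per vertex -- here the degree centrality, i.e. the row sums of A.
x = A.sum(axis=1)
print(A)
print(x)   # [3 2 2 2 1] for this assumed graph
```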

Now let's look at what happens when we multiply the vector x by the matrix A. The result, of course, is another 5x1 vector.

eigenvector-03.gif

If we look closely at the first element of the resulting vector, we see that the 1s in the A matrix "pick up" the values of each vertex to which the first vertex is connected (in this case, the second, third, and fourth), so the resulting value is the sum of the values those vertices had.

In other words, multiplication by the adjacency matrix reassigns to each vertex the sum of the values of its neighbors.

eigenvector-07.gif
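
To see the same step numerically, the sketch below reuses the assumed adjacency matrix from above, multiplies it by x, and checks that the first entry of the result is the sum of the values at vertices 2, 3, and 4:

```python
import numpy as np

# The same assumed adjacency matrix as in the previous sketch.
A = np.array([[0, 1, 1, 1, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 0, 0],
              [1, 0, 0, 0, 1],
              [0, 0, 0, 1, 0]])

x = A.sum(axis=1)   # degree centralities: [3 2 2 2 1]
y = A @ x           # multiply the vector by the adjacency matrix

# The first entry of y is the sum of the values at vertex 1's neighbors (2, 3, and 4).
print(y[0] == x[1] + x[2] + x[3])   # True
print(y)   # every vertex now holds the sum of its neighbors' values
```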

This has, in effect, "spread out" the degree centrality. That this is moving in the direction of a reasonable metric for centrality can be seen better if we rearrange the graph a little bit:

eigenvector-06.gif

Suppose we multiplied the resulting vector by A again; how might we interpret that? In effect, we'd be allowing the centrality values to once again "spread" across the edges of the graph, and we'd notice that the spread goes in both directions (vertices both give to and get from their neighbors). We might speculate that this process would eventually reach an equilibrium, a point at which the amount coming into a given vertex is in balance with the amount going out to its neighbors. Since we are just adding things up, the numbers would keep getting bigger, but we could reach a point where each vertex's share of the total remains stable.

At that point we might imagine that all of the "centrality-ness" of the graph had equilibrated and the value of each node completely captured the centrality of all of its neighbors, all the way out to the farthest reaches of the graph.
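
This repeated multiply-and-rescale process is exactly power iteration. Here is a minimal sketch, again using the assumed adjacency matrix: after each multiplication the vector is rescaled to sum to 1, so what we watch is each vertex's share of the total, which settles down after a handful of steps.

```python
import numpy as np

A = np.array([[0, 1, 1, 1, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 0, 0],
              [1, 0, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

x = A.sum(axis=1)    # start from the degree centralities
x = x / x.sum()      # track each vertex's share of the total

for step in range(10):
    x = A @ x        # let the values "spread" across the edges once more
    x = x / x.sum()  # the raw numbers keep growing, but the shares settle down
    print(step, np.round(x, 4))
```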

Now imagine a more generic vector of values for our vertices, say (v,w,x,y,z), where the letters represent unknown values. Let's write out what multiplying this vector by the matrix A would look like:

eigenvector-04.gif
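
The figure itself isn't reproduced here, but with the stand-in adjacency matrix used in the sketches above (only its first row, the connections of vertex 1, is pinned down by the text), the write-out looks like this: each entry of the product is simply the sum of the unknown values at that vertex's neighbors.

\begin{align} \left( \begin{array}{ccccc} 0 & 1 & 1 & 1 & 0 \\ 1 & 0 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 \end{array} \right) \left( \begin{array}{c} v \\ w \\ x \\ y \\ z \end{array} \right) = \left( \begin{array}{c} w + x + y \\ v + x \\ v + w \\ v + z \\ y \end{array} \right) \end{align}
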
A Short Aside

When we multiply a vector or matrix by just a number (called a "scalar"), how does it work? What, for example, would be twice the vector (1,3)? The answer is simple: you just multiply each element by the scalar. Thus the result would be (2,6). The same is true for a matrix:

(1)
\begin{align} 5\times \left( \begin{array}{cc} a & b \\ c & d \end{array} \right) = \left( \begin{array}{cc} 5a & 5b \\ 5c & 5d \end{array}\right) \end{align}

And, in general, we write this "scalar multiplication" with a non-bolded variable in front of the vector or matrix. Thus, $b{\bf M}$ would mean the matrix ${\bf M}$ multiplied by the scalar $b$.
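
A quick numerical check of equation (1) and the (1,3) example above, assuming NumPy is available:

```python
import numpy as np

print(5 * np.array([[1, 2],
                    [3, 4]]))   # every entry is multiplied by the scalar 5

print(2 * np.array([1, 3]))     # twice the vector (1,3) is (2,6)
```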

Back to the Graph and Centrality

So we can write the multiplication we set up before the aside compactly as ${\bf Ax} = {\bf y}$. But our point was that there might be some set of values on our vertices (that is to say, some ${\bf x}$) for which multiplication by ${\bf A}$ would not change the relative sizes of the vector's components; the numbers would get bigger, but all by the same factor. We can express this as follows:

(2)
\begin{align} {\bf Ax} = \lambda {\bf x} \end{align}
(3)
\begin{align} {\bf A} \times \left( \begin{array}{c} x_{1} \\ x_{2} \\ \vdots \\ x_{n} \end{array} \right) = \left( \begin{array}{c} \lambda x_{1} \\ \lambda x_{2} \\ \vdots \\ \lambda x_{n} \end{array} \right) \end{align}

A vector with this property (multiply it by the graph's adjacency matrix and you get the same vector back, scaled by a constant) is a characteristic of that particular adjacency matrix. In other words, there is something about the particular pattern of connections in this graph that leads to a specific set of vertex values with the equilibrium property described above.

This vector is called an Eigenvector of the matrix ${\bf A}$, and the scalar $\lambda$ is the corresponding Eigenvalue.

A matrix generally has several eigenvectors; for centrality we take the one belonging to the largest eigenvalue, whose entries can all be made non-negative. The elements of this vector are the Eigenvector centralities of the vertices of the graph.
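
As a concrete sketch of the computation, again with NumPy and the assumed stand-in adjacency matrix (so the numbers illustrate the idea, not the graph in the original figures):

```python
import numpy as np

# The assumed stand-in adjacency matrix used in the earlier sketches.
A = np.array([[0, 1, 1, 1, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 0, 0],
              [1, 0, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

# Eigendecomposition of the symmetric adjacency matrix.
eigenvalues, eigenvectors = np.linalg.eigh(A)

# Keep the eigenvector belonging to the largest eigenvalue, and rescale it so
# its entries are positive and sum to 1 (only the relative sizes matter).
v = eigenvectors[:, np.argmax(eigenvalues)]
v = np.abs(v) / np.abs(v).sum()

print(np.round(v, 4))   # the eigenvector centrality of each vertex

# Check the defining property: A v is (approximately) lambda times v.
lam = eigenvalues.max()
print(np.allclose(A @ v, lam * v))   # True
```

The shares produced by the power-iteration sketch earlier settle to this same vector; repeated multiplication of that kind is, in fact, a common way eigenvector centrality is computed in practice.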

References and Further Exploration
Borgatti, Stephen. Centrality for MGMT 780 at UK School of Business.