10 Apr 2026

If you are reading this, you are probably familiar with the trace. It is defined as the sum of the diagonal entries of a matrix and is therefore linear: for a field $\mathbf F$, matrices $X, Y \in \mathscr M_n (\mathbf F)$, and scalars $\alpha, \beta \in \mathbf F$, $$ \tr (\alpha X + \beta Y) = \alpha \tr X + \beta \tr Y. $$ Another, more remarkable property of the trace is that it is cyclic, meaning that the factors of a product may be commuted without changing its trace: $$ \tr (XY) = \tr (YX); $$ this property in particular makes the trace special, because we will now prove that a cyclic linear form must be precisely a multiple of the trace.
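Before diving into the proof, here is a quick numerical sanity check of both properties on random matrices (my own script, not part of the original argument; it assumes NumPy):

```python
# Verify linearity and cyclicity of the trace on random matrices.
import numpy as np

rng = np.random.default_rng(0)
n = 4
X = rng.standard_normal((n, n))
Y = rng.standard_normal((n, n))
a, b = 2.0, -3.0

# Linearity: tr(aX + bY) = a tr X + b tr Y
assert np.isclose(np.trace(a * X + b * Y), a * np.trace(X) + b * np.trace(Y))

# Cyclicity: tr(XY) = tr(YX)
assert np.isclose(np.trace(X @ Y), np.trace(Y @ X))
print("linearity and cyclicity hold")
```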

Given a field $\mathbf F$ and a linear form $\phi: \mathscr M_n (\mathbf F) \longrightarrow \mathbf F$, $\phi$ is cyclic if and only if $\phi(X) = \alpha \tr X$ for some $\alpha \in \mathbf F$.
The "if" part – that a multiple of the trace is cyclic – is well-known and relatively straightforward to prove, so you are invited to do it yourself; as a blueprint, prove the result for the canonical basis of $\mathscr M_n (\mathbf F)$, distribute the product, commute, and regroup the terms.
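The blueprint above amounts to expanding $\tr(XY)$ as the double sum $\sum_{i,j} x_{ij} y_{ji}$, which is symmetric in $X$ and $Y$. A small script makes this expansion concrete (again my own check, assuming NumPy):

```python
# Expand tr(XY) as the double sum over entries, mirroring the blueprint:
# tr(XY) = sum_i (XY)_ii = sum_i sum_j x_ij * y_ji, symmetric in X and Y.
import numpy as np

rng = np.random.default_rng(1)
n = 3
X = rng.standard_normal((n, n))
Y = rng.standard_normal((n, n))

double_sum = sum(X[i, j] * Y[j, i] for i in range(n) for j in range(n))
assert np.isclose(double_sum, np.trace(X @ Y))
assert np.isclose(double_sum, np.trace(Y @ X))
```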

The less-known fact is the converse – that a cyclic linear form is a multiple of the trace – and the way to proceed is to show that $\phi(E_{ij}) = \alpha \delta_{ij}$, where $E_{ij}$ is the matrix with a 1 in the $i$th row and $j$th column and zeros everywhere else, and $\delta_{ij}$ (the Kronecker delta) is 1 if $i=j$ and 0 otherwise. Notice first that $$ E_{ij} E_{ii} = \delta_{ij} E_{ii}, $$ because the $i$th row of $E_{ij}$ – the only one that contains a 1 – selects the $j$th row of $E_{ii}$, which is zero unless $i=j$, and the remaining rows are clearly zero (a few examples by hand may help). On the other hand, $$ E_{ii} E_{ij} = E_{ij}, $$ so assuming $\phi$ is cyclic, $$ \phi(E_{ij}) = \phi(E_{ii}E_{ij}) = \phi(E_{ij}E_{ii}) = \delta_{ij} \phi(E_{ii}) = \alpha_i \delta_{ij}, $$ where $\alpha_i := \phi(E_{ii})$; therefore, by linearity, only the diagonal elements of a matrix count towards $\phi$, and it now suffices to show that all the $\alpha_i$'s are equal.

Denote by $\Gamma_{ij}$ the matrix resulting from interchanging the $i$th and $j$th rows of the identity matrix: $$ \Gamma_{ij} = \begin{pmatrix} 1\\ & \ddots\\ & & 0 & \cdots & 1\\ & & \vdots & \ddots & \vdots\\ & & 1 & \cdots & 0\\ & & & & & \ddots\\ & & & & & & 1 \end{pmatrix}, $$ that is, the elementary matrix corresponding to a row interchange. It is clear that $\trans \Gamma_{ij} = \Gamma_{ij}$ and $\Gamma_{ij}^2 = \mathbf 1$, and it also follows that $$ \Gamma_{ij} E_{ii} \Gamma_{ij} = \Gamma_{ij} \trans{(\trans \Gamma_{ij} \trans E_{ii})} = \Gamma_{ij} \trans{(\Gamma_{ij} E_{ii})} = \Gamma_{ij} \trans{E_{ji}} = \Gamma_{ij} E_{ij} = E_{jj}. $$ Putting everything together, $$ \alpha_i = \phi(E_{ii}) = \phi(E_{ii} \Gamma_{ij} \Gamma_{ij}) = \phi(\Gamma_{ij} E_{ii} \Gamma_{ij}) = \phi(E_{jj}) = \alpha_j, $$ so indeed $\alpha := \alpha_1 = \alpha_2 = \cdots = \alpha_n$, and by linearity $$ \phi(X) = \phi\left(\sum_{i=1}^n \sum_{j=1}^n x_{ij} E_{ij} \right) = \sum_{i=1}^n \sum_{j=1}^n x_{ij} \phi(E_{ij}) = \sum_{i=1}^n \alpha x_{ii} = \alpha \tr X. $$
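The matrix identities that carry the proof are easy to check numerically. The sketch below (my own verification, assuming NumPy; note it uses 0-based indices while the text uses 1-based) confirms each step for a sample pair $i \neq j$:

```python
# Check the elementary-matrix identities used in the proof:
# E_ij E_ii = delta_ij E_ii, E_ii E_ij = E_ij, and Gamma E_ii Gamma = E_jj.
import numpy as np

def E(i, j, n):
    """Matrix unit: 1 at row i, column j (0-based), zeros elsewhere."""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

def Gamma(i, j, n):
    """Identity matrix with rows i and j interchanged."""
    M = np.eye(n)
    M[[i, j]] = M[[j, i]]
    return M

n, i, j = 4, 1, 3
assert np.array_equal(E(i, j, n) @ E(i, i, n), np.zeros((n, n)))  # i != j: product vanishes
assert np.array_equal(E(i, i, n) @ E(i, i, n), E(i, i, n))        # i == j: product is E_ii
assert np.array_equal(E(i, i, n) @ E(i, j, n), E(i, j, n))        # E_ii E_ij = E_ij
G = Gamma(i, j, n)
assert np.array_equal(G.T, G)                                     # Gamma is symmetric
assert np.array_equal(G @ G, np.eye(n))                           # Gamma squares to 1
assert np.array_equal(G @ E(i, i, n) @ G, E(j, j, n))             # conjugation moves E_ii to E_jj
print("all identities verified")
```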

This result is actually a problem I saw a long time ago in Nicholson's Linear Algebra with Applications, which I did not solve at the time but found very nice. I am glad it is finally settled for me.