A Gaussian chain, when averaged over all conformations and orientations, is isotropic: it can be treated as a sphere with a radius equal to its radius of gyration, $R_g$. However, an exercise in the second chapter of Rubinstein’s Polymer Physics demonstrates that $R_g^2$ becomes anisotropic when the coordinate frame is aligned with the end-to-end vector, $\mathbf{R}_{ee}$. Specifically, if $\mathbf{R}_{ee}$ is set as the x-axis, i.e., $\mathbf{R}_{ee}=(R,0,0)^T$, the components of $R_g^2$ are $\frac{Nb^2}{36}$ in the y and z directions and $\frac{Nb^2}{36}+\frac{R^2}{12}$ in the x direction. Here’s a simple proof:
Consider a Gaussian chain pinned between $(0,0,0)^T$ and $\mathbf{R}_{ee}=\mathbf{R}$. This forms a Brownian bridge: the position of segment $i$ follows a multivariate Gaussian distribution centered at $\frac{i}{N}\mathbf{R}$ with total variance $\frac{i(N-i)}{N}b^2$. The proof is straightforward:
$P_{0\mathbf{R}}(\mathbf{r},n)=\frac{G(\mathbf{r},0,n)G(\mathbf{R},\mathbf{r},N-n)}{G(0,\mathbf{R},N)}$
where $G(a,b,n)$ represents the distribution of a Gaussian chain of $n$ segments with ends at points $a$ and $b$. The expression is the probability that segment $n$ of the pinned chain sits at $\mathbf{r}$: a length-$n$ Gaussian chain starting at 0 and ending at $\mathbf{r}$, joined to a length-$(N-n)$ Gaussian chain between $\mathbf{r}$ and $\mathbf{R}$, normalized by the full chain, which is a Gaussian chain of length $N$ with end-to-end vector $\mathbf{R}_{ee}=\mathbf{R}$. It’s easy to show that the distribution of this chain:
$P_{0\mathbf{R}}(\mathbf{r},n)=G\left(\mathbf{r}-\frac{n}{N}\mathbf{R}, 0,\frac{n(N-n)}{N}\right)$
is equivalent to a Gaussian chain segment with ends at $\mathbf{r}$ and $\frac{n}{N}\mathbf{R}$, and an equivalent length of $\frac{n(N-n)}{N}$. The $R_g^2$ is then:
$\begin{align}R_g^2=&\frac{1}{2N^2}\int_0^N\!\!\int_0^N\langle(\mathbf{r}_i-\mathbf{r}_j)^2\rangle\,\mathrm{d}i\,\mathrm{d}j\\
=&\frac{1}{N^2}\int_0^N\!\!\int_j^N\left[\frac{(i-j)^2R^2}{N^2}+\frac{(i-j)(N-(i-j))}{N}b^2\right]\mathrm{d}i\,\mathrm{d}j\\
=&\frac{R^2+Nb^2}{12}\end{align}$
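The double integral is elementary but easy to fumble; as a sanity check, it can be evaluated symbolically, e.g. with sympy (variable names are mine):

```python
import sympy as sp

i, j, N, R, b = sp.symbols('i j N R b', positive=True)

# <(r_i - r_j)^2> for the pinned (Brownian-bridge) chain, taking i > j.
msd = (i - j)**2 * R**2 / N**2 + (i - j) * (N - (i - j)) / N * b**2

# Rg^2 = (1/N^2) * integral over 0 < j < i < N; the factor 2 from
# restricting to i > j cancels the 1/2 in the usual definition.
rg2 = sp.integrate(msd, (i, j, N), (j, 0, N)) / N**2

target = (R**2 + N * b**2) / 12
print(sp.simplify(rg2 - target))  # 0 confirms Rg^2 = (R^2 + N b^2)/12
```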
In this equivalent method, $\langle (\mathbf{r}_i-\mathbf{r}_j)^2\rangle$ is read off from the equivalent Gaussian chain as $\left(\frac{i-j}{N}\right)^2R^2+\frac{|i-j|(N-|i-j|)}{N}b^2$. This works because, despite the chain being ‘pinned,’ it remains a Gaussian chain and is therefore translation invariant along the contour: $\langle (\mathbf{r}_i-\mathbf{r}_j)^2\rangle$ depends only on $|i-j|$, and $P_{0\mathbf{R}}(\mathbf{r},n)$ gives the distribution of $\mathbf{r}_n-\mathbf{r}_0$, which stands for any $n$-segment of the chain. Directly calculating $P(\mathbf{r}_i-\mathbf{r}_j)$ from the convolution of $P_{0\mathbf{R}}$ is incorrect because $\mathbf{r}_i$ and $\mathbf{r}_j$ are correlated: the convolution makes $\langle (\mathbf{r}_i-\mathbf{r}_j)^2\rangle$ depend on $i$ and $j$ separately, violating the translation invariance of the chain.
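The bridge statistics used above can also be checked numerically. A minimal sketch (names and parameters are mine), using the standard construction that subtracting the linear drift $\frac{i}{N}\mathbf{W}_N$ from a free walk $\mathbf{W}$ yields a Brownian bridge; conditioning on a fixed end-to-end vector only shifts the mean, not the variance:

```python
import numpy as np

rng = np.random.default_rng(0)
N, b, trials = 200, 1.0, 5000

# Free Gaussian chains: N steps with <step^2> = b^2 in 3D
# (variance b^2/3 per component).
steps = rng.normal(scale=b / np.sqrt(3), size=(trials, N, 3))
walks = np.cumsum(steps, axis=1)

# Pin the far end: subtracting (i/N) * W_N turns each free walk
# into a Brownian bridge from 0 to 0.
i = np.arange(1, N + 1)
bridges = walks - (i / N)[None, :, None] * walks[:, -1:, :]

# Compare the empirical variance at segment n with n(N-n)/N * b^2.
n = N // 4
var_emp = np.mean(np.sum(bridges[:, n - 1] ** 2, axis=1))
var_th = n * (N - n) / N * b**2
print(var_emp, var_th)  # should agree within a few percent
```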
If $R^2=Nb^2$, we recover $R_g^2=\frac{1}{6}Nb^2$. Component-wise, the bridge contribution $\frac{Nb^2}{12}$ splits equally among the three axes, while the drift term depends on the components of $\mathbf{R}$: $R_{g,\alpha}^2=\frac{Nb^2}{36}+\frac{R_\alpha^2}{12}$ for $\alpha=x,y,z$. Averaged over orientations, $\langle R_\alpha^2\rangle=\frac{R^2}{3}$, so each dimension contributes $\frac{Nb^2+R^2}{36}$. If instead the frame is fixed with $\mathbf{R}_{ee}=(R,0,0)^T$, we obtain $\frac{Nb^2}{36}$ in the y and z directions and $\frac{Nb^2}{36}+\frac{R^2}{12}$ in the x direction, as claimed.
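The per-component result can be verified the same way as the total: the bridge variance splits equally among x, y, z, while the drift term lives entirely along x (a symbolic sketch, with my own variable names):

```python
import sympy as sp

i, j, N, R, b = sp.symbols('i j N R b', positive=True)

# With R_ee = (R, 0, 0), per-component mean-square distances for i > j:
# the bridge variance (i-j)(N-(i-j))/N b^2 splits equally among x, y, z,
# while the drift term (i-j)^2 R^2 / N^2 is entirely along x.
msd_x = (i - j)**2 * R**2 / N**2 + (i - j) * (N - (i - j)) * b**2 / (3 * N)
msd_y = (i - j) * (N - (i - j)) * b**2 / (3 * N)

rg2_x = sp.integrate(msd_x, (i, j, N), (j, 0, N)) / N**2
rg2_y = sp.integrate(msd_y, (i, j, N), (j, 0, N)) / N**2

print(sp.simplify(rg2_x - (N * b**2 / 36 + R**2 / 12)))  # 0
print(sp.simplify(rg2_y - N * b**2 / 36))                # 0
```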
Here’s a simple simulation. For each chain we accumulate the component of $R_g^2$ along the unit end-to-end vector and the total $R_g^2$; since $\langle R^2\rangle=Nb^2$, the predicted ratio is $\left(\frac{Nb^2}{36}+\frac{Nb^2}{12}\right)\big/\frac{Nb^2}{6}=\frac{2}{3}$.
```python
import numpy as np

rg2_ree = rg2 = 0.0
for _ in range(5000):
    # Free Gaussian chain: 1000 steps, unit variance per component.
    ch = np.random.normal(size=(1000, 3))
    ch = np.append(np.zeros((1, 3)), np.cumsum(ch, axis=0), axis=0)
    ch = ch - ch.mean(axis=0)          # center of mass at the origin
    ree = ch[-1] - ch[0]               # end-to-end vector
    ree = ree / np.linalg.norm(ree)    # unit vector along R_ee
    rg2_ree += np.sum(ch.dot(ree)**2)  # N * Rg^2 component along R_ee
    rg2 += np.sum(ch**2)               # N * total Rg^2 (N cancels in the ratio)
print(rg2_ree / rg2)
# Result is 0.666...
```