We consider the optimal performance of the blind separation of Gaussian sources. In practice, this estimation problem is solved by a two-step procedure: a set of covariance matrices is estimated from the observed data, and the unmixing matrix is then found by approximate joint diagonalization of this set. Rather than studying the theoretical performance of a specific method, we are interested in the optimal performance attainable by any estimator. To do so, we consider the so-called intrinsic Cramér-Rao bound, which exploits the geometry of the model parameters. Unlike previous works deriving a Cramér-Rao bound in this context, our solution requires no additional hypotheses. To obtain our bound, we define and study a new Riemannian manifold containing the parameters of interest. An original estimation error measure is defined via our Riemannian distance function. The corresponding Fisher information matrix is then obtained from the Fisher information metric and orthonormal bases on the tangent spaces of the manifold. Finally, our theoretical results are validated on simulated data.
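The two-step procedure mentioned above can be illustrated with a minimal sketch. The setup below is hypothetical and not the paper's method: it assumes two nonstationary Gaussian sources whose variances change between two time segments, so that only two covariance matrices are estimated, in which case approximate joint diagonalization reduces to an exact generalized eigendecomposition (`scipy.linalg.eigh` on the matrix pair).

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

# Hypothetical demo: 2 Gaussian sources whose variances differ across
# two time segments -- the nonstationarity that makes them separable.
n = 5000
s1 = rng.standard_normal((2, n)) * np.array([[1.0], [2.0]])  # segment 1 std devs
s2 = rng.standard_normal((2, n)) * np.array([[2.0], [1.0]])  # segment 2 std devs

A = rng.standard_normal((2, 2))   # unknown mixing matrix
x1, x2 = A @ s1, A @ s2           # observed mixtures in each segment

# Step 1: estimate one covariance matrix per segment from the observations.
C1 = x1 @ x1.T / n
C2 = x2 @ x2.T / n

# Step 2: joint diagonalization. With exactly two matrices this is exact:
# the generalized eigenvectors of the pair (C1, C2) give the unmixing rows.
_, V = eigh(C1, C2)
W = V.T                           # candidate unmixing matrix

# Up to permutation and scaling of the sources, W @ A is near-diagonal.
G = np.abs(W @ A)
G /= G.max(axis=1, keepdims=True)
```

With more than two covariance matrices, no exact joint diagonalizer exists in general, and an iterative approximate joint diagonalization algorithm is used instead.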