This paper establishes a method for quantifying variance error in cases where the input spectral density has a rational factorisation. Compared to previous work, which has relied on asymptotic-in-model-order techniques and yielded expressions that are only approximate for finite orders, the quantifications provided here are exact for finite model order, although they still apply only asymptotically in the observed data length. The key principle employed here is the use of a reproducing kernel to characterise the model class, and this leads to a certain geometric approach to the problem.