This paper considers a Bayesian approach to linear system identification. One motivation is the minimum mean square error property of the associated conditional mean estimate. A further motivation is the error quantification afforded by the posterior density, which does not rely on asymptotic-in-data-length derivations. To compute these posterior quantities, this paper derives and illustrates a Gibbs sampling approach, a randomized algorithm in the family of Markov chain Monte Carlo methods. We provide details of a numerically robust implementation of the Gibbs sampler. A numerical example illustrates that the proposed method achieves good convergence without requiring any user tuning.
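The paper's exact model structure and priors are not reproduced here; as an illustrative sketch only, the following shows a Gibbs sampler for a generic linear-in-parameters model y = Xθ + e with an assumed Gaussian prior on θ and an assumed inverse-gamma prior on the noise variance (standard conjugate choices, not necessarily those of the paper). The two conditional posteriors are sampled in alternation, with a Cholesky-based draw of θ for numerical robustness; all hyperparameter values are placeholders.

```python
import numpy as np

def gibbs_linear(y, X, n_iter=2000, burn_in=500, seed=0):
    """Gibbs sampler for y = X @ theta + e, e ~ N(0, sigma2 * I),
    with assumed priors theta ~ N(0, tau2 * I), sigma2 ~ Inv-Gamma(a0, b0)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    tau2, a0, b0 = 100.0, 1e-3, 1e-3          # weak hyperparameters (assumed)
    XtX, Xty = X.T @ X, X.T @ y
    theta, sigma2 = np.zeros(p), 1.0
    draws_theta, draws_sigma2 = [], []
    for it in range(n_iter):
        # theta | sigma2, y ~ N(mu, prec^{-1}): conjugate Gaussian update
        prec = XtX / sigma2 + np.eye(p) / tau2
        mu = np.linalg.solve(prec, Xty / sigma2)
        # draw via Cholesky factor of the precision for numerical robustness
        L = np.linalg.cholesky(prec)
        theta = mu + np.linalg.solve(L.T, rng.standard_normal(p))
        # sigma2 | theta, y ~ Inv-Gamma(a0 + n/2, b0 + ||y - X theta||^2 / 2)
        resid = y - X @ theta
        a = a0 + 0.5 * n
        b = b0 + 0.5 * resid @ resid
        sigma2 = 1.0 / rng.gamma(a, 1.0 / b)  # inverse-gamma via gamma draw
        if it >= burn_in:
            draws_theta.append(theta)
            draws_sigma2.append(sigma2)
    return np.array(draws_theta), np.array(draws_sigma2)

# Usage on simulated data (hypothetical parameter values)
rng = np.random.default_rng(1)
theta_true = np.array([0.8, -0.4])
X = rng.standard_normal((500, 2))
y = X @ theta_true + 0.1 * rng.standard_normal(500)
th, s2 = gibbs_linear(y, X)
```

The retained draws `th` and `s2` approximate the joint posterior, so the conditional mean estimate is simply the sample mean of `th`, and posterior credible intervals follow from its empirical quantiles without any asymptotic argument.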