Lattice-reduction-aided detection (LRAD) has been shown to considerably improve the performance of linear multiple-input multiple-output (MIMO) detection. In previous work, we introduced Adaptive Lattice Reduction, which exploits channel correlation to yield lattice-reduction-based detectors of considerably reduced complexity. In this paper, we review Adaptive Lattice Reduction and examine the performance and complexity of this detection scheme in both time-varying and frequency-selective fading channels. For typical mobile environments, for example, we demonstrate a complexity reduction approaching 70% on a 4 × 4 system without compromising performance.