Adaptive beamforming is capable of separating user signals transmitted on the same carrier frequency, and thus provides a practical means of supporting multiple users in a space-division multiple-access scenario. Moreover, to further improve the achievable bandwidth efficiency, high-throughput quadrature amplitude modulation (QAM) schemes have become popular in numerous wireless network standards, notably in the recent WiMax standard. This contribution focuses on the design of adaptive beamforming assisted detection for employment in multiple-antenna aided multiuser systems that use high-order QAM signalling. Traditionally, the minimum mean square error (MMSE) design has been regarded as the state of the art for adaptive beamforming assisted receivers. However, recent work (Chen et al., 2006) proposed a novel minimum symbol error rate (MSER) design for the beamforming assisted receiver, and demonstrated that the MSER design provides significant performance enhancement, in terms of achievable symbol error rate, over the standard MMSE design. This MSER beamforming design is developed fully in this contribution. In particular, an adaptive implementation of the MSER beamforming solution, referred to as the least symbol error rate algorithm, is investigated extensively. The proposed adaptive MSER beamforming scheme is evaluated by simulation and compared with the adaptive MMSE beamforming benchmark.
The ever-increasing demand for mobile communication capacity has motivated the development of antenna array assisted spatial processing techniques (Winters et al., 1994; Litva & Lo, 1996; Godara, 1997; Kohno, 1998; Winters, 1998; Petrus et al., 1998; Tsoulos, 1999; Vandenameele et al., 2001; Blogh & Hanzo, 2002; Soni et al., 2002; Paulraj et al., 2003; Paulraj et al., 2004; Tse & Viswanath, 2005) in order to further improve the achievable spectral efficiency. A specific technique that has shown real promise in achieving substantial capacity enhancements is the use of adaptive beamforming with antenna arrays (Litva & Lo, 1996; Blogh & Hanzo, 2002). By appropriately combining the signals received by the different elements of an antenna array, adaptive beamforming is capable of separating user signals transmitted on the same carrier frequency, provided that they are sufficiently separated in the angular or spatial domain. Adaptive beamforming thus provides a practical means of supporting multiple users in a space-division multiple-access scenario. To further improve the achievable bandwidth efficiency, high-throughput quadrature amplitude modulation (QAM) schemes (Hanzo et al., 2004) have become popular in numerous wireless network standards. For example, the 16-QAM and 64-QAM schemes were adopted in the recent WiMax standard (IEEE 802.16). Classically, the beamforming process is carried out by minimising the mean square error (MSE) between the desired output and the actual array output, a principle rooted in the traditional beamforming employed in sonar and radar systems. An advantage of this minimum MSE (MMSE) beamforming design is that its adaptive implementation can readily be achieved using the well-known least mean square (LMS) algorithm, the recursive least squares (RLS) algorithm and many other adaptive schemes (Widrow et al., 1967; Griffiths, 1969; Reed et al., 1974; Widrow & Stearns, 1985; Ganz et al., 1990; Haykin, 1996).
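As an illustrative sketch (not taken from this chapter), the following Python fragment adapts the weights of a hypothetical four-element uniform linear array towards the MMSE solution using the complex-valued LMS update w(k+1) = w(k) + mu x(k) e*(k), where e(k) = d(k) - w^H x(k) is the a-priori error. The array geometry, user angles, step size and noise level are all assumed purely for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scenario: a 4-element half-wavelength-spaced array receives
# QPSK symbols from a desired user at 0 degrees and an interferer at 40
# degrees; all parameter values below are illustrative.
n_elements = 4
angles = np.deg2rad([0.0, 40.0])            # desired user first

def steering_vector(theta, n):
    # Narrowband plane-wave model with lambda/2 element spacing.
    return np.exp(1j * np.pi * np.arange(n) * np.sin(theta))

A = np.column_stack([steering_vector(t, n_elements) for t in angles])

n_symbols = 5000
qpsk = (rng.choice([-1.0, 1.0], (2, n_symbols))
        + 1j * rng.choice([-1.0, 1.0], (2, n_symbols))) / np.sqrt(2)
noise = 0.05 * (rng.standard_normal((n_elements, n_symbols))
                + 1j * rng.standard_normal((n_elements, n_symbols)))
X = A @ qpsk + noise                         # received array snapshots

# Complex LMS adaptation: e = d - w^H x, then w <- w + mu * x * conj(e).
mu = 0.01
w = np.zeros(n_elements, dtype=complex)
for k in range(n_symbols):
    x = X[:, k]
    e = qpsk[0, k] - np.conj(w) @ x          # desired user's symbol as reference
    w = w + mu * x * np.conj(e)

# With the converged weights, the array output tracks the desired user.
y = np.conj(w) @ X
mse = np.mean(np.abs(qpsk[0] - y) ** 2)
print(f"steady-state MSE estimate: {mse:.4f}")
```

The converged weight vector exhibits close to unit gain towards the desired user's steering vector while suppressing the interferer's direction, which is the spatial separation property described above.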
For potential use in a downlink adaptive beamforming receiver, we consider only the stochastic gradient-based LMS algorithm in this study, owing to its computational simplicity. The MMSE design has been regarded as the state of the art for adaptive beamforming assisted receivers, despite the fact that, for a communication system, it is the bit error rate (BER) or symbol error rate (SER) that really matters.
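To make the SER criterion concrete, the symbol error rate of a QAM link is typically estimated by Monte Carlo symbol counting and checked against the standard nearest-neighbour approximation P_s ≈ 4(1 - 1/√M) Q(√(3·SNR/(M-1))) for square M-QAM over an AWGN channel. The sketch below does this for 16-QAM; the SNR value and sample size are assumptions chosen purely for illustration.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(1)

def q_func(x):
    # Gaussian tail probability Q(x) via the complementary error function.
    return 0.5 * erfc(x / sqrt(2))

def qam_ser_approx(M, snr_db):
    # Nearest-neighbour SER approximation for square M-QAM over AWGN:
    #   P_s ~= 4 (1 - 1/sqrt(M)) Q( sqrt(3 SNR / (M - 1)) ),  SNR = Es/N0.
    snr = 10.0 ** (snr_db / 10.0)
    return 4 * (1 - 1 / sqrt(M)) * q_func(sqrt(3 * snr / (M - 1)))

# Monte Carlo check for 16-QAM at an assumed Es/N0 of 15 dB.
M, snr_db, n = 16, 15.0, 200_000
m = int(sqrt(M))
levels = 2 * np.arange(m) - (m - 1)                  # -3, -1, 1, 3
const = (levels[:, None] + 1j * levels[None, :]).ravel()
const = const / sqrt(np.mean(np.abs(const) ** 2))    # unit average energy

tx = rng.choice(const, n)
n0 = 10.0 ** (-snr_db / 10.0)                        # noise power for Es = 1
noise = sqrt(n0 / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
rx = tx + noise

# Minimum-distance detection and error counting.
det = const[np.argmin(np.abs(rx[:, None] - const[None, :]) ** 2, axis=1)]
ser_sim = np.mean(det != tx)
print(f"simulated SER: {ser_sim:.4f}, approximation: {qam_ser_approx(M, snr_db):.4f}")
```

The simulated error count agrees closely with the analytical approximation, and it is exactly this SER quantity, rather than the MSE, that the MSER design developed in this contribution optimises directly.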