In this paper, the channel for the downlink of Long-Term Evolution (LTE), which uses Orthogonal Frequency Division Multiple Access (OFDMA), is modelled and estimated. The Rayleigh channel is approximated as a Jakes process, which is modelled using an autoregressive (AR) model. An iterative Kalman filtering algorithm for estimation of the time-variant, fast-fading Rayleigh channel is proposed. The AR channel model provides the state-space description needed for Kalman-filter-based channel estimation. Using state-space concepts, the Kalman algorithm computes the channel matrix, which can then be used to estimate the transmitted baseband signal. Since this algorithm uses both pilot sequences and the underlying channel model to estimate the channel, it is more bandwidth-efficient than purely data-based algorithms. The channel quality index obtained with this estimation technique can be used in the dynamic allocation of subcarriers to multiple users. The performance is compared with a blind, subspace-based channel estimation technique using Singular Value Decomposition (SVD). The simulation results verify that significant gains in signal-to-noise ratio (SNR) and bit-error rate (BER) are achieved with the Kalman filter compared to SVD.
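To illustrate the approach described in the abstract, the following is a minimal sketch (not the paper's actual implementation) of Kalman estimation of a fading channel modelled as a scalar AR(1) process observed through known pilot symbols. The AR coefficient, noise variances, and pilot design below are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: the channel h[k] follows an AR(1) approximation of the
# Jakes Doppler process, h[k] = a*h[k-1] + w[k], and is observed through known
# pilots s[k] as y[k] = s[k]*h[k] + v[k]. All parameter values are assumed.
rng = np.random.default_rng(0)
a = 0.99          # AR(1) coefficient (slowly varying channel)
q = 1 - a**2      # process-noise variance keeping E|h|^2 near 1
r = 0.01          # measurement-noise variance
N = 500

# Simulate the Rayleigh channel and the pilot observations
h = np.zeros(N, dtype=complex)
for k in range(1, N):
    w = np.sqrt(q / 2) * (rng.standard_normal() + 1j * rng.standard_normal())
    h[k] = a * h[k - 1] + w
s = np.exp(1j * 2 * np.pi * rng.random(N))     # unit-modulus pilot symbols
v = np.sqrt(r / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
y = s * h + v

# Scalar Kalman filter recursion over the pilot observations
h_hat = np.zeros(N, dtype=complex)
P = 1.0                                        # initial error variance
for k in range(1, N):
    # Predict step from the AR(1) state model
    h_pred = a * h_hat[k - 1]
    P_pred = a**2 * P + q
    # Update step using the pilot observation y[k]
    K = P_pred * np.conj(s[k]) / (abs(s[k])**2 * P_pred + r)
    h_hat[k] = h_pred + K * (y[k] - s[k] * h_pred)
    P = (1 - K * s[k]).real * P_pred

mse = np.mean(np.abs(h - h_hat)**2)            # tracking error of the filter
```

In a full OFDMA receiver this scalar recursion would be extended to a vector state over the subcarriers of interest, and the resulting channel estimates (and their error covariance, a proxy for channel quality) could feed the multi-user subcarrier allocation mentioned in the abstract.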
Dr. Kirthiga S. and Dr. Jayakumar M., “AutoRegressive channel modeling and estimation using Kalman filter for downlink LTE systems”, in Proceedings of the 1st Amrita ACM-W Celebration on Women in Computing in India, 2010.