One way to construct adaptive algorithms leads to the so-called Stochastic Gradient algorithms, which are the subject of this chapter. The most important algorithm in this family, the Least Mean Square (LMS) algorithm, is obtained from the SD algorithm by employing suitable estimators of the correlation matrix and cross-correlation vector. Other important algorithms, such as the Normalized Least Mean Square (NLMS) and the Affine Projection (APA) algorithms, are obtained as straightforward generalizations of the LMS algorithm. One of the most useful properties of adaptive algorithms is their ability to track variations in the signal statistics. Because they are implemented using stochastic signals, the update directions in these adaptive algorithms become subject to random fluctuations called gradient noise. This leads to the question of the performance (in statistical terms) of these systems. In this chapter we give a succinct introduction to this kind of adaptive filter and to its most relevant characteristics.
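To make the idea concrete, the LMS update replaces the exact gradient of the SD algorithm with an instantaneous estimate, giving the weight recursion w[n+1] = w[n] + mu·e[n]·u[n], where e[n] is the estimation error and u[n] the current input vector. A minimal sketch in Python/NumPy (the function name, signal names, and the system-identification usage below are illustrative assumptions, not from the chapter):

```python
import numpy as np

def lms(x, d, num_taps, mu):
    """Least Mean Square adaptive filter (illustrative sketch).

    x        : input signal (1-D array)
    d        : desired signal (1-D array, same length as x)
    num_taps : number of filter coefficients
    mu       : step size, trading adaptation speed against gradient noise
    """
    w = np.zeros(num_taps)              # filter weights
    e = np.zeros(len(x))                # error signal
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]  # most recent samples, newest first
        y = w @ u                       # filter output
        e[n] = d[n] - y                 # estimation error
        w = w + mu * e[n] * u           # stochastic-gradient weight update
    return w, e

# Hypothetical usage: identify an unknown FIR system h from input/output data.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.2])          # "unknown" system to identify
d = np.convolve(x, h)[:len(x)]          # desired signal = system output
w, e = lms(x, d, num_taps=3, mu=0.05)
```

In this noiseless system-identification setting the weights converge to h and the error decays toward zero; with measurement noise, a residual steady-state error proportional to mu remains, which is the gradient-noise effect the chapter analyzes.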
Published: Aug 4, 2012
Keywords: Adaptive Algorithm; Least Mean Square; Adaptive Filter; Mean Square Deviation; Steady State Error