A Rapid Introduction to Adaptive Filtering: Iterative Optimization

Publisher: Springer Berlin Heidelberg
Copyright: © The Author(s) 2013
ISBN: 978-3-642-30298-5
Pages: 19–31
DOI: 10.1007/978-3-642-30299-2_3

Abstract

In this chapter we introduce iterative search methods for minimizing cost functions, in particular the $J_{\mathrm{MSE}}$ function. We focus on the methods of Steepest Descent and Newton-Raphson, which belong to the family of deterministic gradient algorithms. Although these methods still require knowledge of the second-order statistics, as does the Wiener filter, they arrive at the Wiener solution iteratively. We also study the convergence of both algorithms and include simulation results to provide further insight into their performance. Understanding their functioning and convergence properties is important, as they will be the basis for the development of the stochastic gradient adaptive filters in the next chapter.
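As a quick illustration of the two deterministic gradient iterations the abstract refers to, here is a minimal sketch (not taken from the chapter; the function names and the example statistics R and p are hypothetical), assuming the gradient of $J_{\mathrm{MSE}}$ is $2(Rw - p)$ with the factor 2 absorbed into the step size $\mu$:

import numpy as np

# Minimal sketch, assuming the second-order statistics are known:
#   R: input autocorrelation matrix, p: cross-correlation vector.
# Both iterations move toward the Wiener solution w_opt = R^{-1} p.

def steepest_descent(R, p, mu=0.1, n_iter=500):
    # w(n+1) = w(n) + mu * (p - R w(n));
    # converges for 0 < mu < 2 / lambda_max(R).
    w = np.zeros_like(p)
    for _ in range(n_iter):
        w = w + mu * (p - R @ w)
    return w

def newton_raphson(R, p, mu=1.0, n_iter=10):
    # w(n+1) = w(n) + mu * R^{-1} (p - R w(n));
    # with mu = 1 the quadratic cost is minimized in a single step.
    w = np.zeros_like(p)
    R_inv = np.linalg.inv(R)
    for _ in range(n_iter):
        w = w + mu * R_inv @ (p - R @ w)
    return w

# Hypothetical statistics, for illustration only.
R = np.array([[1.0, 0.5],
              [0.5, 1.0]])
p = np.array([0.7, 0.3])
w_opt = np.linalg.solve(R, p)           # Wiener solution
print(steepest_descent(R, p), w_opt)    # approaches w_opt
print(newton_raphson(R, p), w_opt)      # equals w_opt after one step

The contrast in iteration counts reflects the chapter's keywords: the slow mode of Steepest Descent is governed by the smallest eigenvalue of R, so a large eigenvalue spread slows its convergence, while Newton-Raphson removes this curvature mismatch at the cost of inverting R.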

Published: Aug 4, 2012

Keywords: Excess Mean-Square Error (EMSE); Curvature Mismatch; Slow Mode; Eigenvalue Spread
