In this chapter we introduce iterative search methods for minimizing cost functions, and in particular the $J_{\mathrm{MSE}}$ function. We focus on the methods of Steepest Descent and Newton-Raphson, which belong to the family of deterministic gradient algorithms. Although these methods still require knowledge of the second-order statistics, as does the Wiener filter, they find the Wiener solution iteratively. We also study the convergence of both algorithms and include simulation results to provide more insight into their performance. Understanding their functioning and convergence properties is very important, as they will be the basis for the development of stochastic gradient adaptive filters in the next chapter.
Published: Aug 4, 2012
Keywords: Excess Mean-square Error (EMSE); Curvature Mismatch; Slow Mode; Eigenvalue Spread
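The two deterministic gradient updates mentioned in the abstract can be made concrete with a short sketch. The example below is not taken from the chapter; it assumes a hypothetical 2-tap problem with a known autocorrelation matrix R and cross-correlation vector p, and shows a Steepest Descent update and a Newton-Raphson-style update (the gradient scaled by the inverse of R) both converging iteratively to the Wiener solution.

```python
import numpy as np

# Hypothetical second-order statistics for a 2-tap example (not from the chapter):
# R is the input autocorrelation matrix, p the cross-correlation vector.
R = np.array([[1.0, 0.8],
              [0.8, 1.0]])
p = np.array([0.9, 0.2])
w_wiener = np.linalg.solve(R, p)   # Wiener solution R^{-1} p

mu = 0.1                           # step size, assumed small enough for stability
w_sd = np.zeros(2)                 # Steepest Descent iterate
w_nr = np.zeros(2)                 # Newton-Raphson iterate

for _ in range(500):
    # Gradient of J_MSE (up to a constant factor): R w - p
    w_sd = w_sd - mu * (R @ w_sd - p)
    # Newton-Raphson scales the gradient by R^{-1}, equalizing all modes
    w_nr = w_nr - mu * np.linalg.solve(R, R @ w_nr - p)

print("Wiener solution :", w_wiener)
print("Steepest Descent:", w_sd)
print("Newton-Raphson  :", w_nr)
```

In this sketch the Steepest Descent modes decay at rates set by the eigenvalues of R (hence the sensitivity to eigenvalue spread discussed in the chapter), while the Newton-Raphson iterate converges at a single rate governed only by the step size.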