A batch least squares lattice algorithm

Bibliographic Details
Published in: IEEE Conference on Decision and Control, pp. 3709-3710, vol. 4
Main Author: Aling, H.
Format: Conference Proceeding
Language: English
Published: IEEE, 1992
ISBN: 9780780308725, 0780308727
Description
Summary: A fast square root batch least squares algorithm for autoregressive model structures that requires only seven floating point operations per sample per estimated parameter is derived. Memory requirements, as well as the number of floating point operations, are of order n, where n is the model order. The method is based on estimation of the top block row of the QR transform of the data regression matrix. This is used to derive the parameters using an order-recursive lattice algorithm, after all samples have been processed.
DOI: 10.1109/CDC.1992.371195
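
The summary outlines an O(n) method that works from the top block row of the QR transform of the data regression matrix and recovers the autoregressive parameters with an order-recursive lattice step once all samples have been processed. For reference only, the sketch below is not Aling's algorithm: it is a conventional batch least squares AR fit via an explicit QR factorization in NumPy, showing the problem the paper's lattice method solves at roughly seven flops per sample per parameter. The function name, the AR(2) test process, and its coefficients are illustrative assumptions.

import numpy as np

def ar_batch_least_squares(y, n):
    # Fit y[t] ~ a[0]*y[t-1] + ... + a[n-1]*y[t-n] by batch least squares.
    N = len(y)
    # Data regression matrix: each row holds the n most recent past samples.
    X = np.column_stack([y[n - k - 1:N - k - 1] for k in range(n)])
    b = y[n:]
    # Explicit QR factorization of the regression matrix; per the abstract,
    # Aling's method instead estimates only the top block row of this transform.
    Q, R = np.linalg.qr(X)
    return np.linalg.solve(R, Q.T @ b)

# Example (illustrative): recover the coefficients of a simulated AR(2) process.
rng = np.random.default_rng(0)
a_true = np.array([1.5, -0.7])
y = np.zeros(2000)
for t in range(2, len(y)):
    y[t] = a_true[0] * y[t - 1] + a_true[1] * y[t - 2] + 0.1 * rng.standard_normal()

print(ar_batch_least_squares(y, 2))  # approximately [1.5, -0.7]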