TDARMA Model Estimation Using the MLS and the TF Distribution
An approach for modeling linear time-dependent autoregressive moving-average (TDARMA) systems using the time-frequency (TF) distribution is presented. The proposed method extends several well-known techniques for linear time-invariant (LTI) systems to the linear time-varying (LTV) case, and it can also be applied to the modeling of non-stationary signals. In this paper, the well-known modified least squares (MLS) method and Durbin's approximation method are adapted to this non-stationary context. A simple relationship between the generalized transfer function and the time-dependent parameters of the LTV system is derived, and computer simulations illustrating the effectiveness of the method are presented, with the output of the LTV system corrupted by additive noise.
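To make the idea of time-dependent parameter estimation concrete, the following is a minimal sketch (not the paper's MLS/Durbin procedure) of one common way to fit a time-varying AR model: expand the time-dependent coefficient in a basis of functions of time and solve an ordinary least-squares problem for the basis weights. The process, the coefficient trajectory a(n) = 0.3 + 0.4·(n/N), and the linear basis are all hypothetical choices made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20000
t = np.arange(N) / N  # normalized time in [0, 1)

# Hypothetical slowly varying AR(1) coefficient a(t) = 0.3 + 0.4*t
a_true = 0.3 + 0.4 * t

# Simulate the time-varying AR(1) process y[n] = a(n)*y[n-1] + e[n]
e = rng.standard_normal(N)
y = np.zeros(N)
for n in range(1, N):
    y[n] = a_true[n] * y[n - 1] + e[n]

# Basis expansion a(n) ~ c0 + c1*(n/N): the model is linear in (c0, c1),
# so the time-dependent coefficient reduces to an ordinary LS problem.
X = np.column_stack([y[:-1], y[:-1] * t[1:]])
c, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print(c)  # basis weights; should roughly recover (0.3, 0.4)
```

The same basis-expansion device turns more general TDARMA parameter trajectories into finite sets of constants estimable by least-squares-type methods, which is the setting in which MLS and Durbin-style two-stage schemes operate.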