Inference techniques for stochastic nonlinear system identification with application to the Wiener-Hammerstein models

Abstract: Stochastic nonlinear systems are a class of nonlinear systems in which unknown disturbances affect the system's output through a nonlinear transformation. In general, the identification of parametric models for this kind of system can be very challenging. A central statistical inference technique for parameter estimation is the Maximum Likelihood estimator. The key object of this technique is the likelihood function, i.e. a mathematical expression describing the probability of obtaining certain observations for given values of the parameters. For many stochastic nonlinear systems, however, the likelihood function is not available in closed form. Several methods, mainly based on the Monte Carlo method, have been developed to obtain approximate solutions to the Maximum Likelihood problem. One of their main difficulties is that they can be computationally expensive, especially when combined with numerical optimization techniques for likelihood maximisation.

This thesis can be divided into three parts. In the first part, a background on the main statistical techniques for parameter estimation is presented. In particular, two iterative methods for finding the Maximum Likelihood estimator are introduced: the gradient-based and the Expectation-Maximisation algorithms.

In the second part, the main Monte Carlo methods for approximating the Maximum Likelihood problem are analysed, together with their combination with gradient-based and Expectation-Maximisation algorithms. To ensure convergence, these algorithms require an enormous Monte Carlo effort, i.e. a very large number of random samples to build the Monte Carlo estimates. In order to reduce this effort and make the algorithms usable in practice, iterative solutions alternating local Monte Carlo approximations and maximisation steps are derived. In particular, a procedure implementing an efficient simulation of samples across the steps of a Newton's method is developed. The procedure is based on the sensitivity of the parameter search with respect to the Monte Carlo samples, and it results in an accurate and fast algorithm for solving the Maximum Likelihood estimation problem.

The considered Maximum Likelihood estimation methods proceed through local explorations of the parameter space; hence, they are guaranteed to converge only to a local optimizer of the likelihood function. In the third part of the thesis, this issue is addressed by deriving initialization algorithms whose purpose is to generate initial guesses that increase the chances of converging to the global maximum. In particular, initialization algorithms are derived for the Wiener-Hammerstein model, i.e. a nonlinear model where a static nonlinearity is sandwiched between two linear parts. For this type of model, it can be proved that the best linear approximation of the system provides a consistent estimate of the two linear parts. This estimate is then used to initialize a Maximum Likelihood estimation problem in all the model parameters.
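
To give a concrete flavour of the Monte Carlo approach discussed in the second part, the sketch below approximates an intractable likelihood by averaging over simulated disturbances and then maximises the approximation numerically. The toy model, the parameter values, and the use of common random numbers across evaluations are illustrative assumptions of this sketch and are not taken from the thesis.

```python
# A minimal sketch (not the thesis's algorithm) of Monte Carlo Maximum
# Likelihood. Toy model: y_t = (b*u_t + w_t)^3 + e_t, where the disturbance
# w_t ~ N(0, s_w^2) enters before the nonlinearity, so p(y_t | u_t; b) has
# no closed form and is approximated by averaging over M simulated draws.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
N, M = 200, 500                     # data length, Monte Carlo samples
b_true, s_w, s_e = 2.0, 0.5, 0.1    # illustrative values (s_w, s_e assumed known)
u = rng.uniform(-1.0, 1.0, size=N)
y = (b_true * u + s_w * rng.standard_normal(N)) ** 3 + s_e * rng.standard_normal(N)

# Common random numbers: the same disturbance draws are reused at every
# evaluation, so the Monte Carlo log-likelihood is a smooth, deterministic
# function of b and a standard numerical optimiser can be applied to it.
w_fixed = s_w * rng.standard_normal((M, 1))

def neg_mc_loglik(theta):
    b = theta[0]
    mean = (b * u[None, :] + w_fixed) ** 3             # shape (M, N)
    p_y = norm.pdf(y[None, :], loc=mean, scale=s_e)    # p(y_t | w_m, b)
    return -np.sum(np.log(p_y.mean(axis=0) + 1e-300))  # average over w, sum over t

res = minimize(neg_mc_loglik, x0=[1.0], method="Nelder-Mead")
print("Monte Carlo ML estimate of b:", res.x[0])
```

Increasing M reduces the variance of the approximation but makes each evaluation more expensive, which is the trade-off the abstract refers to.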
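The third part concerns the Wiener-Hammerstein structure and its initialization through the best linear approximation (BLA). The sketch below only illustrates the model structure and a crude FIR estimate of the BLA for a Gaussian input; the filter coefficients, the nonlinearity and the estimation method are illustrative choices, not those used in the thesis.

```python
# A minimal sketch of the Wiener-Hammerstein structure: a static nonlinearity
# sandwiched between two linear dynamic blocks, with an additive output
# disturbance. All numerical choices below are illustrative.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(1)
N = 5000
u = rng.standard_normal(N)                  # Gaussian input signal

x = lfilter([0.5], [1.0, -0.8], u)          # first linear block:  x = G1(q) u
z = np.tanh(3.0 * x)                        # static nonlinearity: z = f(x)
y0 = lfilter([1.0, 0.3], [1.0, -0.5], z)    # second linear block: y0 = G2(q) z
y = y0 + 0.05 * rng.standard_normal(N)      # measured output with disturbance

# For a Gaussian input, the best linear approximation of the system equals,
# up to a gain, the cascade G2(q)G1(q) of the two linear parts; a crude FIR
# least-squares fit from u to y gives a first impression of this estimate.
L = 30
Phi = np.column_stack([np.roll(u, k) for k in range(L)])
Phi[:L, :] = 0.0                            # discard samples affected by wrap-around
g_bla = np.linalg.lstsq(Phi, y, rcond=None)[0]
print("Leading impulse-response coefficients of the estimated BLA:", g_bla[:5])
```

A common way to exploit such an estimate, broadly in line with the strategy described above, is to split its dynamics between the two linear blocks and use the result, together with an initial guess of the static nonlinearity, as the starting point of the full Maximum Likelihood estimation problem.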
