Friday, August 15, 2008



One of the many uses of artificial neural networks is nonlinear black-box dynamic modelling: predicting the response of a dynamic system to its inputs. A number of researchers have attempted this using gradient methods to select the neural network parameters, with varying degrees of success. However, most will tell you that the most difficult (and accuracy-affecting) step is approximating the derivatives. For example, Backpropagation Through Time uses an algebraic "unrolling" of the IIR filter's derivative into an approximate FIR form, but the algebra quickly gets out of hand for systems with long impulse responses.

An easier (and arguably better) approach is to use a derivative-free optimization method. My favorite is the Complex Method (matlab code), a simplified relative of the Nelder-Mead Simplex Method. It requires only the ability to calculate a "fitness" for each set of parameters (network weights), which is then maximized. In dynamic system modelling we want to minimize the error between the neural network's estimate of the system response to a given input and the real system's measured response, so we use the negative of that error as the fitness. The fitness of a number of parameter sets is calculated; the worst set is discarded, and a new set of parameters is generated based on the fitness of the remaining points. This process is repeated for a number of generations until the desired accuracy is reached.
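To make the reflect-worst-through-centroid idea concrete, here is a minimal sketch of the Complex Method in Python with NumPy rather than the Matlab of the linked code. The function signature, bounds handling, and halving contraction rule are my own illustrative choices, not taken from the paper:

```python
import numpy as np

def complex_method(fitness, lower, upper, n_points=None, alpha=1.3,
                   max_iter=500, tol=1e-8, rng=None):
    """Maximize `fitness` within box bounds using Box's Complex Method.

    A minimal sketch: start with a random population of parameter sets,
    repeatedly reflect the worst point through the centroid of the rest,
    and contract toward the centroid when the reflection is still worst.
    """
    rng = rng if isinstance(rng, np.random.Generator) else np.random.default_rng(rng)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    dim = lower.size
    n_points = n_points or 2 * dim  # Box suggested using more than dim + 1 points

    # Random initial population of parameter sets inside the bounds
    pts = lower + rng.random((n_points, dim)) * (upper - lower)
    fit = np.array([fitness(p) for p in pts])

    for _ in range(max_iter):
        if fit.max() - fit.min() < tol:  # population has converged
            break
        worst = np.argmin(fit)
        # Centroid of all points except the worst
        centroid = (pts.sum(axis=0) - pts[worst]) / (n_points - 1)
        # Reflect the worst point through the centroid, over-reflecting by alpha
        trial = np.clip(centroid + alpha * (centroid - pts[worst]), lower, upper)
        f_trial = fitness(trial)
        # While the trial is still the worst, contract halfway toward the centroid
        while f_trial < fit.min():
            trial = (trial + centroid) / 2.0
            f_trial = fitness(trial)
            if np.allclose(trial, centroid):
                break
        pts[worst], fit[worst] = trial, f_trial

    best = np.argmax(fit)
    return pts[best], fit[best]
```

For dynamic system modelling, `fitness` would simulate the network on the training input and return the negative of the error between its output and the measured response, so that maximizing fitness minimizes the error.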

We (myself and University of Saskatchewan colleagues Rich Burton, Doug Bitner and Greg Schoenau) have put together a paper showing how this works, to be presented at the Bath Symposium on Power Transmission and Motion Control. This paper shows the performance of the complex method when applied to the modelling of real data from a load-sensing hydraulic pump. We found the complex method to be more accurate and much faster than a previous method used on the same data.

You can get a preprint of the paper here, and the Matlab code here. As always, comments are appreciated.
