A comprehensive treatment of stochastic systems, beginning with the foundations of probability and ending with stochastic optimal control. The book is divided into three interrelated topics. First, the concepts of probability theory, random variables, and stochastic processes are presented, leading naturally to expectation, conditional expectation, discrete-time estimation, and the Kalman filter. With this background, stochastic calculus and continuous-time estimation are introduced. Finally, dynamic programming for both discrete-time and continuous-time systems leads to the solution of optimal stochastic control problems, resulting in controllers with significant practical application. This book will be valuable to first-year graduate students studying systems and control, as well as professionals in the field.