In many areas of science and engineering, it is fundamental to model large-scale complex systems with dynamic properties and then to estimate or control their behavior. Moreover, to implement estimation and control algorithms on computers, proper knowledge of dynamical systems and their characteristics is essential. From this viewpoint, this lecture focuses on the interaction between information and systems and studies the topic of advanced dynamical systems. In particular, the topics covered include the following: linear systems and their state-space equations, stability, controllability and observability, feedback control, linear-quadratic (LQ) optimal control, introduction to stochastic processes, colored noise, power spectra, Kalman filters, particle filters, detailed balance, and Metropolis methods.
Goal of this lecture: The aim of this lecture is to learn linear systems in the time domain at an advanced level. More specifically, after acquiring system characterizations based on state-space methods and control techniques, we introduce stochastic processes, which are critical for modeling systems under the influence of noise in the real world. For this purpose, we present Kalman filtering, the basic approach for estimation that removes noise effects, and furthermore Markov chain Monte Carlo methods, which are important for simulating random phenomena on computers. Through exercises and homework assignments, the goal is to be able to implement these algorithms in software such as Matlab.
Organization: The lecture consists of two parts. In the first half, students learn the basics of linear systems, their analysis, and control systems design via state-space methods. In the latter half, students gain basic knowledge of the formulation and analysis of dynamical systems under random behavior, and then study Kalman filters and Markov chain Monte Carlo methods.
Linear systems; State-space representations; Stability of systems; Controllability and observability; Control systems design; Random processes; Particle filters; Detailed balance; Metropolis methods
|✔ Specialist skills||Intercultural skills||Communication skills||Critical thinking skills||✔ Practical and/or problem-solving skills|
Each lecture will be based on lecture notes and slides. Each week there will be exercise reports to check understanding of the material covered in the lectures. Students will be asked to run simulations using numerical analysis software (e.g., Matlab).
|Course schedule||Required learning|
|Class 1||Overview of the Lecture on Continuous Systems, Preparation for Matlab exercises||Introduction to the modeling of continuous systems. Review on linear differential equations.|
|Class 2||Overview of the Lecture, Introduction to Modern Control (I) (State-space equation and coordinate transformation)||Derive state space equations from higher order differential equations.|
|Class 3||Introduction to Modern Control (II) (Stability)||Understand the notion of stability for linear systems of arbitrary order.|
|Class 4||Controllability and Observability||Analyze linear systems and determine controllability and observability.|
|Class 5||Control Systems Design (State feedback and pole placement)||Design state feedback gains via pole placement and understand the influence of poles on system performance.|
|Class 6||Control Systems Design (II) (LQ optimal control, servo systems)||Carry out controller design via optimal control and understand the principles of servo system design.|
|Class 7||State Estimation (Observer), Control Systems Design (III) (Output feedback)||Understand state estimation techniques based on observers. Design controllers based on output feedback.|
|Class 8||Overview of the Lecture, Introduction to Stochastic Processes (Intuitive understanding and formalization of stochastic processes)||Review the course contents. Gain an intuitive understanding of stochastic processes through a simple example. Present the formal definition of stochastic processes.|
|Class 9||Basics of Stochastic Processes (Convergence of the Markov chain, Ergodicity)||Understand the conditions for convergence of Markov chains and the concept of ergodicity.|
|Class 10||Properties of Linear Stochastic Systems Ornstein–Uhlenbeck process, Correlation function, White noise, Colored noise, Power spectrum||Analyze the one-dimensional Ornstein–Uhlenbeck process. Derive the power spectrum of colored noise.|
|Class 11||Kalman Filter and Particle Filter I State space model, Bayes inference, Particle Filter, Exercises with Matlab||Derive a general recursive form of Bayes’ rule. Understand the particle filter. Implement the algorithms with Matlab.|
|Class 12||Kalman Filter and Particle Filter II Kalman Filter, Exercises with Matlab||Understand the Kalman Filter. Implement the algorithms with Matlab.|
|Class 13||Markov Chain Monte Carlo I (Detailed balance and stationary distribution, Metropolis method, Exercises with Matlab)||Prove the convergence of a Markov chain satisfying the detailed balance condition. Understand the Metropolis method. Implement the algorithms with Matlab.|
|Class 14||Markov Chain Monte Carlo II Metropolis-Hastings method, Gibbs sampler, Exercises with Matlab||Understand the Metropolis-Hastings and Gibbs-sampler methods. Implement the algorithms with Matlab.|
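As a small preview of the state-space material in Classes 1 and 2, a second-order differential equation can be rewritten as a first-order state-space system and simulated numerically. The sketch below uses Python rather than the Matlab used in class, and the parameter values (omega, zeta, the forward-Euler step) are illustrative assumptions:

```python
def simulate_second_order(u, omega=1.0, zeta=0.7, dt=0.01, steps=2000):
    """Simulate x'' + 2*zeta*omega*x' + omega^2*x = u via the state-space
    form z' = A z + B u with state z = [x, x'], integrated by forward Euler."""
    # A = [[0, 1], [-omega^2, -2*zeta*omega]],  B = [0, 1]^T
    x, v = 0.0, 0.0
    trajectory = []
    for _ in range(steps):
        ax = v                                       # x' = v
        av = -omega**2 * x - 2 * zeta * omega * v + u  # v' from the ODE
        x, v = x + dt * ax, v + dt * av
        trajectory.append(x)
    return trajectory

# Step response: constant input u = 1 drives x toward u / omega^2 = 1
traj = simulate_second_order(u=1.0)
```

Writing the dynamics as z' = Az + Bu is exactly the reduction from a higher-order differential equation to a first-order vector equation that Class 2 covers.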
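To give a flavor of the Kalman filter of Class 12, a minimal scalar version can be sketched as follows. This is a Python sketch (the course exercises use Matlab), and all model parameters (a, c, q, r, the noise level) are illustrative assumptions, not values from the lecture:

```python
import random

def kalman_1d(measurements, a=1.0, c=1.0, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for x[k+1] = a*x[k] + w,  y[k] = c*x[k] + v,
    where w, v are white noises with variances q and r."""
    x, p = x0, p0
    estimates = []
    for y in measurements:
        # Predict step: propagate state and error variance
        x = a * x
        p = a * p * a + q
        # Update step: correct with measurement y
        k = p * c / (c * p * c + r)   # Kalman gain
        x = x + k * (y - c * x)
        p = (1 - k * c) * p
        estimates.append(x)
    return estimates

# Noisy measurements of a constant true state x = 1.0
random.seed(0)
ys = [1.0 + random.gauss(0, 0.5) for _ in range(200)]
est = kalman_1d(ys)
```

The predict/update structure is the recursive form of Bayes' rule discussed in Class 11, specialized to linear-Gaussian models.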
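The Metropolis method of Class 13 can likewise be illustrated with a short random-walk sampler. The target density (a standard normal) and the proposal width are arbitrary choices for this Python sketch; the course exercises use Matlab:

```python
import math
import random

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler. Detailed balance holds because the
    Gaussian proposal is symmetric and cancels in the acceptance ratio."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0, step)
        # Accept with probability min(1, pi(proposal) / pi(x))
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal, pi(x) proportional to exp(-x^2 / 2)
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
```

Because the chain satisfies detailed balance with respect to the target density, its empirical mean and variance approach those of the standard normal, which is the convergence result proved in Class 13.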
To enhance effective learning, students are encouraged to spend approximately 100 minutes preparing for class and another 100 minutes reviewing class content afterwards (including assignments) for each class.
They should do so by referring to textbooks and other course material.
Specified during the lecture
・Chi-Tsong Chen, Linear System Theory and Design, Oxford University Press; 4th edition, ISBN: 978-0199959570
・Dani Gamerman, Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference (Texts in Statistical Science), Chapman & Hall, ISBN: 978-0412818202
・Christopher Bishop, Pattern Recognition and Machine Learning, Springer, ISBN: 978-0387310732
Based on exercise reports (30%) and the final exam (70%)
It is desirable that students have basic knowledge of differential equations, probability, and statistics.