Table of Contents
1. Background and Preview.
2. Highlights of Classical Control Theory.
3. State Variables and the State Space Description of Dynamic Systems.
4. Fundamentals of Matrix Algebra.
5. Vectors and Linear Vector Spaces.
6. Simultaneous Linear Equations.
7. Eigenvalues and Eigenvectors.
8. Functions of Square Matrices and the Cayley-Hamilton Theorem.
9. Analysis of Continuous and Discrete Time State Equations.
10. Stability.
11. Controllability and Observability for Linear Systems.
12. The Relationship between State Variable and Transfer Function Descriptions of Systems.
13. Design of Linear Feedback Control Systems.
14. An Introduction to Optimal Control Theory.
15. An Introduction to Nonlinear Control Systems.
Synopsis
A practical text/reference on modern control applications in electrical, mechanical, and aerospace engineering. Brogan's revision of this text briefly reviews modelling and classical linear control in the transform domain, then develops the linear algebra and matrix theory needed for state variable analysis. It also examines dynamical systems and their fundamental properties, and presents design methods based on pole placement and observers along with an introduction to optimal control theory.