Optimal Control Systems (Electrical Engineering Handbook)
164,40 €
Available to order
Delivery time: 2-4 weeks
Product code: 9780849308925
Description:
The theory of optimal control systems has grown and flourished since the 1960s. Many texts, written at varying levels of sophistication, have been published on the subject. Yet even those purportedly designed for beginners in the field are often riddled with complex theorems, and many treatments fail to include topics that are essential to a thorough grounding in the various aspects of and approaches to optimal control.

'Optimal Control Systems' provides a comprehensive but accessible treatment of the subject, with just the right degree of mathematical rigor to be complete yet practical. It builds a solid bridge between 'traditional' optimization using the calculus of variations and what is called 'modern' optimal control, and it treats both continuous-time and discrete-time optimal control systems, giving students a firm grasp of both methods.

Among the book's most outstanding features is a summary table that accompanies each topic or problem, including a statement of the problem with a step-by-step solution. Students also gain valuable experience with industry-standard MATLAB and SIMULINK software, including the Control System and Symbolic Math Toolboxes. Diverse applications in fields from power engineering to medicine make a foundation in optimal control systems an essential part of an engineer's background. This clear, streamlined presentation is ideal for a graduate-level course on control systems and as a quick reference for working engineers.
Table of Contents:
INTRODUCTION: Classical and Modern Control; Optimization; Optimal Control; Historical Tour; About This Book; Chapter Overview; Problems
CALCULUS OF VARIATIONS AND OPTIMAL CONTROL: Basic Concepts; Optimum of a Function and a Functional; The Basic Variational Problem; The Second Variation; Extrema of Functions with Conditions; Extrema of Functionals with Conditions; Variational Approach to Optimal Systems; Summary of Variational Approach; Problems
LINEAR QUADRATIC OPTIMAL CONTROL SYSTEMS I: Problem Formulation; Finite-Time Linear Quadratic Regulator; Analytical Solution to the Matrix Differential Riccati Equation; Infinite-Time LQR System I; Infinite-Time LQR System II; Problems
LINEAR QUADRATIC OPTIMAL CONTROL SYSTEMS II: Linear Quadratic Tracking System: Finite-Time Case; LQT System: Infinite-Time Case; Fixed-End-Point Regulator System; Frequency-Domain Interpretation; Problems
DISCRETE-TIME OPTIMAL CONTROL SYSTEMS: Variational Calculus for Discrete-Time Systems; Discrete-Time Optimal Control Systems; Discrete-Time Linear State Regulator Systems; Steady-State Regulator System; Discrete-Time Linear Quadratic Tracking System; Frequency-Domain Interpretation; Problems
PONTRYAGIN MINIMUM PRINCIPLE: Constrained Systems; Pontryagin Minimum Principle; Dynamic Programming; The Hamilton-Jacobi-Bellman Equation; LQR System using H-J-B Equation
CONSTRAINED OPTIMAL CONTROL SYSTEMS: Constrained Optimal Control; TOC of a Double Integral System; Fuel-Optimal Control Systems; Minimum Fuel System: LTI System; Energy-Optimal Control Systems; Optimal Control Systems with State Constraints; Problems
APPENDICES: Vectors and Matrices; State Space Analysis; MATLAB Files
REFERENCES
INDEX
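The contents devote two chapters to linear quadratic regulation and the matrix Riccati equation, topics the book also exercises through MATLAB's Control System Toolbox. As a rough illustration of the infinite-time LQR design step only (not an example taken from the book, and using Python/SciPy rather than MATLAB), the sketch below solves the algebraic Riccati equation for a made-up double-integrator plant:

```python
# Minimal sketch of an infinite-horizon continuous-time LQR design.
# The plant (A, B) and weights (Q, R) are illustrative placeholders,
# not taken from the book; the book works equivalent steps in MATLAB.
import numpy as np
from scipy.linalg import solve_continuous_are

# Double-integrator plant: x1' = x2, x2' = u
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Quadratic cost J = integral of (x'Qx + u'Ru) dt
Q = np.diag([1.0, 0.1])
R = np.array([[1.0]])

# Solve the algebraic Riccati equation  A'P + PA - PB R^{-1} B'P + Q = 0
P = solve_continuous_are(A, B, Q, R)

# Optimal state-feedback gain  u = -Kx,  with  K = R^{-1} B' P
K = np.linalg.solve(R, B.T @ P)
print("Riccati solution P:\n", P)
print("LQR gain K:", K)

# Closed-loop poles of A - BK should lie in the open left half-plane
print("Closed-loop poles:", np.linalg.eigvals(A - B @ K))
```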
Author | Naidu, Desineni Subbaram |
---|---|
Publication date | 2002 |
Publisher | Taylor & Francis Inc |
Binding | Hardcover |
Bestseller | No |
Number of pages | 464 |
Length | 234 |
Width | 234 |
Language | American English |