This collection of historically and technically important papers follows a logical line of development from early work in mathematical control theory to studies in adaptive control processes. The book touches upon all the major themes: stability theory, feedback control, time lag, prediction theory, dynamic programming, "bang-bang" control, and maximum principles.
The book opens with J. C. Maxwell's "On Governors" and continues with "The Control of an Elastic Fluid" by H. Bateman; an essay by editors Bellman and Kalaba, "The Work of Lyapunov and Poincaré"; Hurwitz's "On the Conditions Under Which an Equation Has Only Roots With Negative Real Parts"; Nyquist's "Regeneration Theory"; "Feedback -- The History of an Idea" by H. W. Bode; a paper on forced oscillations in a circuit by B. van der Pol; "Self-excited Oscillations in Dynamical Systems Possessing Retarded Action" by N. Minorsky; "An Extension of Wiener's Theory of Prediction" by Zadeh and Ragazzini; "Time Optimal Control Systems" by J. P. LaSalle; "On the Theory of Optimal Processes" by Boltyanskii, Gamkrelidze, and Pontryagin; Bellman's "On the Application of the Theory of Dynamic Programming to the Study of Control Processes"; and the editors' study "Dynamic Programming and Adaptive Processes: Mathematical Foundation." Each paper is introduced with a brief account of its significance and with some suggestions for further reading.