
Woodslane Online Catalogues

ISBN: 9780898716610

Linear and Nonlinear Optimization

Description
Introduces the applications, theory, and algorithms of linear and nonlinear optimization, with an emphasis on the practical aspects of the material. Its unique modular structure provides flexibility to accommodate the varying needs of instructors, students, and practitioners with different levels of sophistication in these topics. The succinct style of this second edition is punctuated with numerous real-life examples and exercises, and the authors include accessible explanations of topics that are not often mentioned in textbooks, such as duality in nonlinear optimization, primal-dual methods for nonlinear optimization, filter methods, and applications such as support-vector machines. Part I provides fundamentals that can be taught in whole or in part at the beginning of a course on either topic and then referred to as needed. Part II on linear programming and Part III on unconstrained optimization can be used together or separately, and Part IV on nonlinear optimization can be taught without having studied the material in Part II. In the preface the authors suggest course outlines that can be adjusted to the requirements of a particular course on both linear and nonlinear optimization, or to separate courses on these topics. Three appendices provide information on linear algebra, other fundamentals, and software packages for optimization problems. A supplemental website offers auxiliary data sets that are necessary for some of the exercises.
Author Biography

Igor Griva received B.Sc. and M.S. degrees in applied mathematics from Moscow State University, Russia, in 1993 and 1994, and a Ph.D. in information technology from George Mason University in 2002, where he is now an Assistant Professor of Computational Sciences and Mathematics in the College of Science. Before joining George Mason University, he was a research associate in the Department of Financial Engineering and Operations Research at Princeton University. His research focuses on the theory and methods of nonlinear optimization and their application to problems in science and engineering.

Stephen Nash received a B.Sc. (Honors) degree in mathematics from the University of Alberta, Canada, in 1977, and a Ph.D. in computer science from Stanford University in 1982. He is the Program Director for the Operations Research program at the National Science Foundation, on leave from George Mason University, where he is a Professor of Systems Engineering and Operations Research in the Volgenau School of Information Technology and Engineering. Before joining George Mason University, he taught at The Johns Hopkins University, and he has held professional associations with the National Institute of Standards and Technology and Argonne National Laboratory. His research centers on scientific computing, especially nonlinear optimization, with related interests in statistical computing and optimal control. He has served on the editorial boards of Computers in Science & Engineering, the SIAM Journal on Scientific Computing, Operations Research, and the Journal of the American Statistical Association.

Ariela Sofer received a B.Sc. in mathematics and an M.Sc. in operations research from the Technion, Israel, and a D.Sc. in operations research from the George Washington University in 1984. She is Professor and Chair of the Systems Engineering and Operations Research Department at George Mason University. Her major areas of interest are nonlinear optimization and optimization in biomedical applications. She has served on the editorial boards of Operations Research and Management Science, and is co-editor of a subseries of the Annals of Operations Research on Operations Research in Medicine.
Table of Contents

Preface
Part I: Basics
Chapter 1: Optimization Models
Chapter 2: Fundamentals of Optimization
Chapter 3: Representation of Linear Constraints
Part II: Linear Programming
Chapter 4: Geometry of Linear Programming
Chapter 5: The Simplex Method
Chapter 6: Duality and Sensitivity
Chapter 7: Enhancements of the Simplex Method
Chapter 8: Network Problems
Chapter 9: Computational Complexity of Linear Programming
Chapter 10: Interior-Point Methods of Linear Programming
Part III: Unconstrained Optimization
Chapter 11: Basics of Unconstrained Optimization
Chapter 12: Methods for Unconstrained Optimization
Chapter 13: Low-Storage Methods for Unconstrained Problems
Part IV: Nonlinear Optimization
Chapter 14: Optimality Conditions for Constrained Problems
Chapter 15: Feasible-Point Methods
Chapter 16: Penalty and Barrier Methods
Part V: Appendices
Appendix A: Topics from Linear Algebra
Appendix B: Other Fundamentals
Appendix C: Software
Bibliography
Index