
(b. June 24, 1932)
Bio/Description
An American computer scientist, he is the Edwin Howard Armstrong Professor of Computer Science at Columbia University and External Professor at the Santa Fe Institute. He has held positions at Bell Laboratories, the University of Washington, Carnegie Mellon University, and Columbia, as well as sabbatical positions at Stanford, Berkeley, Princeton, the California Institute of Technology, and the Technical University of Munich. In 1959 he began his work on optimal iteration theory, culminating in his 1964 monograph, which is still in print. Subsequently he pioneered work with Henryk Woźniakowski on computational complexity applied to continuous scientific problems (information-based complexity). He has collaborated in creating significant new algorithms, including the Jenkins-Traub algorithm for polynomial zeros as well as the Kung-Traub, Shaw-Traub, and Brent-Traub algorithms. One of his current research areas is continuous quantum computing. Both his research and his institution-building work have had a major impact on the field of computer science. From 1971 to 1979 he headed the Computer Science Department at Carnegie Mellon and led it from a critical period to eminence. From 1979 to 1989 he was the founding Chair of the Computer Science Department at Columbia. From 1986 to 1992 he served as founding Chair of the Computer Science and Telecommunications Board of the National Academies, and held the post again 2005–2009. He has been founding Editor-in-Chief of the Journal of Complexity since 1985 and continues in that capacity. This was probably the first journal with "complexity" in the sense of computational complexity in its title. Starting with two issues and 285 pages in 1985, the Journal now publishes six issues and nearly 1000 pages. He attended the Bronx High School of Science, where he was captain and first board of the chess team. After graduating from the City College of New York, he entered Columbia in 1954 intending to take a Ph.D. in physics.
In 1955, on the advice of a fellow student, he visited the IBM Watson Research Lab at Columbia. At the time, this was one of the few places in the country where a student could gain access to computers. He found that his aptitude for algorithmic thinking matched perfectly with computers. In 1957 he became a Watson Fellow through Columbia. His thesis was on computational quantum mechanics. His 1959 Ph.D. was in applied mathematics, since computer science degrees were not yet available. In 1959 he joined the Research Division of Bell Laboratories in Murray Hill, NJ. One day a colleague asked him how to compute the solution of a certain problem. He could think of a number of ways to solve the problem. What was the optimal algorithm, that is, the method that would minimize the required computational resources? To his surprise, there was no theory of optimal algorithms; the phrase "computational complexity", denoting the study of the minimal resources required to solve computational problems, was not introduced until 1965. He had the key insight that the optimal algorithm for solving a continuous problem depends on the available information. This would eventually lead to the field of information-based complexity. The first area to which he applied his insight was the solution of nonlinear equations. This research led to the 1964 monograph Iterative Methods for the Solution of Equations, which is still in print. In 1966 he spent a sabbatical at Stanford, where he met a student named Michael Jenkins. Together they created the Jenkins-Traub algorithm for polynomial zeros. This algorithm is still one of the most widely used methods for this problem and is included in many textbooks. In 1970 he became a Professor at the University of Washington, and in 1971 he became Head of the Carnegie Mellon Computer Science Department. The Department was quite small, but its faculty included Gordon Bell, Nico Habermann, Allen Newell, Raj Reddy, Herbert A. Simon, and William Wulf.
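The monograph treats optimal iteration theory in full generality; as a minimal illustration of the kind of iterative root-finding method it analyzes, here is Newton's method in Python (an illustrative sketch chosen for familiarity, not an example taken from the book):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton iteration x_{k+1} = x_k - f(x_k)/f'(x_k).
    Near a simple root it converges quadratically, roughly
    doubling the number of correct digits at each step."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Approximate sqrt(2) as the positive root of f(x) = x^2 - 2
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

Optimal iteration theory asks which method of this general class minimizes the computational work needed to reach a given accuracy, given the information (here, values of f and f') available to the algorithm.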
Just prior to 1971, many faculty had left the Department to take positions elsewhere. Those professors who remained formed a core of world-class scientists recognized as leaders of the discipline. By 1978 the Department had grown to some 50 teaching and research faculty. One of his Ph.D. students was H. T. Kung, now a chaired professor at Harvard. They created the Kung-Traub algorithm for computing the expansion of an algebraic function. They showed that computing the first N terms was no harder than multiplying two Nth-degree polynomials. This problem had been worked on by Isaac Newton, who missed a key point. In 1973 he invited Henryk Woźniakowski, now a tenured professor at both Columbia and the University of Warsaw, Poland, to visit CMU. They pioneered the field of information-based complexity, coauthoring three monographs and numerous papers. In 1978, while on sabbatical at Berkeley, he was recruited by Peter Likins to become founding Chairman of the Computer Science Department at Columbia and Edwin Howard Armstrong Professor of Computer Science. He served as chair from 1979 to 1989. In 1980 he coauthored A General Theory of Optimal Algorithms (Academic Press) with Woźniakowski. This was the first research monograph on information-based complexity. Greg Wasilkowski joined him and Woźniakowski in two more monographs: Information, Uncertainty, Complexity (Addison-Wesley, 1983) and Information-Based Complexity (Academic Press, 1988). In 1986, he was asked by the National Academies to form a computer science board. The original name of the board was the Computer Science and Technology Board (CSTB). Several years later CSTB was asked to also be responsible for telecommunications, so it was renamed the Computer Science and Telecommunications Board, preserving the abbreviation CSTB. The Board deals with critical national issues in computer science and telecommunications. He served as founding Chair 1986–1992 and held the post again 2005–2009.
In 1990 he taught in the summer school of the Santa Fe Institute (SFI). He has since played a variety of roles at SFI. In the nineties he organized a series of workshops on the limits to scientific knowledge, funded by the Alfred P. Sloan Foundation. The goal was to enrich science in the same way that the work of Gödel and Turing on the limits of mathematics enriched that field. There was a series of workshops on limits in various disciplines: physics, economics, and geophysics. Currently he is an External Professor at SFI. Starting in 1991 he has been co-organizer of an international Seminar on "Continuous Algorithms and Complexity" at Schloss Dagstuhl, Germany. The ninth Seminar was held in September 2006. Many of the Seminar talks are on information-based complexity and, more recently, on continuous quantum computing. He was invited by the Accademia Nazionale dei Lincei in Rome, Italy, to present the 1993 Lezioni Lincee. He chose to give the cycle of six lectures at the Scuola Normale in Pisa. He invited Arthur Werschulz to join him in publishing the lectures. The lectures appeared in expanded form in Complexity and Information, Cambridge University Press, 1998. In 1994 he asked a Ph.D. student, Spassimir Paskov, to compare the Monte Carlo method (MC) with the quasi-Monte Carlo method (QMC) when valuing a collateralized mortgage obligation (CMO) he had obtained from Goldman Sachs. This involved the numerical approximation of a number of integrals in 360 dimensions. To the surprise of the research group, Paskov reported that QMC always beat MC for this problem. People in finance had always used MC for such problems, and the experts in number theory believed QMC should not be used for integrals of dimension greater than 12. He and Paskov reported their results to a number of Wall Street firms, to considerable initial skepticism.
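The MC versus QMC comparison can be sketched in a few lines of Python. The CMO integrands themselves are proprietary, so this sketch uses a simple smooth test integrand on the unit cube whose exact value is known, with QMC points drawn from a Halton low-discrepancy sequence (one standard choice of QMC point set; the dimension, sample size, and integrand here are illustrative assumptions, not the original experiment):

```python
import random

PRIMES = [2, 3, 5, 7, 11]  # one coprime base per dimension

def van_der_corput(index, base):
    """index-th term of the van der Corput sequence in the given base, in [0, 1)."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def halton(index, dim):
    """index-th point of the Halton low-discrepancy sequence in [0, 1)^dim."""
    return [van_der_corput(index, PRIMES[d]) for d in range(dim)]

def integrand(x):
    """Smooth test integrand on the unit cube; its exact integral is 1."""
    p = 1.0
    for xi in x:
        p *= 2.0 * xi
    return p

def estimate(points):
    """Sample-average approximation of the integral."""
    return sum(integrand(x) for x in points) / len(points)

dim, n = 5, 4096
random.seed(0)
mc_points = [[random.random() for _ in range(dim)] for _ in range(n)]  # MC: pseudorandom
qmc_points = [halton(i + 1, dim) for i in range(n)]                    # QMC: low-discrepancy

mc_error = abs(estimate(mc_points) - 1.0)
qmc_error = abs(estimate(qmc_points) - 1.0)
```

The difference is in the point sets: MC error shrinks like 1/sqrt(n) regardless of dimension, while QMC points are placed to fill the cube evenly, which for sufficiently well-behaved integrands, such as those Paskov found in the CMO computation, can give markedly faster convergence even in hundreds of dimensions.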
They first published the results in "Faster Evaluation of Financial Derivatives" (Paskov and Traub), Journal of Portfolio Management 22, 1995, 113–120. The theory and software were greatly improved by Anargyros Papageorgiou. Today QMC is widely used in the financial sector to value financial derivatives. QMC is not a panacea for all high-dimensional integrals; research is continuing on the characterization of problems for which QMC is superior to MC. In 1999 he received the Mayor's Medal for Science and Technology. Decisions regarding this award are made by the New York Academy of Sciences. The medal was awarded by Mayor Rudy Giuliani in a ceremony at Gracie Mansion, the home of New York City's mayor. Moore's law is an empirical observation that the number of features on a chip doubles roughly every 18 months. This has held since the early 1960s and is responsible for the computer and telecommunications revolution. It is widely believed that Moore's law will cease to hold in 10–15 years using silicon technology. There is therefore interest in creating new technologies. One candidate is quantum computing, that is, building a computer using the principles of quantum mechanics. He and his colleagues decided to work on continuous quantum computing. The motivation is that most problems in physical science, engineering, and mathematical finance have continuous mathematical models. In 2005 he donated some 100 boxes of archival material to the Carnegie Mellon University Library. This collection is being digitized. The U.S. patents US5940810 and US0605837 were issued to Traub et al. for the FinDer software system and were assigned to Columbia University. These patents cover an application of a well-known technique (low-discrepancy sequences) to a well-known problem (the valuation of securities).
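The arithmetic behind the 18-month doubling can be made concrete with a one-line growth model (a simple illustration of the exponential claim, not a statement from Moore's original paper):

```python
def moore_factor(years, doubling_period=1.5):
    """Growth factor implied by one doubling every `doubling_period` years
    (1.5 years = 18 months)."""
    return 2.0 ** (years / doubling_period)

# Ten doublings fit in 15 years, so feature counts grow about
# a thousandfold per 15-year span under this model.
growth_15yr = moore_factor(15)
```

Compounding is what makes the observation so consequential: a modest-sounding doubling rate yields three orders of magnitude per 15 years.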
Besides those already mentioned, he is the recipient of numerous honors and distinctions, including: Member, National Academy of Engineering, 1985; Sherman Fairchild Distinguished Scholar, California Institute of Technology, 1991–92; Distinguished Senior Scientist Award, Alexander von Humboldt Foundation, 1992, 1998; Member, Scientific Council, Institut de Recherche en Informatique, Paris, France, 1976–1980; First Prize, Ministry of Education, Poland, for the research monograph Information-Based Complexity, 1989; 1991 Emanuel R. Piore Medal, IEEE; 1992 Distinguished Service Award, Computing Research Association; Fellow: American Association for the Advancement of Science, 1971, ACM, 1994, New York Academy of Sciences, 1999, and American Mathematical Society, 2012; Festschrifts for Joseph F. Traub, Academic Press, 1993, and Elsevier, 2004; and Honorary Doctorate of Science, University of Central Florida, 2001. He is the author or editor of ten monographs and some 120 papers in computer science, mathematics, physics, finance, and economics, including: Iterative Methods for the Solution of Equations, Prentice Hall, 1964 (reissued Chelsea Publishing Company, 1982; Russian translation MIR, 1985; reissued American Mathematical Society, 1998); Algorithms and Complexity: New Directions and Recent Results (editor), Academic Press, 1976; Information-Based Complexity, Academic Press, 1988 (with G. Wasilkowski and H. Woźniakowski); and Complexity and Information, Cambridge University Press, 1998 (with A. G. Werschulz; Japanese translation, 2000). He is the author or coauthor of many papers, just a few of which are: "Variational Calculations of the 2³P State of Helium", Phys. Rev. 116, 1959, 914–919; "The Future of Scientific Journals", Science 158, 1966, 1153–1159 (with W. S. Brown and J. R.
Pierce); "Computational Complexity of Iterative Processes", SIAM Journal on Computing 1, 1972, 167–179; "Parallel Algorithms and Parallel Computational Complexity", Proceedings IFIP Congress, 1974, 685–687; "On the Complexity of Composition and Generalized Composition of Power Series", SIAM Journal on Computing 9, 1980, 54–66 (with R. Brent); "Information-Based Complexity", Nature 327, July 1987, 29–33 (with E. Packel); and "Path Integration on a Quantum Computer", Quantum Information Processing, 2003, 365–388 (with H. Woźniakowski). He lives in Manhattan and Santa Fe with his wife, the noted author Pamela McCorduck, whose books include Machines Who Think, The Fifth Generation, The Universal Machine, Aaron's Code, and The Futures of Women. He has two daughters, Claudia Traub-Cooper and Hillary Spector.

Date of Birth:
Jun 24, 1932 
Gender:
Male 
Noted For:
Co-pioneer of information-based computational complexity; collaborator in creating significant new algorithms
Category of Achievement:

More Info: