Probability and Computing (2nd Ed., Revised edition)
Randomization and Probabilistic Techniques in Algorithms and Data Analysis

Authors: Michael Mitzenmacher, Eli Upfal

This greatly expanded new edition offers a comprehensive introduction to randomization and probabilistic techniques in modern computer science.

Language: English


Approximate price: 64.99 €

In Print (Delivery period: 14 days).

Publication date:
484 p. · 18.2x26 cm · Hardback
Greatly expanded, this new edition requires only an elementary background in discrete mathematics and offers a comprehensive introduction to the role of randomization and probabilistic techniques in modern computer science. Newly added chapters and sections cover topics including normal distributions, sample complexity, VC dimension, Rademacher complexity, power laws and related distributions, cuckoo hashing, and the Lovász Local Lemma. Material relevant to machine learning and big data analysis enables students to learn modern techniques and applications. Among the many new exercises and examples are programming-related exercises that provide students with excellent training in solving relevant problems. This book provides an indispensable teaching tool to accompany a one- or two-semester course for advanced undergraduate students in computer science and applied mathematics.
Contents:
1. Events and probability
2. Discrete random variables and expectations
3. Moments and deviations
4. Chernoff and Hoeffding bounds
5. Balls, bins, and random graphs
6. The probabilistic method
7. Markov chains and random walks
8. Continuous distributions and the Poisson process
9. The normal distribution
10. Entropy, randomness, and information
11. The Monte Carlo method
12. Coupling of Markov chains
13. Martingales
14. Sample complexity, VC dimension, and Rademacher complexity
15. Pairwise independence and universal hash functions
16. Power laws and related distributions
17. Balanced allocations and cuckoo hashing
Michael Mitzenmacher is a Professor of Computer Science in the School of Engineering and Applied Sciences at Harvard University, Massachusetts. Professor Mitzenmacher has authored or co-authored over 200 conference and journal publications on a variety of topics, including algorithms for the internet, efficient hash-based data structures, erasure and error-correcting codes, power laws, and compression. His work on low-density parity-check codes shared the 2002 IEEE Information Theory Society Best Paper Award and won the 2009 ACM SIGCOMM Test of Time Award. He was elected as the Chair of the ACM Special Interest Group on Algorithms and Computation Theory in 2015.
Eli Upfal is a Professor of Computer Science at Brown University, where he also served as department chair from 2002 to 2007. Prior to joining Brown in 1998, he was a researcher and project manager at the IBM Almaden Research Center and a professor at the Weizmann Institute of Science, Israel. His main research interests are randomized algorithms, probabilistic analysis of algorithms, and computational statistics, with applications ranging from combinatorial and stochastic optimization, massive data analysis, and sampling complexity to computational biology and computational finance.