Kernelization
Theory of Parameterized Preprocessing

Authors: Fedor V. Fomin, Daniel Lokshtanov, Saket Saurabh, Meirav Zehavi

A complete introduction to recent advances in preprocessing analysis, or kernelization, with extensive examples using a single data set.

Language: English

Approximate price: 71.34 €

In Print (Delivery period: 14 days).

Publication date:
528 p. · 15.7x23.5 cm · Hardback
Preprocessing, or data reduction, is a standard technique for simplifying and speeding up computation. Written by a team of experts in the field, this book introduces a rapidly developing area of preprocessing analysis known as kernelization. The authors provide an overview of basic methods and important results, with accessible explanations of the most recent advances in the area, such as meta-kernelization, representative sets, polynomial lower bounds, and lossy kernelization. The text is divided into four parts, which cover the different theoretical aspects of the area: upper bounds, meta-theorems, lower bounds, and beyond kernelization. The methods are demonstrated through extensive examples using a single data set. Written to be self-contained, the book only requires a basic background in algorithmics and will be of use to professionals, researchers and graduate students in theoretical computer science, optimization, combinatorics, and related fields.
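To give a flavour of the subject, the following is a minimal sketch, not taken from the book, of one of the oldest kernelization arguments: Buss's reduction rules for Vertex Cover parameterized by the solution size k. The function name and graph representation below are illustrative choices of ours. Isolated vertices are removed, any vertex of degree greater than k must belong to every cover of size at most k, and if more than k² edges remain the instance is a no-instance; otherwise the reduced instance is an equivalent kernel whose size is bounded by a polynomial in k.

```python
# Illustrative sketch (not from the book): Buss's reduction rules for
# Vertex Cover parameterized by the budget k.

def buss_kernel(adj, k):
    """Exhaustively apply Buss's reduction rules.

    Rule 1: delete isolated vertices (they never help a cover).
    Rule 2: a vertex of degree > k must be in any cover of size <= k,
            so take it into the cover and decrement k.
    If more than k * k edges remain afterwards, the answer is NO.
    """
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # work on a copy
    changed = True
    while changed and k >= 0:
        changed = False
        for v in list(adj):
            if not adj[v]:                      # Rule 1: isolated vertex
                del adj[v]
                changed = True
            elif len(adj[v]) > k:               # Rule 2: high-degree vertex
                for u in adj[v]:
                    adj[u].discard(v)
                del adj[v]
                k -= 1
                changed = True
                break
    edges = sum(len(nbrs) for nbrs in adj.values()) // 2
    if k < 0 or edges > k * k:
        return None, k, False                   # provably a NO-instance
    return adj, k, True                         # kernel with at most k^2 edges

# Example: a star with five leaves plus one extra edge, budget k = 2.
graph = {0: {1, 2, 3, 4, 5}, 1: {0}, 2: {0}, 3: {0}, 4: {0}, 5: {0, 6}, 6: {5}}
kernel, budget, feasible = buss_kernel(graph, 2)
print(kernel, budget, feasible)
```

The point of the example is the guarantee, not the rules themselves: after the preprocessing, the surviving instance has size bounded purely in terms of the parameter k, which is exactly the kind of statement the book studies, bounds from above, from below, and beyond.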
1. What is a kernel?
Part I. Upper Bounds: 2. Warm up; 3. Inductive priorities; 4. Crown decomposition; 5. Expansion lemma; 6. Linear programming; 7. Hypertrees; 8. Sunflower lemma; 9. Modules; 10. Matroids; 11. Representative families; 12. Greedy packing; 13. Euler's formula.
Part II. Meta Theorems: 14. Introduction to treewidth; 15. Bidimensionality and protrusions; 16. Surgery on graphs.
Part III. Lower Bounds: 17. Framework; 18. Instance selectors; 19. Polynomial parameter transformation; 20. Polynomial lower bounds; 21. Extending distillation.
Part IV. Beyond Kernelization: 22. Turing kernelization; 23. Lossy kernelization.
Fedor V. Fomin is Professor of Computer Science at the Universitetet i Bergen, Norway. He is known for his work in algorithms and graph theory. He has co-authored two books, Exact Exponential Algorithms (2010) and Parameterized Algorithms (2015), and received the EATCS Nerode Prize in 2015 and 2017 for his work on bidimensionality and Measure and Conquer.
Daniel Lokshtanov is Professor of Informatics at the Universitetet i Bergen, Norway. His main research interests are in graph algorithms, parameterized algorithms, and complexity. He is a co-author of Parameterized Algorithms (2015) and is a recipient of the Meltzer prize, the Bergen Research Foundation young researcher grant, and an ERC starting grant on parameterized algorithms.
Saket Saurabh is Professor of Theoretical Computer Science at the Institute of Mathematical Sciences, Chennai, and Professor of Computer Science at the Universitetet i Bergen, Norway. He has made important contributions to every aspect of parameterized complexity and kernelization, especially to general-purpose results in kernelization and applications of extremal combinatorics in designing parameterized algorithms. He is a co-author of Parameterized Algorithms (2015).
Meirav Zehavi is Assistant Professor of Computer Science at Ben-Gurion University. Her research interests lie primarily in the field of parameterized complexity. In her Ph.D. studies, she received three best student paper awards.