Kernelization: Theory of Parameterized Preprocessing

Bibliographic Details
Other Authors: Fomin, Fedor V., author; Lokshtanov, Daniel, 1984-, author; Saurabh, Saket, author; Zehavi, Meirav, author
Format: Electronic book
Language: English
Published: Cambridge : Cambridge University Press, 2019.
Series: CUP ebooks.
Online Access: Connect to the electronic version
View at Universidad de Navarra: https://innopac.unav.es/record=b45481696*spi
Description
Summary: Preprocessing, or data reduction, is a standard technique for simplifying and speeding up computation. Written by a team of experts in the field, this book introduces a rapidly developing area of preprocessing analysis known as kernelization. The authors provide an overview of basic methods and important results, with accessible explanations of the most recent advances in the area, such as meta-kernelization, representative sets, polynomial lower bounds, and lossy kernelization. The text is divided into four parts, which cover the different theoretical aspects of the area: upper bounds, meta-theorems, lower bounds, and beyond kernelization. The methods are demonstrated through extensive examples using a single data set. Written to be self-contained, the book only requires a basic background in algorithmics and will be of use to professionals, researchers, and graduate students in theoretical computer science, optimization, combinatorics, and related fields.
Physical Description: 1 online resource (xiv, 515 pages)
Format: Mode of access: World Wide Web.
ISBN: 9781107415157