Double digest revisited: Complexity and approximability in the presence of noisy data

M. Cieliebak, S. Eidenbenz, G.J. Woeginger

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

3 Citations (Scopus)

Abstract

We revisit the Double Digest problem, which occurs in the sequencing of large DNA strings and consists of reconstructing the relative positions of cut sites from two different enzymes. We first show that Double Digest is strongly NP-complete, improving upon previous results that only showed weak NP-completeness. Even the (experimentally more meaningful) variation in which coincident cut sites are disallowed turns out to be strongly NP-complete. In a second part, we model errors in data as they occur in real-life experiments: we propose several optimization variations of Double Digest that model partial cleavage errors, and we show APX-completeness for most of these variations. In a third part, we investigate these variations with the additional restriction that coincident cut sites are disallowed, and we show that it is NP-hard even to find feasible solutions in this case, making it impossible to guarantee any approximation ratio at all.
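To make the underlying combinatorial problem concrete, the following is a minimal brute-force sketch (in Python, not taken from the paper) of the Double Digest feasibility question: given the fragment-length multisets A, B, and C obtained by digesting with enzyme A alone, enzyme B alone, and both enzymes together, decide whether some ordering of A and of B produces combined cut sites whose double digest yields exactly C. The function names and the toy instance are illustrative assumptions; the exhaustive search is exponential and is only meant to state the problem precisely.

from itertools import permutations
from collections import Counter

def cut_sites(fragments):
    # Prefix sums of an ordered fragment list give the cut positions,
    # including the right end of the DNA string.
    sites, pos = [], 0
    for length in fragments:
        pos += length
        sites.append(pos)
    return sites

def double_digest_feasible(A, B, C):
    # Brute force over all orderings of A and B; exponential, illustration only.
    # Returns a pair of orderings that explains C, or None if no ordering works.
    if not (sum(A) == sum(B) == sum(C)):
        return None
    target = Counter(C)
    for order_a in set(permutations(A)):
        for order_b in set(permutations(B)):
            merged = sorted({0} | set(cut_sites(order_a)) | set(cut_sites(order_b)))
            fragments = [right - left for left, right in zip(merged, merged[1:])]
            if Counter(fragments) == target:
                return order_a, order_b
    return None

# Toy instance (hypothetical): a string of length 10 cut by enzyme A at
# positions {3, 7} and by enzyme B at position {5}.
A = [3, 4, 3]     # single digest with enzyme A
B = [5, 5]        # single digest with enzyme B
C = [3, 2, 2, 3]  # double digest with both enzymes
print(double_digest_feasible(A, B, C))  # e.g. ((3, 4, 3), (5, 5))

The optimization variations studied in the paper relax this exact multiset matching to account for partial cleavage errors in the measured fragment lengths.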
Original language: English
Title of host publication: Proceedings of the 9th International Computing and Combinatorics Conference (COCOON'2003, Big Sky MT, USA, July 25-28, 2003)
Editors: T. Warnow, B. Zhu
Place of publication: Berlin
Publisher: Springer
Pages: 519-527
ISBN (Print): 3-540-40534-8
Publication status: Published - 2003

Publication series

Name: Lecture Notes in Computer Science
Volume: 2697
ISSN (Print): 0302-9743
