An ultra-fast all-in-one FASTQ preprocessor (QC/adapters/trimming/filtering/splitting/merging...)
MIT License
Published by sfchen over 1 year ago
Published by sfchen almost 3 years ago
Fix the bug in the mode of interleaved input
Published by sfchen about 3 years ago
Published by sfchen about 3 years ago
The threading and I/O modules have been completely rewritten to generate reproducible outputs and to greatly improve performance. The new libraries libisal and libdeflate were introduced to replace the slow zlib. Although this may make compilation somewhat more difficult, the performance improvement is well worth it.
In many cases, fastp v0.23.0 can be twice as fast as previous versions; the gain is especially obvious when the compression level is set to 6 or higher.
The threading randomness issue has been addressed, so the output files have identical MD5 checksums when you run the same job twice: the results are completely reproducible.
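Reproducibility can be checked by comparing checksums of two runs of the same job. A minimal sketch; the fastp invocations are illustrative and commented out (file names are placeholders), and two stand-in files emulate byte-identical outputs to show the checksum comparison itself:

```shell
# Illustrative fastp runs (assumed standard flags; not executed here):
#   fastp -i in.R1.fq.gz -I in.R2.fq.gz -o run1.R1.fq.gz -O run1.R2.fq.gz
#   fastp -i in.R1.fq.gz -I in.R2.fq.gz -o run2.R1.fq.gz -O run2.R2.fq.gz
# Stand-in files emulating two byte-identical outputs:
printf '@r1\nACGT\n+\nFFFF\n' > run1.fq
printf '@r1\nACGT\n+\nFFFF\n' > run2.fq
a=$(md5sum run1.fq | cut -d' ' -f1)
b=$(md5sum run2.fq | cut -d' ' -f1)
# With v0.23.0+, the real outputs should compare equal the same way.
[ "$a" = "$b" ] && echo "identical: reproducible output"
rm -f run1.fq run2.fq
```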
Published by sfchen about 3 years ago
Support deduplication (--dedup), and refine the duplication evaluation algorithm.
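A rough illustration of what read-level deduplication does. This toy keys only on the sequence line of 4-line FASTQ records; it is not fastp's actual algorithm (which also handles paired reads and uses its refined duplication evaluation). The commented fastp command is the assumed usage:

```shell
# Assumed usage: fastp -i in.fq.gz -o out.fq.gz --dedup
# Toy stand-in: keep only the first record per unique sequence line.
printf '@a\nACGT\n+\nFFFF\n@b\nACGT\n+\nFFFF\n@c\nTTTT\n+\nFFFF\n' > toy.fq
awk 'NR%4==1{h=$0} NR%4==2{s=$0} NR%4==3{p=$0}
     NR%4==0{if(!(s in seen)){seen[s]=1; print h"\n"s"\n"p"\n"$0}}' toy.fq
# prints records @a and @c only; the duplicate @b (same sequence) is dropped
rm -f toy.fq
```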
Published by sfchen over 4 years ago
https://cdn.plot.ly is not reachable from some locations, which blocks fastp's HTML figures from rendering.
Starting with this release, plotly.js is distributed from opengene.org instead.
Published by sfchen over 5 years ago
Revise overlap detection, PE correction and adapter trimming
Support average quality score filter
Count polyX
...
Published by sfchen over 5 years ago
Published by sfchen over 5 years ago
New -m/--merge option to merge paired reads.
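A hypothetical usage sketch for merging; file names are placeholders, and the flag names (--merged_out for successfully merged pairs, -o/-O for reads left unmerged) are assumed from fastp's documented interface. The fastp line is shown as a comment since the tool may not be installed:

```shell
# Assumed invocation for merging overlapping paired-end reads:
#   fastp -i R1.fq.gz -I R2.fq.gz -m --merged_out merged.fq.gz \
#         -o unmerged.R1.fq.gz -O unmerged.R2.fq.gz
# Pairs whose reads overlap are combined into single longer reads in
# merged.fq.gz; pairs that cannot be merged go to the -o/-O outputs.
echo "merged pairs -> merged.fq.gz; unmerged pairs -> -o/-O outputs"
```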
Published by sfchen over 5 years ago
Published by sfchen almost 6 years ago
Published by sfchen almost 6 years ago
1. Fix an issue in adapter detection to provide a higher detection rate
2. Add the detect_adapter_for_pe option to allow adapter detection for PE data
3. Support trimming reads to max_len
4. Improve adapter trimming for reads such as adapter dimers
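A hedged sketch of how the options above might be combined on paired-end data; file names and the example length are placeholders, and the flag spellings are assumed from fastp's documented interface. The command is commented out since fastp may not be installed:

```shell
# Assumed invocation: enable PE adapter detection and cap read length:
#   fastp -i R1.fq.gz -I R2.fq.gz -o out.R1.fq.gz -O out.R2.fq.gz \
#         --detect_adapter_for_pe --max_len1 150
# --detect_adapter_for_pe turns on adapter auto-detection for paired-end
# input; the max-length option trims each read down to the given length.
echo "PE adapter detection on; reads trimmed to <= 150 bp"
```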
Published by sfchen about 6 years ago
Published by sfchen about 6 years ago
Published by sfchen over 6 years ago
2x faster and 8% smaller
Published by sfchen over 6 years ago
Published by sfchen over 6 years ago
Published by sfchen over 6 years ago
Published by sfchen over 6 years ago