I just ran into an issue with excessively large raw FASTQ files per sample, which causes fastp to time out with the current allocations in the rackham config. Since this is a fairly unusual situation, I don't think the default needs to change, but it would be good to have an additional profile for "fat" datasets. What do you think?
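As a rough sketch of what I have in mind (the file name, process selector, and resource values below are all placeholders I made up for illustration, not taken from the actual rackham config):

```groovy
// Hypothetical conf/rackham_fat.config -- an opt-in profile for unusually
// large samples, layered on top of the default rackham config.
process {
    // Selector name is an assumption; match whatever the fastp process
    // is actually called in the pipeline.
    withName: 'FASTP' {
        // Scale the walltime and memory with each retry instead of
        // raising the default for everyone.
        time          = { 48.h * task.attempt }
        memory        = { 64.GB * task.attempt }
        errorStrategy = 'retry'
        maxRetries    = 2
    }
}
```

Users with normal-sized data would keep the current defaults, and only "fat" datasets would opt in via `-c conf/rackham_fat.config` (or a dedicated profile).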