Hi everyone,

I am using cutadapt to trim adapters from my paired-end metagenome reads (2 x 150 bp, Illumina NovaSeq).

Here is my code:

# loop over all R1 files, derive the shared sample prefix, and run cutadapt on each R1/R2 pair
for i in *_R1_001.fastq;
do
    SAMPLE=$(echo ${i} | sed "s/_R1_001.fastq//")
    echo "${SAMPLE}_R1_001.fastq ${SAMPLE}_R2_001.fastq"
    cutadapt --pair-adapters --pair-filter=both --discard-untrimmed \
        -b ACTGCGAA -B TTCGCAGT \
        -o ${SAMPLE}_R1_001_cut.fastq -p ${SAMPLE}_R2_001_cut.fastq \
        ${SAMPLE}_R1_001.fastq ${SAMPLE}_R2_001.fastq
done

I've done this a few ways (keeping untrimmed sequences, discarding them, etc.), and for whatever reason I am keeping only 0.2% of my reads (I will attach a picture of the results).
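For example, one of the variants I tried (keeping untrimmed read pairs) looked roughly like this on a single sample; the sample name here is just a placeholder:

# same adapters, but without --discard-untrimmed, so untrimmed pairs are kept;
# the trimming report goes to stdout, so redirect it to a file for inspection
cutadapt --pair-adapters --pair-filter=both \
    -b ACTGCGAA -B TTCGCAGT \
    -o sampleA_R1_001_cut.fastq -p sampleA_R2_001_cut.fastq \
    sampleA_R1_001.fastq sampleA_R2_001.fastq > sampleA_cutadapt_report.txt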

I ran the sequences through FastQC and every single metagenome is of great quality, so I am not sure why I am losing so many reads!

Does anyone know why this could be happening, and how to increase the fraction of reads I keep?

I was thinking of trying Trimmomatic, but I am not sure how to go about using custom adapters with it.
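If I were to try Trimmomatic, I think the call with a custom adapter FASTA would look roughly like this (the adapter file name, jar path, and the ILLUMINACLIP thresholds below are just my guesses), but please correct me if this is wrong:

# custom_adapters.fa would hold the two adapter sequences, e.g.:
# >adapter_fwd
# ACTGCGAA
# >adapter_rev
# TTCGCAGT

# paired-end mode: two inputs, then paired/unpaired outputs for R1 and R2,
# then the adapter-clipping step pointing at the custom FASTA
java -jar trimmomatic-0.39.jar PE -phred33 \
    sampleA_R1_001.fastq sampleA_R2_001.fastq \
    sampleA_R1_paired.fastq sampleA_R1_unpaired.fastq \
    sampleA_R2_paired.fastq sampleA_R2_unpaired.fastq \
    ILLUMINACLIP:custom_adapters.fa:2:30:10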
