Segmentation fault (core dumped) #26
Comments
Hi, from what I see from the command you are trying to use a compressed fastq file, which RATTLE doesn't support as of now. You will need to uncompress it first. Also, be sure to filter out small reads (we usually filter out those smaller than 150bp). Best, |
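For anyone hitting the same setup problem, the two steps above (uncompress the FASTQ, then drop reads under 150 bp) can be sketched with standard tools. This is a minimal sketch: the file names are placeholders, and the toy data is generated only to make the example self-contained.

```shell
# Build a toy FASTQ with one 200 bp read and one 4 bp read, then compress it
printf '@r1\n%s\n+\n%s\n@r2\nACGT\n+\nIIII\n' \
  "$(head -c 200 /dev/zero | tr '\0' 'A')" \
  "$(head -c 200 /dev/zero | tr '\0' 'I')" > reads.fastq
gzip -f reads.fastq                    # pretend this is the compressed input
gunzip -k reads.fastq.gz               # step 1: RATTLE needs an uncompressed file
# step 2: drop every 4-line FASTQ record whose sequence is shorter than 150 bp
awk '{h=$0; getline s; getline p; getline q;
      if (length(s) >= 150) print h "\n" s "\n" p "\n" q}' \
    reads.fastq > reads.min150.fastq
grep -c '^@r' reads.min150.fastq       # → 1 (only the 200 bp read survives)
```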
Hi! Yes, please send me the fastq file, if that's possible, to ivan.delarubia@upf.edu |
I have a machine with 128 CPUs and 1 TB of memory, but I still cannot run the command. Could you help me? |
Hi there, Do you run into any problems when using RATTLE with the example toyset dataset? Can you please check whether your reads contain any invalid bases? RATTLE could run into this issue when generating kmers with reads containing invalid bases. Hope this helps, |
I encountered no issues while running RATTLE with the example toyset dataset. To eliminate low-quality bases, I used fastp. Nevertheless, both the initial fastq file (20G) and the trim.fastq file (12G) ran into identical problems when running RATTLE. Additionally, upon counting the bases in my fastq file, I did not observe any "N" bases. Could you help me check this issue? :) |
Hi, Valid bases are A,T,C,G,U. All other bases in reads are considered as invalid, including 'N'. No need to worry about 'N', RATTLE will filter it out. Could you please provide your RATTLE command? If it is possible, can you please run RATTLE with your dataset with '--verbose' flag and provide the progress bar? This could provide me more information and identify why and where RATTLE went wrong. Thanks, |
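As a quick way to run the check suggested above, one can scan the sequence lines for any character outside the A/C/G/T/U/N alphabet. A minimal sketch, with a toy file standing in for the real FASTQ:

```shell
# Toy FASTQ with one invalid character ('X') to demonstrate the check
printf '@r1\nACGTXACGT\n+\nIIIIIIIII\n' > sample.fastq
# Sequence lines are every 4th line starting at line 2; list and count
# anything outside the valid alphabet (plus 'N', which RATTLE filters)
awk 'NR % 4 == 2' sample.fastq | grep -o '[^ACGTUN]' | sort | uniq -c
# reports one 'X'; an empty result would mean no invalid bases
```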
Do you have very short or extremely long reads in your input?
E.
|
Yes, you might be correct. I checked my fastq file and confirmed that it contains reads longer than 150. However, I neglected to determine the length of the longest reads. |
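To answer the question about very short or extremely long reads, a one-liner like the following reports the read count plus the shortest, longest, and mean read length; the toy file here is just a stand-in for the real input:

```shell
# Toy FASTQ with a 4 bp and an 8 bp read, to make the example runnable
printf '@r1\nACGT\n+\nIIII\n@r2\nACGTACGT\n+\nIIIIIIII\n' > toy.fastq
# Summarise lengths of the sequence lines (every 4th line starting at 2)
awk 'NR % 4 == 2 {n++; L = length($0); s += L;
     if (min == "" || L < min) min = L; if (L > max) max = L}
     END {printf "reads=%d min=%d max=%d mean=%.1f\n", n, min, max, s/n}' toy.fastq
# → reads=2 min=4 max=8 mean=6.0
```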
Facing this issue:
==========================================
|
```bash
echo "Starting at $(date)"
echo "Job name: ${SLURM_JOB_NAME}, Job ID: ${SLURM_JOB_ID}"
echo "I have

# Navigate to Porechop directory
cd /home/rkumar/Porechop || exit

# Define input and output paths
input_file="/scratch/g/........./Nanopore_cDNA/A3HE/A3HE.fastq"
output_folder="/scratch/g/...../Nanopore_cDNA/A3HE/Rattle_A3HE"

# Check if input file exists
if [ ! -f "$input_file" ]; then
fi

# Create output directory if it does not exist
mkdir -p "$output_folder/clusters"

# Step 1: Filter reads by length (if needed, adjust according to your data)
filtered_file="${input_file%.fastq}_filtered.fastq"
porechop -i "$input_file" -o "$filtered_file" --discard_middle --min_split_read_size 150

# Check if filtered file was created
if [ ! -f "$filtered_file" ]; then
fi

# Navigate to Rattle directory
cd /home/rkumar/RATTLE || exit

# Step 2: Run the RATTLE commands
./rattle cluster -i "$filtered_file" -o "$output_folder" --rna -B 0.5 -b 0.3 -f 0.2
./rattle cluster_summary -i "$filtered_file" -c "$output_folder/clusters.out" > "$output_folder/cluster_summary.tsv"
./rattle extract_clusters -i "$filtered_file" -c "$output_folder/clusters.out" -o "$output_folder/clusters" --fastq

# Step 3: Correct reads
./rattle correct -i "$filtered_file" -c "$output_folder/clusters.out" -o "$output_folder"

# Step 4: Merge consensi files and run polishing step
consensi_file="$output_folder/consensi.fq"
cat "$output_folder"/*/consensi.fq > "$consensi_file"

# Check if consensi file was created
if [ ! -f "$consensi_file" ]; then
fi

./rattle polish -i "$consensi_file" -o "$output_folder" --rna

echo "Finished at $(date)"

# Periodically log memory usage
while true; do
done &
```
|
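The memory-monitoring loop at the end of the script above is elided in the post; a hypothetical version (the interval and the logged fields are assumptions, and the loop is bounded here so the example terminates) might look like:

```shell
# Append a timestamped memory snapshot on each pass; a real job script
# would run `while true; do ...; sleep 60; done &` in the background
log="mem_usage.log"
: > "$log"
for i in 1 2 3; do
  echo "$(date +%s) $(free -m | awk 'NR == 2 {print $3 " MB used"}')" >> "$log"
  sleep 0.1
done
wc -l < "$log"   # → 3 (one line per pass)
```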
Hi, thanks for developing the tool. As the title says, when running `rattle cluster` it returns Segmentation fault (core dumped).
This is the code.
and the output is:
```
RNA mode: 1
Reading fasta file... Done
Segmentation fault (core dumped)
```
The input file has fewer than 500 thousand reads, and the device has 16 cores/32 threads with 1 TB of memory. From the previous discussion, limited memory might be the problem, but I think my input has far fewer reads than that. Hope anyone could discuss this.