Hello Biostars community,
I have run into several problems when running some PoPoolation2 scripts on my university's clusters. I haven't seen these issues in any of the older posts here or in the PoPoolation2 help forum, and I was hoping you could help me solve them.
For cmh-test.pl, I got this error message: "line 89: 45909 Segmentation fault (core dumped)". The script still runs for 4 hr on the cluster, but the final output file has 0 bytes of data. This was run on a node with 512 GB of memory (I get the same problem on a cluster with 64 GB of memory).
For fst-sliding.pl (per SNP), I got this error message: "line 89: 34785 Segmentation fault". The script still runs for 22 hr, and I get an output file of only 24 KB, which seems far too small. This was run on a node with 64 GB of memory.
For fst-sliding.pl (window size of 2000), I got this error message: "line 89: 26193 Segmentation fault". The script still runs for 6 hr, and I get an output file of 72 MB, but the table is full of "na" for every population comparison at every window. This was run on a node with 64 GB of memory.
For snp-frequency-diff.pl, I got this error message: "line 89: 37710 Segmentation fault". The script still runs for 8 hr, and I get a pwc file and an rc file of 12 KB each. These output files also seem too small. This was run on a node with 64 GB of memory.
Does anyone have an idea of what this segmentation fault means? I know it has to do with memory allocation in the scripts (or in some functions they call), but I haven't been able to isolate the problem since I don't know Perl well enough. I looked at the lines indicated in the error messages inside the Perl scripts, but I can't figure out what's going on. Could this just be a problem with my university's clusters? I will be talking to the cluster admins as well, but if any of you have comments, advice, or insight into this, please let me know.
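One thing I thought of trying (a rough sketch, not something I'm sure is the cause): since all four scripts read the same sync file, a truncated or malformed line in it might be one plausible trigger. The file name and column counts below are just placeholders for illustration; the idea is to check that every line has the same number of tab-separated columns before blaming the cluster.

```shell
# Build a tiny example sync file: line 1 has 5 tab-separated columns,
# line 2 is deliberately truncated to 4 columns to mimic a corrupt line.
printf '2L\t5002\tG\t0:0:0:12:0:0\t0:0:0:15:0:0\n2L\t5009\tA\t10:0:0:0:0:0\n' > example.sync

# Report any line whose column count differs from the first line's.
awk -F'\t' 'NR==1{n=NF} NF!=n{print "line " NR ": " NF " columns (expected " n ")"}' example.sync
```

On my real data I would run the awk check on the actual sync file and, if it prints nothing, at least rule out an inconsistent input format.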
Thank you very much for your help!