Posted by embueno, 2 hours ago

Hi everyone,

I want to analyze WGBS data for differentially methylated loci using the package DSS, which builds a BSseq object (from the bsseq package) that is then used for model fitting and statistical testing.

My main problem is that, after multiple attempts, the object and the arrays derived from it are too memory-intensive for my lab server. Even after splitting the object into six smaller pieces of roughly 5 million loci each, the script keeps running for hours, sometimes more than 24.

I've read that using the DelayedArray package along with block processing will do the trick. However, I'm quite lost on how to do this. Has anyone dealt with a similar issue when working with large datasets in R? If so, I'd appreciate some advice on how to move forward with this analysis.
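For reference, this is roughly the kind of workflow I'm attempting, sketched with placeholder file and sample names. The `BACKEND = "HDF5Array"` argument to `bsseq::read.bismark()` keeps the methylation and coverage matrices on disk instead of in RAM, which is the DelayedArray-backed route I've read about. I'm not certain whether DSS's `DMLtest()` can operate on an HDF5-backed object directly or whether it realizes it in memory, so treat this as a sketch, not a working solution:

```r
## Sketch only: file names, sample names, and thresholds are placeholders.
library(bsseq)       # BSseq objects; read.bismark()
library(HDF5Array)   # on-disk backend for DelayedArray
library(DSS)         # DMLtest(), callDML()

## Store M/Cov matrices in an on-disk HDF5 file rather than in memory.
bs <- read.bismark(
  files   = c("ctrl1.cov", "trt1.cov"),              # placeholder coverage files
  colData = DataFrame(row.names = c("ctrl1", "trt1")),
  BACKEND = "HDF5Array",
  dir     = "bsseq_h5",                              # where the HDF5 store goes
  replace = TRUE
)

## Optionally shrink the per-block memory footprint for block processing.
DelayedArray::setAutoBlockSize(1e8)  # bytes per block; tune to the server

## Two-group test; smoothing = TRUE borrows information from nearby CpGs.
dml  <- DMLtest(bs, group1 = "ctrl1", group2 = "trt1", smoothing = TRUE)
dmls <- callDML(dml, p.threshold = 0.001)
```

If `DMLtest()` turns out not to accept the HDF5-backed object, my fallback has been to subset the BSseq object by chromosome and run the test per chunk, which is the "splitting" approach that is currently taking over 24 hours.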

Thanks!
