Issue during pre-processing: "OUT_OF_ME+"

Hello,

I was checking the job status and it showed:
Current Status:
OUT_OF_ME+

Can someone explain why?
Thank you

This is a message from our server. The status name is truncated on the display; it means OUT_OF_MEMORY, i.e. the job tried to use more memory than it requested.
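For context: fixed-width status columns often cut long state names and mark the cut with a trailing `+` (SLURM-style schedulers do this, though whether that is the backend here is an assumption on my part). A minimal sketch of that truncation, which shows why `OUT_OF_MEMORY` appears as `OUT_OF_ME+`:

```python
def truncate_state(state: str, width: int = 10) -> str:
    """Fit a job-state name into a fixed column width,
    marking any cut-off with a trailing '+'."""
    if len(state) <= width:
        return state
    return state[: width - 1] + "+"

print(truncate_state("OUT_OF_MEMORY"))  # -> OUT_OF_ME+
print(truncate_state("RUNNING"))        # -> RUNNING (fits, unchanged)
```

So the `+` carries no meaning of its own; the full state is simply `OUT_OF_MEMORY`.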

Hello, thank you for your feedback. What do I need to do?

Hello,

I reduced the number of my samples and I still get the same error… What can I do to solve it? Is it possible to increase the memory?

This is not related to the number of samples but to the spectra themselves. You can either run the R package locally on a more powerful server, or subscribe to our Pro version, which allocates much more memory by default.


Hello,
I am having the same issue. I followed all the steps and each of my FASTQ files was under 100 MB as mentioned; everything was fine until it got stuck at sequence data processing (screenshot below). Is there any way I can finish the analysis? Would zipping the files help?
Thanks

I have been having the same issue lately, even with different projects…

The answer is already provided in my earlier post. The message simply means the job requires more memory than was allocated. Being able to upload files does not guarantee that later operations will succeed, since we cannot precisely estimate how much memory a job will require at run time.
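Since peak memory usage only becomes known at run time, one practical option before submitting is to measure it empirically on a local machine. A minimal sketch using Python's standard `resource` module (POSIX-only; the 50 MiB allocation is just an illustrative workload, not part of any real pipeline):

```python
import resource
import sys

def peak_memory_mib() -> float:
    """Peak resident set size of this process, in MiB.
    ru_maxrss is reported in KiB on Linux and in bytes on macOS."""
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    divisor = 1024 if sys.platform != "darwin" else 1024 * 1024
    return rss / divisor

# Simulate a memory-hungry step: allocate (and touch) ~50 MiB.
blob = bytearray(50 * 1024 * 1024)

print(f"peak RSS: {peak_memory_mib():.1f} MiB")
```

Running the actual analysis script this way (or under a tool such as `/usr/bin/time -v`) shows how much memory the data really needs, which helps decide whether a local server or a larger allocation is required.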

I am closing this thread.