I tried FeatureFinderMetabo with the tutorial data (2012_02_03_PStd_10_1.mzML, 2012_02_03_PStd_10_2.mzML, 2012_02_03_PStd_10_3.mzML, 2012_02_03_PStd_050_1.mzML, 2012_02_03_PStd_050_2.mzML, 2012_02_03_PStd_050_3.mzML) and optimized parameters, and it takes only 3-5 minutes, even though each file is >200 MB. However, when I try it with different experimental files (5 mzML files, or the same 5 files as mzXML via the FileConverter node), each ~70 MB, it takes anywhere from 2 to 4 hours. I understand that the quality, and perhaps the size, of the data matters, but such a large gap between file size and processing time seems odd: I would expect smaller files to take less time.
Did you check whether your data is centroided and that there are no zero-intensity peaks?
Sometimes you need to do some cleanup first.
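For reference, the zero-intensity check above can be done with a few lines of Python. This is a minimal sketch that works on plain (m/z, intensity) lists; in practice you would get those arrays from a reader such as pyOpenMS (e.g. `spec.get_peaks()` on each spectrum of a loaded `MSExperiment`) — the file-loading part is omitted here so the logic stands on its own.

```python
# Sketch: scan peak lists for zero-intensity entries and strip them.
# `spectra` is an iterable of (mz_list, intensity_list) pairs, e.g. as
# returned per spectrum by pyOpenMS's spec.get_peaks().

def count_zero_intensity_peaks(spectra):
    """Return (total_peaks, zero_peaks) across all spectra."""
    total = zeros = 0
    for mz, intensities in spectra:
        total += len(intensities)
        zeros += sum(1 for i in intensities if i == 0.0)
    return total, zeros

def strip_zero_peaks(mz, intensities):
    """Drop zero-intensity entries from one spectrum's peak arrays."""
    kept = [(m, i) for m, i in zip(mz, intensities) if i > 0.0]
    return [m for m, _ in kept], [i for _, i in kept]
```

If `zero_peaks` comes back large relative to `total_peaks`, that is a strong hint the data needs cleanup before feature finding.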
If you want (and are allowed to), you can send us/upload a test file, and my colleague (who is more familiar with the internals of this tool) and I will have a look.
Thank you very much!
Yes, they are publicly available (I downloaded them from the MetaboLights database): https://drive.google.com/open?id=1f4z4_QNagyhqdnfM409otQRJbbZ5fKlz for the 5 mzML files and https://drive.google.com/open?id=1qWsO8VeK_a0CH-3KEHKyu8Evrs7hWnju for the 5 mzXML files.
With the optimized parameters (as proposed in the OpenMS tutorial handout, https://abibuilder.informatik.uni-tuebingen.de/archive/openms/Tutorials/Handout/master/handout.pdf ), it takes 2 hours for the 5 mzML files and 3 hours 40 minutes for the 5 mzXML files. How could I check whether they have zero intensities? Do you mean after FeatureFinderMetabo has processed them? Also, when I open them, they report being centroided.
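As a side note on the "they say they are centroided" point: the declared spectrum type in an mzML file is just metadata, recorded via the standard PSI-MS cvParams (MS:1000127 = centroid spectrum, MS:1000128 = profile spectrum), and a file can declare centroid while still containing profile-like data. A crude sketch for inspecting what a file declares (a proper check would use a real mzML parser; this just searches the raw XML text):

```python
# Sketch: report the spectrum type(s) an mzML file declares via cvParams.
# MS:1000127 / MS:1000128 are the standard PSI-MS accessions for
# centroid and profile spectra, respectively.

def declared_spectrum_types(mzml_text):
    """Return the set of spectrum types declared in the given mzML XML text."""
    types = set()
    if 'accession="MS:1000127"' in mzml_text:
        types.add("centroid")
    if 'accession="MS:1000128"' in mzml_text:
        types.add("profile")
    return types
```

Seeing "profile" (or both) in the result would explain long FeatureFinderMetabo runtimes, since the tool expects centroided input.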