Workflow as a loop - Bash or command line


I need to run 3 workflows in a loop, like this:

workflow1 -> workflow2 -> workflow3 -> workflow1 -> workflow2 -> workflow3 -> workflow1 -> workflow2 -> workflow3 -> ...

I found several candidate approaches, but none of them works well.

I tried it with a Java Snippet like in this link. With this solution there are some process problems: I have to close the KNIME process. This works on Windows (with the command Taskkill /F /IM knime.exe), but the behaviour is different on Linux, where it does not work (neither killall java nor killall knime).
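For what it's worth, the OS-specific kill command could be chosen in one place like this. This is only a sketch: the process names (knime.exe / knime) and the use of pkill -f are assumptions, and the class only builds the command, it does not run it:

```java
public class KillKnime {
    // Returns the OS-appropriate kill-by-name command for KNIME.
    // The process names "knime.exe" / "knime" are assumptions.
    static String[] killCommand() {
        String os = System.getProperty("os.name").toLowerCase();
        if (os.contains("win")) {
            return new String[] {"taskkill", "/F", "/IM", "knime.exe"};
        }
        // On Linux KNIME runs inside a JVM, so killall may not match the
        // process name; pkill -f matches the full command line instead.
        return new String[] {"pkill", "-f", "knime"};
    }

    public static void main(String[] args) {
        // Only print the command that would be executed.
        System.out.println(String.join(" ", killCommand()));
    }
}
```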

I tried it with a Bash node, with a command line like in this link, but it launches the graphical interface too. I should add that the error message from bash is (translated from French):

Execute failed: STDERR message: Usage: killall [-Z CONTEXT] [-u USER] [ -eIgiqrvw ] [ -SIGNAL ] NAME...
       killall -l, --list
       killall -V, --version
  -e,--exact          require an exact match for very long names
  -I,--ignore-case    case insensitive process name match
  -g,--process-group  kill the process group the program belongs to instead of the program
  -o,--older-than     kill processes created before TIME
  -y,--younger-than   kill processes created after TIME
  -i,--interactive    ask for confirmation before killing
  -l,--list           list all known signal names
  -q,--quiet          don't print complaints
  -r,--regexp         interpret NAME as an extended regular expression
  -s,--signal SIGNAL  send this signal instead of SIGTERM
  -u,--user USER      kill only process(es) running as USER
  -v,--verbose        report if the signal was successfully sent
  -V,--version        display version information
  -w,--wait           wait for processes to die
  -Z,--context REGEXP kill only process(es) having the given context
                      (must precede other arguments)

I think the Bash node can't kill its own process.

I tried with an External Tool node, but that node is meant for a different purpose.

Could somebody help me?



The problem was that each workflow was launched as a child of the previous one.

So workflow1 launches workflow2, and workflow2 launches workflow3.

That means the "workflow1 process" contains the "workflow2 process", and the workflow2 process contains the "workflow3 process". So if I kill the workflow1 process before workflow2, there are errors.

So I combined my 3 workflows into a single workflow.

In a Java Snippet node, I get the process id with this code:


// requires java.lang.management.ManagementFactory in the snippet's additional imports
String jvmName = ManagementFactory.getRuntimeMXBean().getName();
out_pid = jvmName.split("@")[0];
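Outside of a KNIME Java Snippet, the same PID lookup can be run as a standalone class (on Java 8, where ProcessHandle is not yet available, the runtime MX bean reports the name as "pid@hostname" on HotSpot JVMs):

```java
import java.lang.management.ManagementFactory;

public class PidDemo {
    // Returns this JVM's process id, parsed from the "pid@hostname"
    // string reported by the runtime MX bean.
    static String currentPid() {
        String jvmName = ManagementFactory.getRuntimeMXBean().getName();
        return jvmName.split("@")[0];
    }

    public static void main(String[] args) {
        System.out.println(currentPid());
    }
}
```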

Then I set up a cron job to schedule the next running time for my workflow:

// The code for the cron job
// (requires java.time.LocalDateTime, java.io.BufferedReader and
//  java.io.InputStreamReader in the snippet's additional imports)

LocalDateTime date_kill = LocalDateTime.now().plusHours(1); // e.g. kill in one hour; adjust to your schedule
int kill_minute = date_kill.getMinute();
int kill_hour = date_kill.getHour();
int kill_day = date_kill.getDayOfMonth();
int kill_month = date_kill.getMonthValue();
String kill_command = "";
String kill_cron = "";

String write = "crontab mycron";
String remove = "rm mycron";
String write_remove = write + " ; " + remove;

// crontab entry format: minute hour day month weekday command
kill_cron = kill_minute + " " + kill_hour + " " + kill_day + " " + kill_month + " * ";
kill_cron = kill_cron + "kill -9 " + c_pid + " ; ";
// the $ and " characters are escaped so bash writes them literally into mycron
kill_cron = "echo \"" + kill_cron
    + "ps --no-headers axk comm o pid,args | awk '\\$2 ~ \\\"/home/thie/knime-full_3.3.0/jre/bin/java\\\"{print \\$1}' | xargs kill -9"
    + "\" >> mycron";

kill_command = "crontab -l > mycron";
kill_command = kill_command + " ; " + kill_cron;
kill_command = kill_command + " ; " + write_remove;

try {
    ProcessBuilder pb = new ProcessBuilder("bash", "-c", kill_command);
    Process p = pb.start();
    // drain the process's stdout so its buffer does not fill up
    BufferedReader stdout = new BufferedReader(new InputStreamReader(p.getInputStream()));
    String s;
    while ((s = stdout.readLine()) != null) {
        // ignore the output
    }
    out_ResultCode = p.waitFor();
} catch (Exception ex) {
    ex.printStackTrace();
}
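The schedule-prefix construction can be checked in isolation. Here is a small sketch with a hypothetical fixed date and PID (the field order is the standard crontab one: minute, hour, day of month, month, weekday):

```java
import java.time.LocalDateTime;

public class CronLineDemo {
    // Builds the crontab schedule prefix "min hour day month * "
    // for the given time, matching the format used in the snippet above.
    static String cronPrefix(LocalDateTime t) {
        return t.getMinute() + " " + t.getHour() + " "
             + t.getDayOfMonth() + " " + t.getMonthValue() + " * ";
    }

    public static void main(String[] args) {
        // hypothetical kill time and PID, for illustration only
        LocalDateTime t = LocalDateTime.of(2017, 3, 14, 9, 30);
        System.out.println(cronPrefix(t) + "kill -9 12345");
        // → 30 9 14 3 * kill -9 12345
    }
}
```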

I noticed that if I end the workflow with a node like a Scatter Plot, the workflow kills its own process by itself.

So I don't need to kill any process; the process terminates on its own.

Hi Thierry,

Great that you were able to figure this out! And thank you for posting your solution, this will certainly be helpful for other users who want to achieve something similar.