Start a workflow, e.g. a database query, after a certain time

Sorry - which one do you mean?

CRASHED :expressionless:

Next try :stuck_out_tongue_winking_eye:, again via the WAIT node. I have now built something with the
Windows Task Scheduler, which starts a batch file that creates
a simple text file and overwrites it each time. I configured the WAIT node
in KNIME to wait for the modification of that file and then execute. That worked.

I am trying to start the subsequent workflow behind the WAIT node via a LOOP node. :thinking:

The annoying thing about the WAIT node is that it is only executable once, so the workflow cannot be controlled with it.

Or is there any way to control the WAIT node?


If anyone here has a bright idea I am very grateful for a tip.


My suggestion would still be to put the WAIT node within the loop and put the times of day you want it to be executed in a table, to control when the loop continues.
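To illustrate the idea outside of KNIME, here is a minimal Python sketch, assuming a hypothetical setup: the schedule table becomes a list of times of day, and the loop sleeps until each entry before running the work. All names (`SCHEDULE`, `run_workflow`, etc.) are made up for illustration and do not come from the workflow.

```python
from datetime import datetime, time as dtime
import time

# Hypothetical list of daily execution times. In KNIME this would be a
# table feeding the loop; here it is a plain Python list for illustration.
SCHEDULE = [dtime(8, 0), dtime(12, 30), dtime(17, 0)]

def run_workflow():
    # Placeholder for the actual work (e.g. the database query).
    print("workflow executed at", datetime.now().isoformat(timespec="seconds"))

def seconds_until(target: dtime, now: datetime) -> float:
    """Seconds from `now` until today's `target` time (negative if already past)."""
    target_dt = now.replace(hour=target.hour, minute=target.minute,
                            second=target.second, microsecond=0)
    return (target_dt - now).total_seconds()

def run_daily_schedule(schedule, sleep=time.sleep, clock=datetime.now):
    """Wait for each scheduled time in turn and run the workflow once per entry."""
    for target in sorted(schedule):
        delay = seconds_until(target, clock())
        if delay > 0:
            sleep(delay)   # the 'wait' step inside the loop
        run_workflow()     # times already past (delay <= 0) still run once
```

The `sleep` and `clock` parameters are only there so the logic can be tested without real waiting; in practice the defaults are used.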

You might want to download the example and maybe try to adapt it so we could try and discuss what you would need.

You could of course also use a task scheduler to start a batch file, or even start it several times a day. In any case this would need a computer to run continuously, which would also be true if you just let the KNIME workflow wait inside the loop.

If you use batch mode it might be necessary to use a different KNIME workspace if you want to use the machine at the same time.

The KNIME Server on a dedicated machine would of course be another option.

First of all, @mlauber71: thank you!

My first question about your example workflow would be:
What is the function of add_seconds?

I’m trying to figure out all the functions of your workflow…


@USCHUKN1ME I tried to build a system you might be able to expand and use.

The idea is to have a workflow that would iterate a number of times (v_number_of_intervals) with a set interval of seconds (v_seconds_interval) between them. The workflow looks for a trigger_file.csv; if this file has changed since the last time the loop ran, a component is executed. If the file has not changed, nothing happens.
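The same trigger logic can be sketched in a few lines of Python, as a rough, KNIME-independent illustration. The parameter names loosely mirror the flow variables mentioned above, and the `on_change` callback stands in for the component; none of this is taken from the actual workflow.

```python
import os
import time

def watch_trigger(path, number_of_intervals, seconds_interval,
                  on_change, sleep=time.sleep):
    """Iterate a fixed number of times with a set interval between runs.
    If the trigger file's modification time changed since the previous
    scan, execute the callback; otherwise do nothing. The first scan
    only records a baseline."""
    last_mtime = None
    for i in range(number_of_intervals):
        mtime = os.path.getmtime(path) if os.path.exists(path) else None
        if last_mtime is not None and mtime is not None and mtime != last_mtime:
            on_change(i)   # file changed since last scan -> run the 'component'
        last_mtime = mtime
        if i < number_of_intervals - 1:
            sleep(seconds_interval)
```

Using the modification time keeps the check cheap; comparing file contents (see the scan-history idea below in the thread) is more robust if a process rewrites the file with identical data.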

Admittedly not the most elegant way, but you could build a small automatic execution machine with this, as long as your workflow keeps running.

You can monitor the effect: every time the trigger file changes, the component gets executed. Here there are 10 iterations. At the second iteration the component executes, then nothing happens, and then at iterations 7, 8 and 9 the file has changed and the component executes again (the changes are simulated in the m_001 workflow; in a real-life scenario they would occur through outside factors).

This might serve as an example of how such a thing could work. Obviously it would need some work, and one can debate whether this is the best way to go forward. I would recommend using the KNIME Server instead :slight_smile:

(please download the complete workflow group)

This job m_001 is there to prepare the setting and simulate the changing trigger_file.csv.

You have three components:

  • trigger_file.csv - a simple CSV file that would change over time (a new delivery might be sent to a folder)
  • trigger_meta_list - a small database that would store all the scans of the trigger_file.csv so you could see if something has changed
  • sample_00 - a dummy table in the same H2 database where you would either do something (if the trigger file is newer) or not
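The trigger_meta_list idea above — storing every scan so you can tell whether something has changed — can be sketched with Python and SQLite. This is only an illustration of the concept; the table and column names are invented here and are not taken from the H2 database in the workflow.

```python
import hashlib
import sqlite3

def scan_and_decide(conn, path):
    """Record a scan of the trigger file in a small database and report
    whether its content changed since the previous recorded scan.
    The very first scan only establishes a baseline and reports False."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    conn.execute("CREATE TABLE IF NOT EXISTS trigger_meta_list "
                 "(scan_id INTEGER PRIMARY KEY AUTOINCREMENT, digest TEXT)")
    # Fetch the digest from the most recent scan, if any.
    row = conn.execute("SELECT digest FROM trigger_meta_list "
                       "ORDER BY scan_id DESC LIMIT 1").fetchone()
    conn.execute("INSERT INTO trigger_meta_list (digest) VALUES (?)", (digest,))
    conn.commit()
    return row is not None and row[0] != digest
```

Hashing the file content (instead of checking the modification time) means a rewrite with identical data does not count as a change.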


:grinning: :+1: wow - thanks a lot @mlauber71.
I'll try to follow your workflow.

About the KNIME Server:
Costs for a KNIME Server :thinking:
In my opinion, an investment of 12,500 euros for one workflow is
somewhat disproportionate.


1 Like

@USCHUKN1ME this very much depends on your business model and what you want to do with it. The KNIME Server is much more than one workflow :slight_smile: - I have a collection of what is possible there:

Also, you can use KNIME Server on AWS and just pay for what you use :slight_smile:

And maybe if you contact the KNIME team they might help you with a test license so you can see for yourself …

1 Like

I know I’m very late to the party, and given all the replies after this question my reply might not be needed anymore. Still, I think it fits the question. For “timed loops”, please look at the Vernalis community extensions. Yes, they are in the cheminformatics/life-science space, but they also have some general-purpose nodes like timed loops:


Maybe these can be helpful for your workflow or workflows in the future.


Hello @kienerj ,

thank you.

You are not too late :slightly_smiling_face:

Chunk Loop Run-to-time Loop Start(A)

Chunk Loop Run-to-time Loop Start(B)

Those are two interesting loop nodes, which I'll take a closer look at.
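For readers unfamiliar with the "run-to-time" idea, here is a rough Python analogue of what such a chunked, time-limited loop does. This is only a sketch of the concept, not the Vernalis implementation; all names are invented.

```python
from datetime import datetime, timedelta

def run_to_time(items, chunk_size, deadline, process, clock=datetime.now):
    """Process `items` in chunks, stopping early once `deadline` is reached:
    each iteration checks the clock before taking the next chunk, so the
    loop ends at (roughly) the given time rather than after a fixed count."""
    done = []
    for start in range(0, len(items), chunk_size):
        if clock() >= deadline:
            break   # time is up; remaining chunks are skipped
        chunk = items[start:start + chunk_size]
        done.extend(process(chunk))
    return done
```

Note that a chunk already started is finished even if the deadline passes during it; the check happens only between chunks.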

First, though, I'll take a closer look at @mlauber71's suggestion, as it's
interesting as well.


1 Like

You could absolutely combine the approaches, I would think.

Thank you @kienerj for pointing to this feature.

Another interesting option could be this component by @gab1one that waits for a file to change. To bring this into a 'looping system', some changes might be necessary. In my example I tried to build a small such system with generic KNIME nodes.
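The "wait for a file to change" behaviour can likewise be sketched as a simple polling loop in Python. This is a generic illustration under my own assumptions, not the actual component; the `sleep` and `clock` parameters exist only to make the logic testable.

```python
import os
import time

def wait_for_change(path, poll_seconds=1.0, timeout=None,
                    sleep=time.sleep, clock=time.monotonic):
    """Block until the file at `path` is modified (its mtime changes),
    polling every `poll_seconds`. Returns True on change, or False if
    `timeout` seconds pass without one."""
    start = clock()
    baseline = os.path.getmtime(path)
    while True:
        if os.path.getmtime(path) != baseline:
            return True
        if timeout is not None and clock() - start >= timeout:
            return False
        sleep(poll_seconds)
```

A single call like this corresponds to one WAIT node execution; putting the call inside a loop gives the repeated-trigger behaviour discussed above.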

@kienerj which Loop End node is the right one for these two loops?

You'll have to figure that out yourself. I only really used one of the timed nodes once. My comment basically was: I know these exist; have a look, maybe they help with your problem.

1 Like

Hello again,

I have built a workflow using @mlauber71's example for the WAIT node. I temporarily disconnected the WAIT node which waits for a file change (batch file / Windows Task Scheduler). The new workflow works, which means the DB query node restarts every time. Everything works so far, except for the output of the dashboard which is generated by the component node. I used to save the dashboard graphic manually as an *.html file on a server drive after Execute & Open the view, which the employees can
then access. I didn't find anything in the dashboard configuration to save this automatically.

With the New WAIT-NODE-Workflow

The open Component-Node

The Dashboard configuration in the Component node

Hello @mlauber71 ,

I'll come back to your hint to use the Amazon AWS service:

:thinking: :flushed:

2.5K Euro per hour is too expensive for only one workflow.

@USCHUKN1ME it is more like 2,574 EUR in German notation - more like two and a half euros per hour …

And the actual number very much depends on how powerful the AWS server would be, whether shared resources would be OK for you (not shared data, of course), or whether you would need your own dedicated super-CPU with super-fast SSD …

Ahhh - correct :see_no_evil: :sweat_smile: - sorry for that mistake

1 Like

No problem. The question is always: is your business making as much? And if KNIME significantly helps your business, some euros might be OK …

We are at the beginning.
And the workflow described here is almost the first where automation is needed - without having a client PC (mine) constantly switched on.

1 Like

" Also you can use KNIME server with AWS and just pay what you use :slight_smile: "

Hello again @mlauber71 ,

is there an instruction in the KNIME forum on how to implement a
KNIME workflow on AWS?

And I guess you have to use KNIME Server on AWS too?

Or did you mean this way:

KNIME Executors are available on the AWS Marketplace as bring-your-own-license (BYOL) and pay-as-you-go (PAYG) allowing you to automatically and dynamically start up new Executors.