Execute a workflow from a third-party application: "all the ways" and "how"

Tags: commandline, batch
#1

Hi There,

I work for a company that uses KNIME Server, where workflows need to be scheduled every minute in order to compute values in near real time.

I don't have that scheduling need; I'm searching for "something" that could execute workflows without opening KNIME Analytics Platform.
In other words, how can I execute a workflow from a simple third-party application (a "for Dummies" version)?

My intent is to build a simple application (a white screen with one push button that the end user has to press) that can run a workflow and finally show the workflow's results.

What are all the ways I could follow, using ONLY the free Analytics Platform?

  • command-line workflow execution (batch mode, right? any information?)
  • external execution (via REST, right? any information?)
  • ??

Thanks in advance,
Andrea


#2

As you say, one option is to run in batch mode. I suspect you'd need to get the workflow to write a file that your application then reads once the batch process has finished. The KNIME FAQ has some instructions for running in batch mode.
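As a minimal sketch of that idea, here is roughly what a small launcher could do to invoke KNIME in batch mode (the executable path and workflow directory are placeholders; the `-application org.knime.product.KNIME_BATCH_APPLICATION` entry point and the flags shown follow the KNIME FAQ, but check them against your version):

```python
import subprocess

def build_batch_command(knime_exe, workflow_dir):
    """Assemble the KNIME batch-mode command line.

    -nosplash suppresses the splash screen; -reset resets all nodes
    before execution so the workflow runs from a clean state.
    """
    return [
        knime_exe,
        "-nosplash",
        "-application", "org.knime.product.KNIME_BATCH_APPLICATION",
        "-reset",
        f"-workflowDir={workflow_dir}",
    ]

def run_workflow(knime_exe, workflow_dir):
    """Run the workflow and report whether KNIME exited cleanly."""
    result = subprocess.run(build_batch_command(knime_exe, workflow_dir))
    return result.returncode == 0
```

A one-button GUI could simply call `run_workflow(...)` when the button is pressed and then display whatever file the workflow wrote.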

You could also use the Python integration (which, I believe, runs KNIME in batch mode behind the scenes). Details here: https://www.knime.com/blog/knime-and-jupyter

You won’t be able to run workflows as REST services without a KNIME server.

Cheers

Sam


#3

Hi swebb, thanks a lot for the answer!

So, as I understand it, the ONLY (free) way is batch mode (either "directly" or via the Python integration), right?

Are there any other third-party applications (without creating one from scratch) that can trigger the execution of a workflow (using only KNIME Analytics Platform)?
E.g.

I build a workflow that reads data from table1 in db1, computes something, and writes the results to table2 in db2.
From a third-party application (for example Power BI) I tell KNIME to execute the workflow, and then I read the data from table2 in db2 in order to visualize the results.

Is it possible?

Sorry for all these questions, but I have to report the answer to my boss! :smiley:

Thanks,
Andrea


#4

As far as I’m aware those are your two options.

I’m not aware of any third party applications to manage batch processing.

build a workflow that reads data from table1 in db1, computes something, and writes the results to table2 in db2.
From a third-party application (for example Power BI) I tell KNIME to execute the workflow, and then I read the data from table2 in db2 in order to visualize the results.

Yes, this is the kind of thing I was implying with my suggestion of having KNIME write a file your application then reads, just using a database instead.

Really, this is the kind of activity you want to be using KNIME Server for.

Cheers

Sam


#5

There are many software packages that manage the execution of jobs on one or more machines (e.g. PBS, SLURM, …).

That being said, aside from administering the controller and runner machines, you'd still need to write the wrapper script that invokes knime -nosplash -application ... (probably followed by some kind of script that processes the output logs to check for errors, and perhaps emails certain recipients, if it's a long-running job).
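Such a wrapper might look roughly like this (the log path and the `ERROR`-line convention are assumptions; KNIME writes a knime.log, but where it lands and its exact format depend on your workspace settings):

```python
import subprocess

def log_has_errors(log_text):
    """Return True if any line in the log looks like an ERROR entry."""
    return any(
        line.startswith("ERROR") or " ERROR " in line
        for line in log_text.splitlines()
    )

def run_and_check(knime_cmd, log_path):
    """Invoke KNIME in batch mode, then scan its log for errors.

    Returns True only if KNIME exited cleanly AND the log is error-free;
    a notification step (e.g. email) could hang off the False branch.
    """
    result = subprocess.run(knime_cmd)
    with open(log_path) as f:
        failed = log_has_errors(f.read())
    return result.returncode == 0 and not failed
```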

All of which is KNIME Server, by any other name.
