I have a folder that receives a CSV file every day with sales output; the files are dropped into the folder automatically.
I combine the files together into one file containing the total sales using the Concatenate node, very simple.
What I need is to automate KNIME to pull the CSV file from that specific folder as soon as it is dropped, add it to the combined file, and feed the result automatically into a PBI report.
I don't know of any way for the addition of a new file to a folder to trigger a workflow run. Are we talking about the desktop version or licensed KNIME Business Hub? Desktop workflows either need to be run manually or scheduled via Windows in batch mode. Business Hub workflows can be triggered via API or scheduled.
Is this folder in a fileshare service? If so, perhaps a second native automation tool or service could monitor folder activity and send an API call to Business Hub?
So you had the desktop workflow running non-stop in a loop to check for updated files in a folder or database? Was it running in the background in batch mode? Could you otherwise use KNIME normally without trouble or too big of a resource hit?
From my perspective, it would have to be a pretty frequent, crucial and time sensitive updating issue for me to explore an option that nuclear. Seems like it would run up a server bill, or weigh down a pc. Although, I guess if it solved a serious issue you could just buy another pc and have it run tasks like this in batch mode…
@iCFO , I also had/have a workflow that runs continuously in a loop. At the start of the loop it contains a Wait… node. This node waits for the creation of a specific file in a folder. This is my "signal.txt" file.
A remote process would execute elsewhere (actually a workflow on my desktop pc) and it would drop files into a network share folder on the VM. When it was done transferring, it would create the "signal.txt" file. This "wakes up" the timer in the loop on the VM and off it would go. The first job was to delete the "signal.txt" file, then do a List Files/Folders and carry out the required processing on them. At the other end of the processing, just before the loop end, I put in a small delay of about 2 minutes, but only to reduce load on an API it was using if there were continuous changes going on. When it got back to the start of the loop, if the "signal.txt" file had reappeared in the meantime, off it would go again.
This was all in foreground and I was perfectly able to use KNIME on the vm to edit/run other workflows while this was going on, although I only had a couple of flows on that VM. Not a hugely powerful VM either with just 16GB memory. It was demands on memory that caused general sluggishness on the vm, rather than the loop as such.
Interesting… Would this require a VM if it was something that was run only on certain days (or for a certain part of a day), or do you think it could just run on a second version of KNIME installed on the same pc? Never tried to run 2 KNIME installs at once. I may just test it out while working today.
A second version of KNIME on the same pc (obviously with its own workspace) should be fine. That will probably have greater demands than the one copy sitting on the VM, so it depends on the pc, and memory I guess, but in principle I don't see why not. Just make sure they don't both try to grab the 10GB of memory they've both been told they can have on a 16GB machine, or one of them might go "pop".
edit: In terms of side-by-side running, on my non-work PC I often now have KNIME 4.7.x and 5.1 running side by side as I've been trying to get to grips with the new one. For work I'm still on 4.7 and might stay that way for a little while yet.
I run a dual pc setup (both with dual KNIME installs) with plenty of resources on each, so I will definitely give this a shot.
On a side note: I am running 5.1 for work and Nightly on my side zip install. There are a few daily quirks in 5.1 (like empty config windows and hub sign-in failure) that didn't arise until after my testing and transitioning workflows… I would definitely hold at 4.7.x a little while longer while they dial it in. Nothing is major enough for me to hassle with downgrading on my end.