Hello and Good Morning,
I am presently undertaking a Proof of Concept workflow which is to be used by our Accounts Payable staff, who receive soft copy vendor invoices as XLS files throughout the month. We have several vendors that send in these soft copy invoice files, and the files need to be manipulated into a format suitable for import into our ERP system.
My PoC KNIME workflow uses some Quickforms to capture input variables (Invoice Number, Invoice Date, Vendor), completes a File Import of the XLS, and passes the file and variables through to an IF Switch where I evaluate whether the file is for Vendor 1 (true) or not (false). If the IF Switch evaluates as true, the file continues along the workflow path to be manipulated per the requirements of Vendor 1's file layout. If the IF Switch evaluates as false, the file and variables pass to another IF Switch where I evaluate whether the file is for Vendor 2 (true) or not (false), and so on. At present I output to a report, but eventually it will be a DB update, once I get past my current problem.
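For illustration only, the waterfall of IF Switches amounts to a chain of vendor checks, each routing the file down one path or passing it on to the next check. A rough sketch in plain Python (not a KNIME node; the function names and the `vendor` variable are hypothetical stand-ins for the Quickform flow variable and the per-vendor manipulation branches):

```python
# Sketch of the waterfall IF-Switch routing described above.
# "vendor" stands in for the Quickform flow variable; the
# transform_* handlers are hypothetical placeholders for the
# per-vendor file-layout manipulation paths.

def transform_vendor1(rows):
    # Placeholder for Vendor 1's file-layout manipulation.
    return [("V1",) + tuple(r) for r in rows]

def transform_vendor2(rows):
    # Placeholder for Vendor 2's file-layout manipulation.
    return [("V2",) + tuple(r) for r in rows]

def route_invoice(vendor, invoice_rows):
    """Dispatch an invoice to the vendor-specific manipulation path."""
    if vendor == "Vendor 1":          # first IF Switch: true branch
        return transform_vendor1(invoice_rows)
    elif vendor == "Vendor 2":        # second IF Switch in the waterfall
        return transform_vendor2(invoice_rows)
    else:                             # no path defined for this vendor
        raise ValueError(f"No workflow path defined for {vendor!r}")
```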
When executing this PoC workflow locally in KNIME it works completely and correctly, as expected. When running the workflow in the web portal it does not. I have established (by using a test breakpoint) that my suite of Quickform nodes (which are wrapped in a single metanode) completes and passes all of its variables and the file out of the wrapped metanode and on to the IF Switch. I am not receiving any 'show workflow message' to help me diagnose the problem further, so I am turning to the KNIME community for help in sourcing a solution.
Could you share more information about your situation, please? In particular:
what is your Analytics Platform version?
what version of KNIME Server do you use?
would it be possible to share a minimal example that reproduces the problem? Of course, it should not contain any private data or authentication information, but some mock-up or public data would be helpful for executing it and debugging the problem.
did you look into the Job View to see what problem occurs? For this you would need the KNIME Remote Workflow Editor extension installed in the local Analytics Platform and KNIME Remote Workflow Editor for Executor in the executor on the server side. For more details, see the sections on “Job Preview” in this document for KNIME Server 4.8.
is there a chance you could share mock-up data (not the real data) that represents the file structure but has dummy contents? I would need it to run the workflow and start debugging; guessing about the table format makes it hard to make the workflow work. Alternatively, could you prepare a minimal example that reproduces the problem? For example, it could use a basic table with two columns:
column1 | column2
a | 1
b | 2
And you would need to simplify the rule engines accordingly.
A side note: when you execute the workflow on the WebPortal, do you explicitly upload a file, or do you run with the default value? In the former case, all good and the following is not relevant. But in the latter case, is the default file accessible from the machine where the Server executor is running? I see that you use an absolute path as the upload-file default value. Two possible issues come to mind:
this partition might not be mounted on the server executor machine
the server executor might use a different operating system. For example, you seem to use the Windows path separator "\", but Linux and macOS use "/" instead, to my knowledge.
Good Day Mischa,
Attached is my ‘invoice’; this sample will pass through either path 1 or path 2 of the workflow, depending on which workflow variable (Quickform) is chosen.
To answer your question: yes, the XLS file is explicitly uploaded. This is intentional, as I do not wish the data operator to be bothered with renaming XLS files. They should be able to import the vendor XLS, select which vendor it is from, add an Invoice Number and Invoice Date, and select the workflow path. In this proof of concept I have been concentrating on using Adams or Chandler as my vendor/workflow.
Thanks for the data example and for explaining the motivation behind your design decisions.
Could you try this modified version of your workflow: Bulk Invoice Load Testing (Modified).knwf (41.2 KB)? It works for me both locally using Analytics Platform 3.7.2 and on a 4.8.2 server. I modified it slightly. You will need to put in a path to an existing data file to make it work (I used a workflow-relative path to a copy of your data inside, but I removed it before export).
The biggest change I made was to use Case Switch Variable (Start) instead of the IF Switches from Vernalis. My server did not have the relevant extension installed, so I couldn’t use the workflow as it was. Could that also be the reason for the problems you have experienced? What kind of error/failing behaviour did you see?
Otherwise, I have removed several flow variable connections. Flow variables propagate along data links, so you do not need to set a flow variable connection explicitly if there is a data connection between the nodes.
Thank you Mischa,
I have saved this locally and taken a look. On immediate inspection it is working for me, so I will continue to review it and load it to our server for testing as well. I shall report back later during the morning on the progress.
On a related note: whilst my proof of concept only incorporated 2 vendors, and therefore 2 paths for manipulating the vendor file, I had intended to make this entire workflow suitable for about 10 vendors, in which case I will require 10 paths for file manipulation. My original plan was to nest (or waterfall) the IF nodes so they evaluated which path to eventually use; it is not completely clear to me whether the CASE switch you have used will allow nesting. I shall experiment with the CASE switch, but I thought I would ask anyway.
Very much appreciated,
Brett
You can cascade the case switches, for example:
Then you would need not a single flow variable but a set of them to drive through a particular path in the case switches (for example, var_switch1, var_switch2, …). Alternatively, you can connect 3 switches to the first one in the chain; this way you need only 2 variables to route through them, and that gives you 3*3 = 9 paths already. Add one more layer of switches and you get 3*3*3 = 27. I guess you see the pattern already.
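To make the path arithmetic concrete, here is a small sketch (plain Python, purely illustrative) of how a set of switch variables such as var_switch1, var_switch2, … selects one path out of 3^n when you cascade n layers of 3-way case switches. The function name and argument shape are hypothetical, not part of any KNIME API:

```python
# Illustrative sketch: routing through cascaded 3-way case switches.
# Each flow variable picks one of 3 output ports, so n layers of
# switches give 3**n distinct end-to-end paths.

def select_path(switch_values, ports_per_switch=3):
    """Map a tuple of switch settings (0-based) to a single path index."""
    path = 0
    for v in switch_values:
        if not 0 <= v < ports_per_switch:
            raise ValueError("switch value out of range")
        # Descend one layer: multiply out the ports seen so far,
        # then pick the port chosen at this layer.
        path = path * ports_per_switch + v
    return path

# Two layers of 3-way switches: 3*3 = 9 distinct paths (indices 0..8).
print(select_path((0, 0)))     # first path
print(select_path((2, 2)))     # last of the 9 paths
# Three layers: 3*3*3 = 27 paths (indices 0..26).
print(select_path((2, 2, 2)))  # last of the 27 paths
```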
Thank you Mischa,
I shall work with your solution through to completion; it certainly seems like it will do the job more effectively than the original IF Switch that I had intended to use.