Hi everyone,
Hope your weekend is going well. I am setting up GET Requests with Parallel Chunk Start and End, and my test run last night errored out with a timeout after 4 hours. I am now trying to set up Try (Data Ports) and Catch Errors (Data Ports). I searched around and found these resources.
What I am struggling with is how to use Catch Errors (Data Ports) to retry the GET Request for the errored row several times, then log the error and move on to the next one. Ideally I also want to capture that error table and email it to myself, so I can inspect the particular GET Requests that errored out. Maybe the resources I'm enquiring about no longer exist, in which case I can remove them from the source list. The Google API I'm calling showed as much as a 10% error rate, wow.
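Outside of KNIME, the retry-then-log-and-move-on behavior I'm after would look roughly like this. This is just a minimal Python sketch of the idea; `fetch_with_retry`, `collect`, and the retry counts are hypothetical stand-ins for the GET Request node and the Catch Errors logic, not anything KNIME actually runs:

```python
import time

def fetch_with_retry(fetch, url, max_retries=3, delay=1.0):
    """Try fetch(url) up to max_retries times; return (result, error)."""
    last_error = None
    for _ in range(max_retries):
        try:
            return fetch(url), None
        except Exception as exc:  # in practice: timeouts / HTTP errors
            last_error = exc
            time.sleep(delay)  # brief pause before retrying
    return None, last_error  # give up: caller logs it and moves on

def collect(fetch, urls, max_retries=3, delay=1.0):
    """Fetch each URL; successes go to results, exhausted retries to failed."""
    results, failed = [], []
    for url in urls:
        result, error = fetch_with_retry(fetch, url, max_retries, delay)
        if error is None:
            results.append((url, result))
        else:
            failed.append((url, str(error)))  # the "error table" to email later
    return results, failed
```

The key point is that a failed row ends up in `failed` instead of aborting the whole run, which is what I want instead of the 4-hour timeout crash.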
Mr. @armingrudd had this, and it looks close but slightly different, because I need to move on to the next one.
EDIT: OK, can I use Variable to Table Column on the error output port to capture errors in a table? Then do I put those rows back into the loop??
Thank you very much and hope your weekend is going well.
Hi @armingrudd,
You are already doing what I was trying to do in your GET Request Plus, no? If I wrap your GET Request Plus in my workflow, that would be running parallel chunks inside parallel chunks? I should just use yours and set the concurrency (which accomplishes what I was trying to do below). Am I understanding this correctly? If so, building the workflow was a great learning experience, but I should get better at searching for existing solutions, like your GET Request Plus.
@armingrudd, thank you. This makes a lot of sense.
I'm working to incorporate your GET Request Plus now. When it runs, I cannot seem to stop it… Cancel, reset, F9: it has to run through all the requests, or I have to close the workflow file. Am I doing something incorrectly? I'm connecting GET Request Plus directly to the node that creates the REST GET URLs. I don't need a loop or Try/Catch, correct? I'm not able to cancel the GET Request node inside the component either.
Disconnect as in deleting the line that connects to the previous node? It lets me select the line, but hitting the Delete key doesn't do anything.
I'm testing with just 20 URLs, and each call takes about 20 seconds to return JSON. GET Request Plus keeps running now. Normally 20 requests go quickly.
EDIT: It did finish and output the data correctly. …testing some more… THANK YOU!!
Maybe I'm misunderstanding the Concurrency setting. Does Concurrency of 1 mean it runs just one instance of the node (workflow component), and 2 mean two instances of the node (or the workflow inside it) simultaneously? What I was trying to achieve was to run multiple GET Requests at once to speed up data collection. I tried to research the "Concurrency" setting but couldn't really find much. Inside the component node I saw a Recursive Loop but no Parallel Chunk Loop, which I thought would be needed to run multiple instances simultaneously. I'm running a test with 200 URLs now, and changing Concurrency doesn't seem to speed up the overall collection. I'm going to increase Concurrency to maybe 5 and see. I'll report back. Thank you, Sir @armingrudd.
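For anyone else confused by the same thing, here is how I now picture a concurrency setting in plain Python. This is only an illustrative sketch of the general idea (N requests in flight at once), not how the component is implemented internally:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_all(fetch, urls, concurrency=5):
    """Run fetch(url) for each URL, with up to `concurrency` in flight at once.

    concurrency=1 behaves like a plain sequential loop; higher values let
    slow responses overlap, which is where the speed-up comes from.
    """
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(fetch, urls))  # results stay in input order
```

Under this picture, raising Concurrency only helps when requests are slow enough to overlap, which would explain why I didn't see a difference at low values.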
EDIT: Parallel Chunk did this, so maybe put your GET Request Plus inside Parallel Chunks to handle retries and failed-request output? Will test that.
Now, I'm trying to use the Missed output port to write the missed rows to Excel. I added some incorrectly formatted URLs so that the API will either fail or return an error, and while I can see the failed rows in the Missed output ports inside the Parallel Chunk window, I'm seeing this error from the Excel Writer. I assume it's because I am not able to execute the Create Date&Time Range node. My intention was for each instance of the parallel GET Request Plus to write out its Missed rows, if any. How could I best write the Missed row table to Excel? Thank you so much for your GET Request Plus node and your help.
Hi @armingrudd,
I used the Empty Table Switch and I'm saving Missed rows to Excel now. Now I'm seeing some cases where the node collects ? (null?) values, making those rows come through the regular Output port rather than Missed. Should I add a Rule Engine right after the Output port and send those rows back into the GET Request Plus input?
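The post-filter I have in mind, sketched in Python so the intent is clear (the row dicts and the `body` column name are hypothetical, not the component's real column names): rows whose response value is missing get routed back into the retry input, and everything else passes through as normal output.

```python
def split_missing(rows, column="body"):
    """Split rows into (ok, retry) based on whether `column` has a value.

    A missing key or an explicit None both count as "missed", mirroring
    the ? cells showing up in the regular Output port.
    """
    ok, retry = [], []
    for row in rows:
        if row.get(column) is None:
            retry.append(row)  # send back into the GET request input
        else:
            ok.append(row)
    return ok, retry
```

This is essentially what I'd hope a Rule Engine plus Row Splitter arrangement after the Output port would do.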
Thank you so much.
I am feeding 7 records that will fail into this Parallel Chunk with GET Request Plus, and I am trying to write the Missed rows to Excel or a table. I have 6 custom chunks in the settings. This setup only captures the Missed output for the last row. Am I not getting the other 6 Missed rows because those are processed in the parallel branches? Maybe slow, but making progress!! Thank you!!
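Conceptually, the behavior I expect is that Missed rows get gathered from every chunk, not just the last one. A rough Python sketch of that expectation (chunk sizes, names, and the threading are all hypothetical illustration, not the Parallel Chunk nodes' actual mechanics):

```python
from concurrent.futures import ThreadPoolExecutor

def chunk(items, n_chunks):
    """Split items into at most n_chunks roughly equal slices."""
    size = max(1, -(-len(items) // n_chunks))  # ceiling division
    return [items[i:i + size] for i in range(0, len(items), size)]

def process_chunk(fetch, urls):
    """One parallel branch: fetch its URLs, keeping its own missed list."""
    ok, missed = [], []
    for url in urls:
        try:
            ok.append((url, fetch(url)))
        except Exception as exc:
            missed.append((url, str(exc)))
    return ok, missed

def run_parallel(fetch, urls, n_chunks=6):
    """Process chunks concurrently, then merge ok and missed from ALL chunks."""
    with ThreadPoolExecutor(max_workers=n_chunks) as pool:
        parts = list(pool.map(lambda c: process_chunk(fetch, c),
                              chunk(urls, n_chunks)))
    all_ok, all_missed = [], []
    for ok, missed in parts:
        all_ok.extend(ok)
        all_missed.extend(missed)  # every chunk's Missed rows, not just the last
    return all_ok, all_missed
```

If only the last branch's Missed table survives, that would match what I'm seeing: the per-chunk tables are being overwritten instead of concatenated at the loop end.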