Feed collection / accumulation

Hi,

I am wondering if there is a way to collect RSS feeds the way Thunderbird does, where old feed items are not deleted when new ones are fetched. The RSS Feed Reader node replaces the current items with the new ones, and I would like to keep the old ones for the future.

Best,
Mateen

Hey @mateenraj,

you could use the Table Writer node to write the current feeds to a file. On the next run of the RSS Feed Reader node, use a Table Reader to load the old feed data, concatenate it with the new feed data, and overwrite the old file using the Table Writer.

Cheers Julian


Hey @julian.bunzel,

Thanks for your reply. I tried it with the attached workflow, but when I refresh the Feed Reader node, the table is recreated, and so is the rest of the workflow. I am not sure whether I need separate Feed Reader nodes for the old and new data. Please find my workflow attached and let me know what I am missing.

Best,
Mateen

Old&NewFeeds.knwf (41.6 KB)

Hey @mateenraj,

just use one Table Writer to save the latest output of the RSS Feed Reader to a file, so that you have a file to start with; this Table Writer can then be deleted. Afterwards:

1. Add a Table Reader and a Concatenate node.
2. Connect the Table Reader and the RSS Feed Reader to the Concatenate node.
3. Add a new Table Writer after the Concatenate node that updates the previously created file.

You can also connect the RSS Feed Reader and the Table Reader via flow variable to ensure that the Table Reader is re-executed on every run, so that it always reads the updated file.
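For anyone who wants to see the logic of this read–concatenate–overwrite cycle outside KNIME, here is a minimal Python sketch of the same pattern. The file name `feeds.csv` and the `fetch_new_feeds` stub are placeholders for the Table Reader/Writer file and the RSS Feed Reader node, not anything from the workflow itself:

```python
import csv
import os

FEED_FILE = "feeds.csv"  # hypothetical accumulation file (the Table Writer target)

def fetch_new_feeds():
    # Stand-in for the RSS Feed Reader node: returns freshly fetched items.
    return [{"title": "Item A", "link": "http://example.com/a"}]

def accumulate(new_rows, path=FEED_FILE):
    # Step 1: read previously stored feeds, if the file exists (Table Reader).
    old_rows = []
    if os.path.exists(path):
        with open(path, newline="", encoding="utf-8") as f:
            old_rows = list(csv.DictReader(f))
    # Step 2: append the new rows below the old ones (Concatenate node).
    all_rows = old_rows + new_rows
    # Step 3: overwrite the file with the combined table (Table Writer).
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["title", "link"])
        writer.writeheader()
        writer.writerows(all_rows)
    return all_rows

accumulate(fetch_new_feeds())  # run once per refresh; the file grows each time
```

Each run reads whatever has accumulated so far, appends the latest fetch, and writes everything back, which is exactly why the same file must be both read and written.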

I changed your workflow accordingly and added some annotations. I hope it helps.

Cheers,
Julian
Old&NewFeeds.knwf (19.2 KB)


Hey @julian.bunzel,

Thanks, that solves the problem partially, but with this workflow one can't aggregate feeds over weeks or months. It concatenates the feeds of two runs; what if I want to collect feeds across many runs? Is there an appending function or node? In theory one could keep writing files, but that is not an efficient solution.

Best,
Mateen

Hey @mateenraj,

basically it appends the new feeds to the bottom of the table on every run and then stores the result in a file, so the file will grow over time. It should also work for aggregating feeds over several weeks or months.
What exactly do you mean by many instances? Maybe I misunderstood the question.

Cheers,

Julian

Hey @julian.bunzel,

By instance, I meant refreshing the Feed Parser. Please find attached a workflow in which I tried to pin down the problem: the Concatenate node flushes out the old data once the Feed Parser is refreshed, and the same data is being fed from the Table Reader as well as from the Feed Parser.

Best,
Mateen

Stocks! 3.knwf (33.4 KB)

Hey @mateenraj,

you can use a Table Writer node after the Concatenate node to overwrite the previous table file. That way, after each run of the workflow, the new data from the Feed Reader is concatenated with the old data from the Table Reader and written back to the table file. That's how it works for me.

Best,
Julian

Hey @mateenraj,

you don’t need the last Table Reader node. Make sure that the Table Writer after the Concatenate node writes to the exact table file that your Table Reader (the one connected to the Concatenate node) reads. I can see that the Table Writer writes to LatestData while the Reader reads OldData; these two have to be the same. Then, when you re-run the workflow, the Table Reader reads the concatenated data from the previous runs.

Best,
Julian

hey @julian.bunzel

Thanks, that worked. I had only used the last Table Reader to check the content of the file, to see whether it was working or not.

Best,
Mateen

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.