I know I asked this before but we did not get anywhere. I am trying to scrape a subscription site. The site populates its tables through multiple XHR POST requests, which I can see in the browser's inspector. Is there a way to extract the JSON responses from these requests? Right now I am considering running Fiddler or a similar proxy to save them all while Selenium is browsing, which seems insane.
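In case it is useful while waiting, here is a minimal sketch of that proxy-style workaround outside the Selenium Nodes plugin, using the third-party selenium-wire package for Python. The target URL, the interaction steps, and the `captured.json` output file are placeholders for illustration, not part of the original question:

```python
# Sketch of a proxy-style capture using the third-party selenium-wire
# package (pip install selenium-wire), which wraps Selenium and records
# all browser traffic. URL and output file below are placeholders.
import json

from seleniumwire import webdriver
from seleniumwire.utils import decode

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/data-page")  # placeholder URL

    # ... interact with the page here (log in, click, paginate) so the
    # XHR POST requests fire, e.g. via driver.find_element(...).click()

    captured = []
    for request in driver.requests:
        # Keep only completed POST requests that returned JSON
        if request.method != "POST" or request.response is None:
            continue
        if "application/json" not in request.response.headers.get("Content-Type", ""):
            continue
        # Response bodies may be gzip/br encoded; decode() handles that
        body = decode(
            request.response.body,
            request.response.headers.get("Content-Encoding", "identity"),
        )
        captured.append({"url": request.url, "json": json.loads(body)})

    with open("captured.json", "w") as fh:  # placeholder output file
        json.dump(captured, fh, indent=2)
finally:
    driver.quit()
```

Fiddler or mitmproxy would serve the same purpose; the point is simply that the JSON replies can be read back from the recorded traffic instead of being scraped from the rendered page.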
Do I understand correctly that you'd like to just "record" all the request content while interacting with a page, in order to parse it separately later?
I actually had a roughly similar idea once, but never pursued implementing it. Let me give it another thought.
In the meantime, any input on how that feature should look from your perspective (i.e. from a workflow perspective) would be appreciated.
Understood, thank you. We'll evaluate whether this is something that can be integrated into the nodes. I'll keep this post updated if there's any news!
I've built an internal prototype which is promising, and I agree that this idea makes sense. However, it will require some fundamental changes to the plugin's internals, so we will schedule it for the next major 5.x release.
At the moment my main focus is on some other pressing projects, so this will take some time. Feel free to drop me a reminder email at mail@seleniumnodes.com and we might be able to arrange some pre-release access once the upcoming Selenium Nodes v5 has matured a bit more.