Hi,
I have a table with roughly 2,800 URLs referring to notices on the TED portal (Place of performance (Map) - TED Tenders Electronic Daily).
I want to save the page source of each URL. However, after about 500 requests I receive an error saying that too many requests were sent and that I can try again in 24 hours. Below is a minimal sketch of what I am currently doing.
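For context, this is roughly my current loop (a minimal sketch, assuming the Python requests library; the example URL, file names, and the 429 status check are placeholders/assumptions, not my exact code):

```python
import time
import requests

# Placeholder list: in reality these ~2,800 notice URLs come from my table.
urls = [
    "https://ted.europa.eu/...",  # placeholder, not a real notice URL
    # ... roughly 2,800 more ...
]

for i, url in enumerate(urls):
    # Fetch the page and save its HTML source to a numbered file.
    response = requests.get(url, timeout=30)
    if response.status_code == 429:
        # This is where the "too many requests" block kicks in after ~500 URLs.
        print(f"Rate limited at URL #{i}: {url}")
        break
    with open(f"notice_{i}.html", "w", encoding="utf-8") as f:
        f.write(response.text)
    # Short pause between requests; apparently not enough to avoid the block.
    time.sleep(1)
```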
I noticed that using a different IP address for the next 500 URLs bypasses this limit, but I am not sure how (or whether) this can be implemented, and whether there are other ways to avoid this error. What I imagine is something like the sketch below, but I don't know if that is a sensible approach.
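This is only a guess at how routing requests through a different IP might look (again assuming the requests library; the proxy address and URL are placeholders, and I don't actually have a proxy set up):

```python
import requests

# Placeholder proxy address; I would need access to a real proxy or VPN endpoint.
proxies = {
    "http": "http://203.0.113.10:8080",
    "https": "http://203.0.113.10:8080",
}

# Route the request through the proxy so the server sees a different IP address.
response = requests.get(
    "https://ted.europa.eu/",  # placeholder URL, not an actual notice
    proxies=proxies,
    timeout=30,
)
print(response.status_code)
```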
Thanks and kind regards