HttpRetriever limit data

I want to retrieve data from a database through the HttpRetriever node, but I cannot get all the data: somehow KNIME truncates it. In the result column given by the HttpResultDataExtractor node I see Totalresults=153521 entities, yet KNIME retrieves only 2000 of them (the result of a pagination process with a limit of pagesize=2000; without it, KNIME would truncate to 100 entities).
Any idea what could be happening and what I could do to fix this problem?

Could you elaborate on what kind of source/API you're accessing?

The HttpRetriever has an optional "Maximum file size" limit, which cancels a request in case the result exceeds a defined size. This setting is disabled by default, though.

In case the API uses paging and you want more results than are returned with a single request, you will have to access subsequent paginated URLs (usually via an "offset" query parameter). You can automate this process with the "loop" nodes.
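In KNIME you would build that loop with the loop nodes, incrementing the offset and feeding the URL into the HttpRetriever on each iteration. As a language-agnostic sketch of the same offset logic (the `fetch_page` callback and the response shape are assumptions, not a real API):

```python
def fetch_all(fetch_page, page_size=2000):
    """Collect all entities from a paginated API.

    fetch_page(offset, limit) must return the list of entities for one
    page -- in KNIME, this corresponds to one HttpRetriever request with
    the offset substituted into the URL.
    """
    results = []
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        results.extend(page)
        if len(page) < page_size:  # last (possibly partial) page reached
            break
        offset += page_size
    return results

# Demo against a fake in-memory "API" of 5000 entities:
data = list(range(5000))

def fake_fetch(offset, limit):
    return data[offset:offset + limit]

all_items = fetch_all(fake_fetch)  # 3 requests: 2000 + 2000 + 1000
```

The key point is the termination condition: a page shorter than `page_size` signals the last page, so the loop stops without a fixed iteration count.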

So, most likely, it is not KNIME or the HttpRetriever that limits the data, but the (REST) API you're accessing.