Hi @miguel_silva3, thanks for clarifying the example, and I breathed a sigh of relief at your response.
I don’t think it’s a trivial problem, in the sense that I can’t immediately see a way to achieve it using only no-code nodes. A recursive loop might work, but that could hurt performance across a large number of rows.
I believe I have a solution, though, involving a small piece of code, which I’ve bundled into a component for ease of use.
If you’d like to give it a try, it will hopefully work for your use case:
You configure it by specifying the grouping columns and the maximum number of rows per chunk. It then generates a “chunked grouping identifier” which you can use as the group identifier in a Group Loop…
So the workflow takes this form:
Feel free to open the component to see how it works. Most of the nodes there are to make it work dynamically (i.e. to allow you to specify the grouping columns etc.). The heart of it (the piece which actually generates the grouping identifier) is a Java Snippet.
A brief summary of what the component does: it first groups the rows to get a row count for each grouping (e.g. ID+Record). Then (and this is the Java Snippet) it works through the groupings in order, keeping a running total of their row counts. If the running total at any point exceeds the maximum rows per chunk, it adds 1 to the grouped-chunk-id and resets the running total to the current grouping’s row count, so those rows count towards the new chunk.
In this way, each grouping is assigned a “chunk identifier”, which can then be used in a Group Loop.
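If it helps to see the idea outside of KNIME, here is a minimal standalone Java sketch of that logic. The group names and row counts are made up purely for illustration; this is not the actual snippet code from the component:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ChunkIdSketch {
    public static void main(String[] args) {
        // Row counts per grouping (e.g. ID+Record), in table order.
        // These values are invented purely for this demo.
        Map<String, Integer> groupCounts = new LinkedHashMap<>();
        groupCounts.put("1-A", 2);
        groupCounts.put("1-B", 3);
        groupCounts.put("2-A", 1);

        int maxRowsPerChunk = 4;
        int chunkId = 0;
        int runningTotal = 0;

        for (Map.Entry<String, Integer> group : groupCounts.entrySet()) {
            runningTotal += group.getValue();
            if (runningTotal > maxRowsPerChunk) {
                // The current grouping doesn't fit in this chunk:
                // start a new chunk, and this grouping's row count
                // becomes the new running total.
                chunkId++;
                runningTotal = group.getValue();
            }
            System.out.println(group.getKey() + " -> chunk " + chunkId);
        }
    }
}
```

With a maximum of 4 rows per chunk: 1-A (2 rows) fits in chunk 0; adding 1-B (3 rows) would take the total to 5, so it starts chunk 1; and 2-A (1 row) brings chunk 1 to exactly 4 rows, so it stays there.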
Obviously this needs testing, so please give it a go, and if you find any problems with it let me know.
Btw, for the additional sample data that I asked about, it returns this:
which I believe matches your expectation of the final row being in its own group.
If you set the chunk size too low, the component will fail with a message like the one below. To demonstrate, I set the chunk size to 1; here it is saying that the group 1-A has 2 rows but the maximum chunk size is 1, which of course means it cannot continue:
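For anyone curious, the check behind that failure is simple: a grouping whose own row count already exceeds the maximum can never fit in any chunk, so the only safe option is to fail. Here is an illustrative sketch (again, the names are mine, not the component’s actual code):

```java
public class ChunkSizeGuard {
    // A grouping with more rows than the maximum chunk size can never
    // fit into any chunk, so fail fast with a descriptive message.
    static void checkGroupFits(String groupKey, int groupRows, int maxRowsPerChunk) {
        if (groupRows > maxRowsPerChunk) {
            throw new IllegalStateException("Group " + groupKey + " has "
                    + groupRows + " rows, but the maximum chunk size is "
                    + maxRowsPerChunk);
        }
    }

    public static void main(String[] args) {
        checkGroupFits("1-A", 2, 1); // throws, mirroring the error above
    }
}
```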