How to merge "Chat Model Prompter" output with initial data

Hello Everyone,

I am trying to use the LLM for a literature review.

In the screening workflow I am looping through various abstract texts and asking the LLM some questions.

The output of the Chat Model Prompter is like this:

I need to merge the abstract and the LLM's reply with the initial dataset I provided to the LLM.

How can I do this?

I managed to do it, but I guess there has to be a better way:

  1. Output of the Chat Model Prompter

  2. Filtering only the original text passed to the prompt with a Row Filter, keeping rows where Role equals "human" or "ai"

  3. Lag Column to get the LLM answer and the original data in the same row

  4. Row Filter to remove duplicates and obtain what I needed, that is, the LLM's answer next to the text that was passed to the prompt
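For reference, the four steps above can be sketched in pandas. The column names (Iteration, Role, Message) and the message order within each iteration are assumptions based on a typical Chat Model Prompter output table, not taken from the actual workflow:

```python
import pandas as pd

# Hypothetical chat output: one row per message, with a Role column
chat = pd.DataFrame({
    "Iteration": [0, 0, 0, 1, 1, 1],
    "Role": ["system", "human", "ai", "system", "human", "ai"],
    "Message": ["You are a reviewer.", "Abstract A ...", "Answer A",
                "You are a reviewer.", "Abstract B ...", "Answer B"],
})

# Steps 1-2: Row Filter keeping only the human prompt and the AI reply
filtered = chat[chat["Role"].isin(["human", "ai"])].reset_index(drop=True)

# Step 3: Lag Column -- shift Message down one row so the human prompt
# ends up on the same row as the ai reply that follows it
filtered["Message(-1)"] = filtered["Message"].shift(1)

# Step 4: Row Filter keeping only the ai rows, which now carry
# both the prompt and the answer
result = filtered.loc[filtered["Role"] == "ai",
                      ["Iteration", "Message(-1)", "Message"]]
result.columns = ["Iteration", "human", "ai"]
print(result)
```

The Iteration column can then be used as a join key to attach the original dataset.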

Please if there is a better way, let me know


I think the Pivot node may help here:
Group column = Iteration
Pivot column = Role
In the manual aggregation, add the column Message with aggregation method First.

I think this works best if you remove the system messages using a Row Filter, as you have done. You should then have one row per iteration, with a human and an ai column containing the text.
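The Pivot node configuration above corresponds to the following pandas sketch (again assuming hypothetical Iteration/Role/Message columns in the chat output):

```python
import pandas as pd

# Hypothetical chat output, one row per message
chat = pd.DataFrame({
    "Iteration": [0, 0, 0, 1, 1, 1],
    "Role": ["system", "human", "ai", "system", "human", "ai"],
    "Message": ["You are a reviewer.", "Abstract A ...", "Answer A",
                "You are a reviewer.", "Abstract B ...", "Answer B"],
})

# Row Filter: drop the system messages first
no_system = chat[chat["Role"] != "system"]

# Pivot: Group column = Iteration, Pivot column = Role,
# manual aggregation = First on the Message column
pivoted = no_system.pivot_table(index="Iteration", columns="Role",
                                values="Message",
                                aggfunc="first").reset_index()
print(pivoted)
```

This yields one row per iteration with a human and an ai column, ready to be joined back to the original dataset on Iteration.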

