Hello! I am using the Kafka Consumer node to get data from my local Kafka server.
The problem is that it terminates before reaching the ‘stop criterion’, which is set to ‘2021-12-31’, as soon as it stops receiving new messages (please see picture). However, I want the node to run continuously in order to perform real-time event processing.
So, how can I make the node consume data until ‘2021-12-31’? In other words, how can I make the Kafka Consumer node run continuously?
Hi,
Does it stop with an error, or does it just return a couple of messages? Is it always the same number of messages that are returned, or does it vary?
Kind regards,
Alexander
I have created a test topic with only 8 messages. The consumer stops once it has consumed all of them, without showing any error.
What I want is for the KNIME connection to Kafka to stay open at all times, so I can perform real-time stream processing. Is that possible?
From what I’ve seen so far, the Kafka Consumer terminates once it has consumed all available messages, even though I have configured the ‘stop criterion’ to be ‘2021-12-31’; it stops earlier than that.
Hi,
It’s not about the server itself, but the stream. How are you writing the 8 items into it? Maybe the producer closes the stream after writing the items and, by doing that, tells KNIME that there will never be another item.
Kind regards,
Alexander
I am writing the messages via the Kafka producer CLI. The stream closes only on Ctrl+C, which I never press, so the producer is always open.
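For reference, this is the usual way to feed messages from the command line; the topic name and broker address here are placeholders, not taken from the thread:

```shell
# Start the Kafka console producer against a local broker.
# Each line typed on stdin becomes one message; Ctrl+C ends the session.
kafka-console-producer.sh --bootstrap-server localhost:9092 --topic test-topic
```

The producer session staying open does not by itself signal anything to consumers; each message is appended to the topic independently.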
Hi,
I had a look into the node description and it says:
“Stop when message timestamp exceeds: Consumes all messages created up until the selected date & time. If the selected date & time is in the future, the execution stops once a poll request returns no messages.”
So it seems that, regardless of the setting, the node terminates as soon as a poll returns no messages within the poll timeout. Maybe you can wrap the node in a loop to make it work the way you want?
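The stop logic described above can be sketched in plain Python. This is a simulation of the documented behavior, not KNIME's actual implementation; `poll` here is a stand-in for the node's Kafka poll calls:

```python
from datetime import datetime

def consume_until(poll, stop_ts):
    """Consume batches until a message timestamp exceeds stop_ts, or until
    a poll returns no messages while stop_ts still lies in the future."""
    consumed = []
    while True:
        batch = poll()  # returns a list of (timestamp, payload) tuples
        if not batch:
            # Topic caught up while the stop time is in the future: terminate.
            return consumed
        for ts, payload in batch:
            if ts > stop_ts:
                return consumed
            consumed.append(payload)

# Simulated polls: two batches arrive, then the topic is exhausted.
batches = iter([
    [(datetime(2021, 1, 1), "a"), (datetime(2021, 1, 2), "b")],
    [(datetime(2021, 1, 3), "c")],
])
result = consume_until(lambda: next(batches, []), datetime(2021, 12, 31))
print(result)  # ['a', 'b', 'c'] — stops at the empty poll, not at 2021-12-31
```

Wrapping the consumer in a loop would effectively restart this polling each time the empty-poll condition ends a run, approximating continuous consumption.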
Kind regards,
Alexander